Tianxiang Gao

Assistant Professor @ DePaul


Jarvis College of CDM, Room 712

243 South Wabash Avenue

Chicago, IL 60604

Welcome! I’m Tianxiang (天翔), but feel free to call me Adam—I’m happy with either! Since July 2024, I’ve been an Assistant Professor in the School of Computing at DePaul University.

🎓Before joining DePaul, I earned my Ph.D. in Computer Science with a co-major in Applied Mathematics from Iowa State University, where I was co-advised by Dr. Hongyang Gao, Dr. Hailiang Liu, and Dr. Jia (Kevin) Liu. My academic journey began with a bachelor’s degree in Mechanical Engineering from Yantai University.

Research

💡My research focuses on deep learning theory, generative AI, and graph representation learning. I enjoy exploring fundamental problems that arise from real-world applications and distilling the resulting insights into guidelines that advance practice.

🌟Immediate Openings!

  • I am recruiting Ph.D. students starting in Fall 2025 and Winter 2026.
  • I am looking for multiple Research Assistants (RAs) for Spring 2025 and Summer 2025 to work on projects in deep learning theory and graph representation learning.

If you are interested, please send your CV, transcripts, publications (if available), and GRE/TOEFL scores (if available) to t.gao@depaul.edu.

news

Jan 22, 2025 📢 Our paper, Exploring the Impact of Activation Functions in Training Neural ODEs, has been accepted to ICLR 2025 as an oral presentation (1.8% acceptance rate)!
🔍 Key Highlights:
  • We establish the global convergence of neural ODEs by analyzing their training dynamics.
  • Smoothness & nonlinearity of activation functions are the secret sauce—not only ensuring convergence but also accelerating training.
  • Surprisingly, for large-scale neural ODEs, fixed-step solvers can be more efficient than adaptive solvers!
Looking forward to presenting and discussing these insights at ICLR 2025 in Singapore! 🚀
Dec 21, 2024 🎉 Excited to share that all my master’s students—Nishant Singh, Jagriti Suneja, and Mohammed Azeezulla—have each received a prestigious Graduate Research Assistant Program (GRAP) Award from DePaul CDM! Congratulations to them all for their hard work and dedication!
Oct 18, 2024 💡 Excited to share that I gave a talk on “Building Your Own Customized GPT Teaching Assistant for ‘Free’” at the AI in Teaching Symposium hosted by the DePaul AI Institute! 📚✨ You can check out the recording here. 📺
Oct 10, 2024 📢 I’m honored to be invited to speak at the Math Colloquium hosted by the Department of Mathematical Sciences at DePaul University.
  • 🗓️ Date: Friday, November 1, 2024
  • ⏰ Time: 2:00 PM – 3:00 PM
  • 📍 Location: Arts & Letters Hall, Room 207
  • 🎙️ Topic: “Learnability in Infinite-Depth Neural Networks: Overparameterization and the Role of Gaussian Processes”
I look forward to sharing insights and sparking engaging discussions! See you there! 🚀
Sep 15, 2024 📢 Thrilled to announce my new AI course, Deep Generative Models, is officially approved! Debuting as CSC 594: Topics in Artificial Intelligence in Spring 2025, this course will cover:
  • Variational Autoencoders (VAEs)
  • Generative Adversarial Networks (GANs)
  • Autoregressive Models
  • Normalizing Flows
  • Energy-Based Models
  • Score-Based/Diffusion/Flow Matching Models

selected publications

  1. ICLR
    Exploring the Impact of Activation Functions in Training Neural ODEs
    Tianxiang Gao, Siyuan Sun, Hailiang Liu, and 1 more author
    In the 13th International Conference on Learning Representations (ICLR), 2025
    Oral Presentation (1.8% Acceptance Rate)
  2. NeurIPS
    Wide Neural Networks as Gaussian Processes: Lessons from Deep Equilibrium Models
    Tianxiang Gao, Xiaokai Huo, Hailiang Liu, and 1 more author
    In Advances in Neural Information Processing Systems 36 (NeurIPS), 2023
  3. ICLR
    A Global Convergence Theory for Deep Implicit Networks via Over-Parameterization
    Tianxiang Gao, Hailiang Liu, Jia Liu, and 2 more authors
    In the 10th International Conference on Learning Representations (ICLR), 2022