Tianxiang Gao

Assistant Professor @ DePaul

Jarvis College of CDM 712

243 South Wabash Avenue

Chicago, IL 60604

Welcome! I’m Tianxiang (天翔), but feel free to call me Adam—I’m happy with either! Since July 2024, I’ve been an Assistant Professor in the School of Computing at DePaul University.

🎓Before joining DePaul, I earned my Ph.D. in Computer Science with a co-major in Applied Mathematics from Iowa State University, where I was co-advised by Dr. Hongyang Gao, Dr. Hailiang Liu, and Dr. Jia (Kevin) Liu. My academic journey began with a bachelor’s degree in Mechanical Engineering from Yantai University.

Research

💡My research focuses on deep learning theory, generative AI, and graph representation learning. I enjoy exploring fundamental problems that arise from real-world applications and distilling insights that guide practice.

🌟Openings!

  • I am looking for multiple Research Assistants (RAs) and interns for Spring 2026 and Summer 2026 to work on projects in deep learning theory, AI for biomedicine, and graph machine learning.

If you are interested, please send your CV, transcripts, publications (if available), and GRE/TOEFL scores (if available) to tgao9@depaul.edu.

news

Dec 15, 2025 🎉 I’m happy to share that our research initiatives have been awarded the prestigious Graduate Research Assistant Program (GRAP) Award from DePaul CDM! The awards support student research in the following areas:
  • Efficient Transformer attention architectures
  • Scalable graph machine learning
The GRAP awards provide tuition waivers and research assistant stipends for the supported students.
Dec 01, 2025 🎉 Our project, “Understanding Neural Scaling Laws via Feature Learning Dynamics”, has received a URC Competitive Research Grant from DePaul University, running from January 2026 to June 2027. Grateful for the support of this foundational deep learning theory research!
Mar 20, 2025 🎉 Exciting news: our new course at DePaul University, CSC 594: Deep Generative Models, is supported by Google Cloud Education Credits. Grateful for Google Cloud’s support! 🚀
Jan 22, 2025 📢 Our paper, Exploring the Impact of Activation Functions in Training Neural ODEs, has been accepted to ICLR 2025 as an oral presentation (1.8% acceptance rate)!
🔍 Key Highlights:
  • We establish the global convergence of neural ODEs by analyzing their training dynamics.
  • Smoothness and nonlinearity of the activation function are the secret sauce: they not only ensure convergence but also accelerate training.
  • Surprisingly, for large-scale neural ODEs, fixed-step solvers can be more efficient than adaptive solvers (see the sketch below)!
Looking forward to presenting and discussing these insights in Singapore!
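To make the solver point above concrete, here is a minimal, illustrative sketch of a neural ODE that pairs a smooth activation with a fixed-step RK4 solver. It assumes PyTorch and the torchdiffeq package; the ODEFunc module and all hyperparameters are hypothetical choices for illustration, not code from the paper.

```python
# Minimal neural ODE sketch (illustrative only).
# Assumes: pip install torch torchdiffeq
import torch
import torch.nn as nn
from torchdiffeq import odeint


class ODEFunc(nn.Module):
    """Vector field f(t, y) parameterized by a small MLP.

    A smooth nonlinearity (here SiLU) reflects the highlight above:
    smooth activations help both convergence and training speed.
    """

    def __init__(self, dim: int = 2, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.SiLU(),            # smooth activation
            nn.Linear(hidden, dim),
        )

    def forward(self, t, y):
        return self.net(y)


func = ODEFunc()
y0 = torch.randn(16, 2)            # batch of initial states
t = torch.linspace(0.0, 1.0, 20)   # integration time grid

# Fixed-step RK4: no adaptive step-size control, so the number of
# function evaluations per forward pass is fixed and predictable.
y = odeint(func, y0, t, method="rk4")
print(y.shape)  # torch.Size([20, 16, 2])
```

Because RK4 takes a fixed number of steps over the time grid, the cost of each forward (and backward) pass is predictable, whereas an adaptive solver may take many more function evaluations as the learned dynamics stiffen during training.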
Dec 21, 2024 🎉 I’m pleased to share that our research projects have received the prestigious Graduate Research Assistant Program (GRAP) Award from DePaul CDM! The awards support student research across the following directions:
  • Scalable graph machine learning
  • Efficient Transformer attention architectures
  • Feature learning theory and scaling laws
The GRAP awards include tuition waivers and research assistant stipends for the supported students.

selected publications

  1. ICLR 2025
    Global Convergence in Neural ODEs: Impact of Activation Functions
    Tianxiang Gao, Siyuan Sun, Hailiang Liu, and 1 more author
    In the 13th International Conference on Learning Representations (ICLR), 2025
    Oral Presentation (1.8% Acceptance Rate)
  2. NeurIPS 2023
    Wide Neural Networks as Gaussian Processes: Lessons from Deep Equilibrium Models
    Tianxiang Gao, Xiaokai Huo, Hailiang Liu, and 1 more author
    In the 36th Advances in Neural Information Processing Systems (NeurIPS), 2023
  3. ICLR 2022
    A global convergence theory for deep implicit networks via over-parameterization
    Tianxiang Gao, Hailiang Liu, Jia Liu, and 2 more authors
    In the 10th International Conference on Learning Representations (ICLR), 2022