Tianxiang Gao

Assistant Professor @ DePaul | Understanding feature learning in deep neural networks and applying generative AI to healthcare


Jarvis College of CDM 712

243 South Wabash Avenue

Chicago, IL 60604

Welcome! I’m Tianxiang (天翔), but feel free to call me Adam—I’m happy with either! Since July 2024, I’ve been an Assistant Professor in the School of Computing at DePaul University. I am also currently a Visiting Scholar at the Institute for Mathematical and Statistical Innovation at the University of Chicago.

Before joining DePaul, I earned my Ph.D. in Computer Science with a co-major in Applied Mathematics from Iowa State University, where I was co-advised by Dr. Hongyang Gao, Dr. Hailiang Liu, and Dr. Jia (Kevin) Liu. My academic journey began with a bachelor’s degree in Mechanical Engineering from Yantai University.

Research

My research lies at the intersection of deep learning theory and AI for healthcare. I am particularly interested in fundamental questions that arise from real-world applications and in developing principled insights and guidelines that advance both understanding and practice.

On the theoretical side, my team and I study how modern neural networks learn useful representations during training at scale, particularly in deep architectures, with the goal of uncovering principles that govern their behavior and scalability.

On the applied side, we develop AI methods for healthcare, focusing on biomedical data and medical imaging, using generative AI techniques, including diffusion models and language models. Through these efforts, we aim to leverage AI as a powerful tool to advance biomedical research and improve our ability to understand and analyze complex healthcare data.

Opportunities to Collaborate

I am always happy to collaborate with motivated students and researchers who share an interest in deep learning theory and AI for healthcare. If you would like to work with us, please reach out and send your CV, transcripts, publications (if available), and GRE/TOEFL scores (if available) to tgao9@depaul.edu or gaotx@uchicago.edu.

news

Feb 10, 2026 Pleased to share that I’ve been appointed as a Visiting Research Member at the Institute for Mathematical and Statistical Innovation (IMSI) at the University of Chicago for Spring 2026. I will participate in IMSI’s long program on Theoretical Advances in Reinforcement Learning and Control. Looking forward to the discussions and collaborations ahead.
Dec 15, 2025 🎉 I’m happy to share that our research initiatives have been awarded the prestigious Graduate Research Assistant Program (GRAP) Award from DePaul CDM! The awards support student research in the following areas:
  • Efficient Transformer attention architectures
  • Scalable graph machine learning
The GRAP awards provide tuition waivers and research assistant stipends for the supported students.
Dec 01, 2025 🎉 Our project, “Understanding Neural Scaling Laws via Feature Learning Dynamics”, has received a URC Competitive Research Grant from DePaul University, running from January 2026 to June 2027. Grateful for the support of this foundational research in deep learning theory.
Mar 20, 2025 🎉 Exciting news: Our new course, CSC 594: Deep Generative Models at DePaul University, is supported by Google Cloud Education Credits! Grateful for Google Cloud’s support! 🚀
Jan 22, 2025 📢 Our paper, Exploring the Impact of Activation Functions in Training Neural ODEs, has been accepted to ICLR 2025 as an oral presentation (1.8% acceptance rate)!
🔍 Key Highlights:
  • We establish the global convergence of neural ODEs by analyzing their training dynamics.
  • Smoothness and nonlinearity of activation functions are the secret sauce—not only ensuring convergence but also accelerating training.
  • Surprisingly, for large-scale neural ODEs, fixed-step solvers can be more efficient than adaptive solvers!
Looking forward to presenting and discussing these insights in Singapore!

selected publications

  1. ICLR 2025
    Global Convergence in Neural ODEs: Impact of Activation Functions
    Tianxiang Gao, Siyuan Sun, Hailiang Liu, and 1 more author
    In the 13th International Conference on Learning Representations (ICLR), 2025
    Oral Presentation (1.8% Acceptance Rate)
  2. NeurIPS 2023
    Wide Neural Networks as Gaussian Processes: Lessons from Deep Equilibrium Models
    Tianxiang Gao, Xiaokai Huo, Hailiang Liu, and 1 more author
    In the 36th Advances in Neural Information Processing Systems (NeurIPS), 2023
  3. ICLR 2022
    A Global Convergence Theory for Deep Implicit Networks via Over-Parameterization
    Tianxiang Gao, Hailiang Liu, Jia Liu, and 2 more authors
    In the 10th International Conference on Learning Representations (ICLR), 2022