Announcement_6

📢 Our paper, Exploring the Impact of Activation Functions in Training Neural ODEs, has been accepted to ICLR 2025 as an oral presentation (1.8% acceptance rate)!
🔍 Key Highlights:

  • We establish the global convergence of neural ODEs by analyzing their training dynamics.
  • Smoothness and nonlinearity of the activation function are the secret sauce: they not only ensure convergence but also accelerate training.
  • Surprisingly, for large-scale neural ODEs, fixed-step solvers can be more efficient than adaptive solvers!

Looking forward to presenting and discussing these insights in Singapore! 🚀