ANR PRC MAD (2025-2029) id: ANR-24-CE23-1529
Opportunities Ahead!
Stay tuned: several positions will open soon (M2 internships, 3-year PhD positions & postdocs) in Nice and Toulouse!
Presentation
Automatic differentiation (AD) computes derivatives of functions expressed as computer programs both efficiently and accurately, which makes it central to gradient-based optimization of mathematical models. AD decomposes a program into elementary operations and applies the chain rule to their derivatives, in contrast with both symbolic and numerical differentiation.

From a mathematical standpoint, however, AD faces several challenges. Its foundational reliance on the chain rule breaks down for nonsmooth functions; convergence issues may emerge when the function being differentiated involves iterative methods; and approximation difficulties can arise when AD interacts with parametric integrals.

Project MAD aims to reconcile modern AD practice in machine learning pipelines with mathematical guarantees of correctness. The project is structured around three work packages: WP1 proves guarantees for applying AD to iterative algorithms (unrolling), WP2 studies AD in bilevel optimization, and WP3 examines the interplay between Monte Carlo methods and AD.
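The decomposition-plus-chain-rule mechanism, and the nonsmooth ambiguity mentioned above, can be sketched with a toy forward-mode AD based on dual numbers. This is a hypothetical illustration, not part of the project's codebase; the class and function names are made up for the example.

```python
import math

class Dual:
    """Pair (value, derivative); each operation applies the chain rule."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

    def sin(self):
        # Chain rule: (sin u)' = cos(u) * u'
        return Dual(math.sin(self.val), math.cos(self.val) * self.dot)

    def abs(self):
        # Nonsmooth point: at 0 a "derivative" must be picked by convention
        # (here 0.0), although any element of [-1, 1] is a valid subgradient.
        s = 1.0 if self.val > 0 else (-1.0 if self.val < 0 else 0.0)
        return Dual(abs(self.val), s * self.dot)

def derivative(f, x):
    """Differentiate f at x by seeding the dual part with 1.0."""
    return f(Dual(x, 1.0)).dot

# Smooth case: d/dx [x * sin(x)] at x = 2 equals sin(2) + 2*cos(2).
g = derivative(lambda x: x * x.sin(), 2.0)

# Nonsmooth case: |x| at 0 -- this toy AD returns the convention 0.0,
# one of many valid answers, which is exactly the ambiguity WP1-WP3 study.
h = derivative(lambda x: x.abs(), 0.0)
```

Reverse-mode AD (backpropagation), as used in machine learning pipelines, applies the same chain-rule decomposition in the opposite order, and inherits the same nonsmoothness issues.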
Members
Permanent members
Laboratoire J. A. Dieudonné, Université Côte d'Azur
- Jean-Baptiste Caillau
- Yassine Laguel
- Samuel Vaiter (PI & Local-PI)
Toulouse School of Economics
- Jérôme Bolte
- Stéphane Gadat
- Edouard Pauwels (Local-PI)
Institut de Mathématiques de Toulouse, Université Paul Sabatier
- Gersende Fort (Local-PI)
- Franck Iutzeler