Publications
See my Google Scholar profile for bibliometric data.

2025
📄 J. Bolte, T. Lê, E. Pauwels, S. Vaiter. Bilevel gradient methods and Morse parametric qualification. Preprint. 2025.
📄 F. El Khoury, E. Pauwels, S. Vaiter, M. Arbel. Learning Theory for Kernel Bilevel Optimization. Preprint. 2025.
📄 J. Bolte, T. Lê, E. Pauwels, S. Vaiter. Geometric and computational hardness of bilevel programming. to appear in Mathematical Programming. 2025.
2024
📄 S. Jaffard, S. Vaiter, P. Reynaud-Bouret. CHANI: Correlation-based Hawkes Aggregation of Neurons with bio-Inspiration. Preprint. 2024.
📄 B. Pascal, S. Vaiter. Risk Estimate under a Nonstationary Autoregressive Model for Data-Driven Reproduction Number Estimation. Preprint. 2024.
📄 M. Cordonnier, N. Keriven, N. Tremblay, S. Vaiter. Convergence of Message Passing Graph Neural Networks with Generic Aggregation On Large Random Graphs. Journal of Machine Learning Research. 2024.
📄 Y. Traonmilin, R. Gribonval, S. Vaiter. A theory of optimal convex regularization for low-dimensional recovery. Information and Inference: A Journal of the IMA. 2024.
📄 H. Ghanem, S. Vaiter, N. Keriven. Gradient Scarcity with Bilevel Optimization for Graph Learning. Transactions on Machine Learning Research. 2024.
📄 Q. Klopfenstein, Q. Bertrand, A. Gramfort, J. Salmon, S. Vaiter. Model identification and local linear convergence of coordinate descent. Optimization Letters. 2024.
📄 M. Dagréou, P. Ablin, S. Vaiter, T. Moreau. How to compute Hessian-vector products? International Conference on Learning Representations (ICLR) Blogposts Track. 2024.
📄 F. Iutzeler, E. Pauwels, S. Vaiter. Derivatives of Stochastic Gradient Descent in parametric optimization. Advances in Neural Information Processing Systems (NeurIPS). 2024.
📄 S. Jaffard, S. Vaiter, A. Muzy, P. Reynaud-Bouret. Provable local learning rule by expert aggregation for a Hawkes network. International Conference on Artificial Intelligence and Statistics (AISTATS). 2024.
📄 M. Dagréou, T. Moreau, S. Vaiter, P. Ablin. A Near-Optimal Algorithm for Bilevel Empirical Risk Minimization. International Conference on Artificial Intelligence and Statistics (AISTATS). 2024.
2023
📄 M. Dagréou, T. Moreau, S. Vaiter, P. Ablin. Borne inférieure de complexité et algorithme quasi-optimal pour la minimisation de risque empirique bi-niveaux. Colloque Francophone de Traitement du Signal et des Images. 2023.
📄 N. Keriven, S. Vaiter. What functions can Graph Neural Networks compute on random graphs? The role of Positional Encoding. Advances in Neural Information Processing Systems (NeurIPS). 2023.
📄 J. Bolte, E. Pauwels, S. Vaiter. One-step differentiation of iterative algorithms. Advances in Neural Information Processing Systems (NeurIPS). 2023.
📄 R. Catellier, S. Vaiter, D. Garreau. On the Robustness of Text Vectorizers. International Conference on Machine Learning (ICML). 2023.
📄 E. Pauwels, S. Vaiter. The Derivatives of Sinkhorn-Knopp Converge. SIAM Journal on Optimization. 2023.
📄 X. Dupuis, S. Vaiter. The Geometry of Sparse Analysis Regularization. SIAM Journal on Optimization. 2023.
📄 H. Ghanem, J. Salmon, N. Keriven, S. Vaiter. Supervised learning of analysis-sparsity priors with automatic differentiation. IEEE Signal Processing Letters. 2023.
📄 M. Cordonnier, N. Keriven, N. Tremblay, S. Vaiter. Convergence of Message Passing Graph Neural Networks with Generic Aggregation on Random Graphs. Graph Signal Processing workshop. 2023.
📄 M. Cordonnier, N. Keriven, N. Tremblay, S. Vaiter. Convergence of Message Passing Graph Neural Networks with Generic Aggregation on Random Graphs. Colloque Francophone de Traitement du Signal et des Images. 2023.
2022
📄 J. Bolte, E. Pauwels, S. Vaiter. Automatic differentiation of nonsmooth iterative algorithms. Advances in Neural Information Processing Systems (NeurIPS). 2022.
📄 M. Dagréou, P. Ablin, S. Vaiter, T. Moreau. A framework for bilevel optimization that enables stochastic and global variance reduction algorithms. Advances in Neural Information Processing Systems (NeurIPS). 2022.
📄 M. Dagréou, P. Ablin, S. Vaiter, T. Moreau. Algorithmes stochastiques et réduction de variance grâce à un nouveau cadre pour l’optimisation bi-niveaux. Colloque Francophone de Traitement du Signal et des Images. 2022.
📄 T. Moreau, M. Massias, A. Gramfort, P. Ablin, P. Bannier, B. Charlier, M. Dagréou, T. Dupré la Tour, G. Durif, C. F. Dantas, Q. Klopfenstein, J. Larsson, E. Lai, T. Lefort, M. Malezieu, B. Moufad, B. T. Nguyen, A. Rakotomamonjy, Z. Ramzi, J. Salmon, S. Vaiter. Benchopt: Reproducible, efficient and collaborative optimization benchmarks. Advances in Neural Information Processing Systems (NeurIPS). 2022.
📄 Q. Bertrand, Q. Klopfenstein, M. Massias, M. Blondel, S. Vaiter, A. Gramfort, J. Salmon. Implicit differentiation for fast hyperparameter selection in non-smooth convex learning. Journal of Machine Learning Research. 2022.
📄 N. Keriven, S. Vaiter. Sparse and Smooth: improved guarantees for Spectral Clustering in the Dynamic Stochastic Block Model. Electronic Journal of Statistics. 2022.
2021
📄 N. Keriven, A. Bietti, S. Vaiter. On the Universality of Graph Neural Networks on Large Random Graphs. Advances in Neural Information Processing Systems (NeurIPS). 2021.
📄 Q. Klopfenstein, S. Vaiter. Linear Support Vector Regression with Linear Constraints. Machine Learning. 2021.
📄 B. Pascal, S. Vaiter, N. Pustelnik, P. Abry. Automated data-driven selection of the hyperparameters for Total-Variation based texture segmentation. Journal of Mathematical Imaging and Vision. 2021.
📄 C. Deledalle, N. Papadakis, J. Salmon, S. Vaiter. Block based refitting in \(\ell_{12}\) sparse regularisation. Journal of Mathematical Imaging and Vision. 2021.
📄 S. Vaiter. From optimization to algorithmic differentiation: a graph detour. PhD Thesis. 2021.
2020
📄 M. Massias, S. Vaiter, A. Gramfort, J. Salmon. Dual Extrapolation for Sparse Generalized Linear Models. Journal of Machine Learning Research. 2020.
📄 N. Keriven, A. Bietti, S. Vaiter. Convergence and Stability of Graph Convolutional Networks on Large Random Graphs. Advances in Neural Information Processing Systems (NeurIPS). 2020.
📄 Q. Bertrand, Q. Klopfenstein, M. Blondel, S. Vaiter, A. Gramfort, J. Salmon. Implicit differentiation of Lasso-type models for hyperparameter optimization. International Conference on Machine Learning (ICML). 2020.
2019
📄 C. Deledalle, N. Papadakis, J. Salmon, S. Vaiter. Refitting solutions promoted by \(\ell_{12}\) sparse analysis regularization with block penalties. International Conference on Scale Space and Variational Methods in Computer Vision (SSVM). 2019.
📄 M. Massias, S. Vaiter, A. Gramfort, J. Salmon. Exploiting regularity in sparse Generalized Linear Models. Signal Processing with Adaptive Sparse Structured Representations (SPARS). 2019.
📄 A. Barbara, A. Jourani, S. Vaiter. Maximal Solutions of Sparse Analysis Regularization. Journal of Optimization Theory and Applications. 2019.
2018
📄 Y. Traonmilin, S. Vaiter, R. Gribonval. Is the 1-norm the best convex sparse regularization? iTWIST. 2018.
📄 Y. Traonmilin, S. Vaiter. Optimality of 1-norm regularization among weighted 1-norms for sparse recovery: a case study on how to find optimal regularizations. New Computational Methods for Inverse Problems (NCMIP). 2018.
📄 S. Vaiter, G. Peyré, J. Fadili. Model Consistency of Partly Smooth Regularizers. IEEE Transactions on Information Theory. 2018.
2017
📄 A. Chambolle, P. Tan, S. Vaiter. Accelerated Alternating Descent Methods for Dykstra-like problems. Journal of Mathematical Imaging and Vision. 2017.
📄 P. Bellec, J. Salmon, S. Vaiter. A Sharp Oracle Inequality for Graph-Slope. Electronic Journal of Statistics. 2017.
📄 S. Vaiter, C. Deledalle, G. Peyré, J. Fadili, C. Dossal. The Degrees of Freedom of Partly Smooth Regularizers. Annals of the Institute of Statistical Mathematics. 2017.
📄 C. Deledalle, N. Papadakis, J. Salmon, S. Vaiter. Characterizing the maximum parameter of the total-variation denoising through the pseudo-inverse of the divergence. Signal Processing with Adaptive Sparse Structured Representations (SPARS). 2017.
📄 C. Deledalle, N. Papadakis, J. Salmon, S. Vaiter. CLEAR: Covariant LEAst-square Re-fitting with applications to image restoration. SIAM Journal on Imaging Sciences. 2017.
2015
📄 S. Vaiter, M. Golbabaee, J. Fadili, G. Peyré. Model Selection with Low Complexity Priors. Information and Inference: A Journal of the IMA. 2015.
📄 S. Vaiter, G. Peyré, J. Fadili. Low Complexity Regularization of Linear Inverse Problems. 2015.
2014
📄 C. Deledalle, S. Vaiter, J. Fadili, G. Peyré. Stein Unbiased GrAdient estimator of the Risk (SUGAR) for multiple parameter selection. SIAM Journal on Imaging Sciences. 2014.
📄 S. Vaiter. Low Complexity Regularizations of Inverse Problems. PhD Thesis. 2014.
2013
📄 S. Vaiter, C. Deledalle, G. Peyré, C. Dossal, J. Fadili. Local Behavior of Sparse Analysis Regularization: Applications to Risk Estimation. Applied and Computational Harmonic Analysis. 2013.
📄 S. Vaiter, G. Peyré, J. Fadili, C. Deledalle, C. Dossal. The degrees of freedom of the group Lasso for a general design. Signal Processing with Adaptive Sparse Structured Representations (SPARS). 2013.
📄 J. Fadili, G. Peyré, S. Vaiter, C. Deledalle, J. Salmon. Stable Recovery with Analysis Decomposable Priors. SAMPTA. 2013.
📄 J. Fadili, G. Peyré, S. Vaiter, C. Deledalle, J. Salmon. Reconstruction Stable par Régularisation Décomposable Analyse. Colloque Francophone de Traitement du Signal et des Images. 2013.
📄 S. Vaiter, G. Peyré, J. Fadili. Robust Polyhedral Regularization. SAMPTA. 2013.
📄 S. Vaiter, G. Peyré, J. Fadili. Robustesse au bruit des régularisations polyhédrales. Colloque Francophone de Traitement du Signal et des Images. 2013.
📄 S. Vaiter, G. Peyré, C. Dossal, J. Fadili. Robust Sparse Analysis Regularization. IEEE Transactions on Information Theory. 2013.
2012
📄 C. Deledalle, S. Vaiter, G. Peyré, J. Fadili, C. Dossal. Unbiased Risk Estimation for Sparse Analysis Regularization. International Conference on Image Processing (ICIP). 2012.
📄 C. Deledalle, S. Vaiter, G. Peyré, J. Fadili, C. Dossal. Proximal Splitting Derivatives for Risk Estimation. New Computational Methods for Inverse Problems (NCMIP). 2012.
📄 S. Vaiter, C. Deledalle, G. Peyré, J. Fadili, C. Dossal. The Degrees of Freedom of the Group Lasso. International Conference on Machine Learning (ICML) Workshop Sparsity, Dictionaries and Projections in Machine Learning and Signal Processing. 2012.
📄 C. Deledalle, S. Vaiter, G. Peyré, J. Fadili, C. Dossal. Risk estimation for matrix recovery with spectral regularization. International Conference on Machine Learning (ICML) Workshop Sparsity, Dictionaries and Projections in Machine Learning and Signal Processing. 2012.