See my Google Scholar profile for bibliometric data.
2025
📄
Bilevel gradient methods and Morse parametric qualification
Jérôme Bolte, Tung Lê, Edouard Pauwels, Samuel Vaiter.
Preprint.
31 pp.
2025.
📄
Learning Theory for Kernel Bilevel Optimization
Farès El Khoury, Edouard Pauwels, Samuel Vaiter, Michael Arbel.
Preprint.
36 pp.
2025.
2024
📄
CHANI: Correlation-based Hawkes Aggregation of Neurons with bio-Inspiration
Sophie Jaffard, Samuel Vaiter, Patricia Reynaud-Bouret.
Preprint.
50 pp.
2024.
📄
Geometric and computational hardness of bilevel programming
Jérôme Bolte, Tung Lê, Edouard Pauwels, Samuel Vaiter.
Preprint.
32 pp.
2024.
📄
Risk Estimate under a Nonstationary Autoregressive Model for Data-Driven Reproduction Number Estimation
Barbara Pascal, Samuel Vaiter.
Preprint.
29 pp.
2024.
📄
Convergence of Message Passing Graph Neural Networks with Generic Aggregation On Large Random Graphs
Matthieu Cordonnier, Nicolas Keriven, Nicolas Tremblay, Samuel Vaiter.
Journal of Machine Learning Research
25(406):1–49.
2024.
📄
A theory of optimal convex regularization for low-dimensional recovery
Yann Traonmilin, Rémi Gribonval, Samuel Vaiter.
Information and Inference: A Journal of the IMA
13(2):66 pp.
2024.
📄
Gradient Scarcity with Bilevel Optimization for Graph Learning
Hashem Ghanem, Samuel Vaiter, Nicolas Keriven.
Transactions on Machine Learning Research
22 pp.
2024. Featured Certification.
📄
Model identification and local linear convergence of coordinate descent
Quentin Klopfenstein, Quentin Bertrand, Alexandre Gramfort, Joseph Salmon, Samuel Vaiter.
Optimization Letters
18:135–154.
2024.
📄
How to compute Hessian-vector products?
Mathieu Dagréou, Pierre Ablin, Samuel Vaiter, Thomas Moreau.
International Conference on Learning Representations (ICLR) Blogposts Track.
2024.
📄
Derivatives of Stochastic Gradient Descent in parametric optimization
Franck Iutzeler, Edouard Pauwels, Samuel Vaiter.
Advances in Neural Information Processing Systems (NeurIPS).
2024.
📄
Provable local learning rule by expert aggregation for a Hawkes network
Sophie Jaffard, Samuel Vaiter, Alexandre Muzy, Patricia Reynaud-Bouret.
International Conference on Artificial Intelligence and Statistics (AISTATS).
2024.
📄
A Near-Optimal Algorithm for Bilevel Empirical Risk Minimization
Mathieu Dagréou, Thomas Moreau, Samuel Vaiter, Pierre Ablin.
International Conference on Artificial Intelligence and Statistics (AISTATS).
2024.
2023
📄
Borne inférieure de complexité et algorithme quasi-optimal pour la minimisation de risque empirique bi-niveaux
Mathieu Dagréou, Thomas Moreau, Samuel Vaiter, Pierre Ablin.
Colloque Francophone de Traitement du Signal et des Images.
2023.
📄
What functions can Graph Neural Networks compute on random graphs? The role of Positional Encoding
Nicolas Keriven, Samuel Vaiter.
Advances in Neural Information Processing Systems (NeurIPS).
2023.
📄
One-step differentiation of iterative algorithms
Jérôme Bolte, Edouard Pauwels, Samuel Vaiter.
Advances in Neural Information Processing Systems (NeurIPS).
2023. (Spotlight paper).
📄
On the Robustness of Text Vectorizers
Rémi Catellier, Samuel Vaiter, Damien Garreau.
International Conference on Machine Learning (ICML).
2023.
📄
The Derivatives of Sinkhorn-Knopp Converge
Edouard Pauwels, Samuel Vaiter.
SIAM Journal on Optimization
33(3):1494–1517.
2023.
📄
The Geometry of Sparse Analysis Regularization
Xavier Dupuis, Samuel Vaiter.
SIAM Journal on Optimization
33(2):842–867.
2023.
📄
Supervised learning of analysis-sparsity priors with automatic differentiation
Hashem Ghanem, Joseph Salmon, Nicolas Keriven, Samuel Vaiter.
IEEE Signal Processing Letters
30:339–343.
2023.
📄
Convergence of Message Passing Graph Neural Networks with Generic Aggregation on Random Graphs
Matthieu Cordonnier, Nicolas Keriven, Nicolas Tremblay, Samuel Vaiter.
Graph Signal Processing workshop.
2023.
📄
Convergence of Message Passing Graph Neural Networks with Generic Aggregation on Random Graphs
Matthieu Cordonnier, Nicolas Keriven, Nicolas Tremblay, Samuel Vaiter.
Colloque Francophone de Traitement du Signal et des Images.
2023.
2022
📄
Automatic differentiation of nonsmooth iterative algorithms
Jérôme Bolte, Edouard Pauwels, Samuel Vaiter.
Advances in Neural Information Processing Systems (NeurIPS).
2022.
📄
A framework for bilevel optimization that enables stochastic and global variance reduction algorithms
Mathieu Dagréou, Pierre Ablin, Samuel Vaiter, Thomas Moreau.
Advances in Neural Information Processing Systems (NeurIPS).
2022. (Oral paper).
📄
Algorithmes stochastiques et réduction de variance grâce à un nouveau cadre pour l’optimisation bi-niveaux
Mathieu Dagréou, Pierre Ablin, Samuel Vaiter, Thomas Moreau.
Colloque Francophone de Traitement du Signal et des Images.
2022.
📄
Benchopt: Reproducible, efficient and collaborative optimization benchmarks
Thomas Moreau, Mathurin Massias, Alexandre Gramfort, Pierre Ablin, Pierre-Antoine Bannier, Benjamin Charlier, Mathieu Dagréou, Tom Dupré la Tour, Ghislain Durif, Cássio F. Dantas, Quentin Klopfenstein, Johan Larsson, En Lai, Tanguy Lefort, Benoît Malézieux, Badr Moufad, Binh T. Nguyen, Alain Rakotomamonjy, Zaccharie Ramzi, Joseph Salmon, Samuel Vaiter.
Advances in Neural Information Processing Systems (NeurIPS).
2022.
📄
Implicit differentiation for fast hyperparameter selection in non-smooth convex learning
Quentin Bertrand, Quentin Klopfenstein, Mathurin Massias, Mathieu Blondel, Samuel Vaiter, Alexandre Gramfort, Joseph Salmon.
Journal of Machine Learning Research
23(149):1–43.
2022.
📄
Sparse and Smooth: improved guarantees for Spectral Clustering in the Dynamic Stochastic Block Model
Nicolas Keriven, Samuel Vaiter.
Electronic Journal of Statistics
16(1):1330–1366.
2022.
2021
📄
On the Universality of Graph Neural Networks on Large Random Graphs
Nicolas Keriven, Alberto Bietti, Samuel Vaiter.
Advances in Neural Information Processing Systems (NeurIPS).
2021.
📄
Linear Support Vector Regression with Linear Constraints
Quentin Klopfenstein, Samuel Vaiter.
Machine Learning
110(7):1939–1974.
2021.
📄
Automated data-driven selection of the hyperparameters for Total-Variation based texture segmentation
Barbara Pascal, Samuel Vaiter, Nelly Pustelnik, Patrice Abry.
Journal of Mathematical Imaging and Vision
63:923–952.
2021.
📄
Block based refitting in ℓ₁₂ sparse regularisation
Charles-Alban Deledalle, Nicolas Papadakis, Joseph Salmon, Samuel Vaiter.
Journal of Mathematical Imaging and Vision
63(2):216–236.
2021.
📄
From optimization to algorithmic differentiation: a graph detour
Samuel Vaiter.
2021.
2020
📄
Dual Extrapolation for Sparse Generalized Linear Models
Mathurin Massias, Samuel Vaiter, Alexandre Gramfort, Joseph Salmon.
Journal of Machine Learning Research
21(234):1–33.
2020.
📄
Convergence and Stability of Graph Convolutional Networks on Large Random Graphs
Nicolas Keriven, Alberto Bietti, Samuel Vaiter.
Advances in Neural Information Processing Systems (NeurIPS).
2020. (Spotlight paper).
📄
Implicit differentiation of Lasso-type models for hyperparameter optimization
Quentin Bertrand, Quentin Klopfenstein, Mathieu Blondel, Samuel Vaiter, Alexandre Gramfort, Joseph Salmon.
International Conference on Machine Learning (ICML).
2020.
2019
📄
Refitting solutions promoted by ℓ₁₂ sparse analysis regularization with block penalties
Charles-Alban Deledalle, Nicolas Papadakis, Joseph Salmon, Samuel Vaiter.
International Conference on Scale Space and Variational Methods in Computer Vision (SSVM).
2019.
📄
Exploiting regularity in sparse Generalized Linear Models
Mathurin Massias, Samuel Vaiter, Alexandre Gramfort, Joseph Salmon.
Signal Processing with Adaptive Sparse Structured Representations (SPARS).
2019.
📄
Maximal Solutions of Sparse Analysis Regularization
Abdessamad Barbara, Abderrahim Jourani, Samuel Vaiter.
Journal of Optimization Theory and Applications
180(2):371–396.
2019.
2018
📄
Is the 1-norm the best convex sparse regularization?
Yann Traonmilin, Samuel Vaiter, Rémi Gribonval.
iTWIST.
2018.
📄
Optimality of 1-norm regularization among weighted 1-norms for sparse recovery: a case study on how to find optimal regularizations
Yann Traonmilin, Samuel Vaiter.
New Computational Methods for Inverse Problems (NCMIP).
2018.
📄
Model Consistency of Partly Smooth Regularizers
Samuel Vaiter, Gabriel Peyré, Jalal Fadili.
IEEE Transactions on Information Theory
64(3):1725–1737.
2018.
2017
📄
Accelerated Alternating Descent Methods for Dykstra-like problems
Antonin Chambolle, Pauline Tan, Samuel Vaiter.
Journal of Mathematical Imaging and Vision
59(3):481–497.
2017.
📄
A Sharp Oracle Inequality for Graph-Slope
Pierre Bellec, Joseph Salmon, Samuel Vaiter.
Electronic Journal of Statistics
11(2):4851–4870.
2017.
📄
The Degrees of Freedom of Partly Smooth Regularizers
Samuel Vaiter, Charles-Alban Deledalle, Gabriel Peyré, Jalal Fadili, Charles Dossal.
Annals of the Institute of Statistical Mathematics
69(4):791–832.
2017.
📄
Characterizing the maximum parameter of the total-variation denoising through the pseudo-inverse of the divergence
Charles-Alban Deledalle, Nicolas Papadakis, Joseph Salmon, Samuel Vaiter.
Signal Processing with Adaptive Sparse Structured Representations (SPARS).
2017.
📄
CLEAR: Covariant LEAst-square Re-fitting with applications to image restoration
Charles-Alban Deledalle, Nicolas Papadakis, Joseph Salmon, Samuel Vaiter.
SIAM Journal on Imaging Sciences
10(1):243–284.
2017.
2015
📄
Model Selection with Low Complexity Priors
Samuel Vaiter, Mohammad Golbabaee, Jalal Fadili, Gabriel Peyré.
Information and Inference: A Journal of the IMA
4(3):230–287.
2015.
📄
Low Complexity Regularization of Linear Inverse Problems
Samuel Vaiter, Gabriel Peyré, Jalal Fadili.
2015.
2014
📄
Stein Unbiased GrAdient estimator of the Risk (SUGAR) for multiple parameter selection
Charles-Alban Deledalle, Samuel Vaiter, Jalal Fadili, Gabriel Peyré.
SIAM Journal on Imaging Sciences
7(4):2448–2487.
2014.
📄
Low Complexity Regularizations of Inverse Problems
Samuel Vaiter.
2014.
2013
📄
Local Behavior of Sparse Analysis Regularization: Applications to Risk Estimation
Samuel Vaiter, Charles-Alban Deledalle, Gabriel Peyré, Charles Dossal, Jalal Fadili.
Applied and Computational Harmonic Analysis
35(3):433–451.
2013.
📄
The degrees of freedom of the group Lasso for a general design
Samuel Vaiter, Gabriel Peyré, Jalal Fadili, Charles-Alban Deledalle, Charles Dossal.
Signal Processing with Adaptive Sparse Structured Representations (SPARS).
2013.
📄
Stable Recovery with Analysis Decomposable Priors
Jalal Fadili, Gabriel Peyré, Samuel Vaiter, Charles-Alban Deledalle, Joseph Salmon.
SAMPTA.
2013.
📄
Reconstruction Stable par Régularisation Décomposable Analyse
Jalal Fadili, Gabriel Peyré, Samuel Vaiter, Charles-Alban Deledalle, Joseph Salmon.
Colloque Francophone de Traitement du Signal et des Images.
2013.
📄
Robust Polyhedral Regularization
Samuel Vaiter, Gabriel Peyré, Jalal Fadili.
SAMPTA.
2013.
📄
Robustesse au bruit des régularisations polyhédrales
Samuel Vaiter, Gabriel Peyré, Jalal Fadili.
Colloque Francophone de Traitement du Signal et des Images.
2013.
📄
Robust Sparse Analysis Regularization
Samuel Vaiter, Gabriel Peyré, Charles Dossal, Jalal Fadili.
IEEE Transactions on Information Theory
59(4):2001–2016.
2013.
2012
📄
Unbiased Risk Estimation for Sparse Analysis Regularization
Charles-Alban Deledalle, Samuel Vaiter, Gabriel Peyré, Jalal Fadili, Charles Dossal.
International Conference on Image Processing (ICIP).
2012.
📄
Proximal Splitting Derivatives for Risk Estimation
Charles-Alban Deledalle, Samuel Vaiter, Gabriel Peyré, Jalal Fadili, Charles Dossal.
New Computational Methods for Inverse Problems (NCMIP).
2012.
📄
The Degrees of Freedom of the Group Lasso
Samuel Vaiter, Charles-Alban Deledalle, Gabriel Peyré, Jalal Fadili, Charles Dossal.
International Conference on Machine Learning (ICML) Workshop Sparsity, Dictionaries and Projections in Machine Learning and Signal Processing.
2012.
📄
Risk estimation for matrix recovery with spectral regularization
Charles-Alban Deledalle, Samuel Vaiter, Gabriel Peyré, Jalal Fadili, Charles Dossal.
International Conference on Machine Learning (ICML) Workshop Sparsity, Dictionaries and Projections in Machine Learning and Signal Processing.
2012.