Safe Grid Search with Optimal Complexity

Probability and Statistics Seminar

Location:
Seminar room M3-324
Speaker:
Eugène Ndiaye
Affiliation: 
Télécom ParisTech
Date:
Wednesday, October 17, 2018, 10:30-11:30
Abstract:

Popular machine learning estimators involve regularization parameters that can be challenging to tune. Standard strategies consist in computing, by grid search, a discretized approximation of the entire solution path. Unfortunately, the size of the discretized grid can be exponential in the dimension of the problem (cf. the complexity of LARS). We revisit the techniques for approximating the regularization path up to a predefined tolerance in a unified framework and show that its complexity is $O(1/\sqrt[d]{\epsilon})$ for uniformly convex losses of order $d>0$ and $O(1/\sqrt{\epsilon})$ for Generalized Self-Concordant functions. This includes examples such as the least-squares and logistic losses. Finally, we leverage our technique to derive refined bounds on the validation error and propose practical algorithms for hyperparameter tuning with a global convergence guarantee at any desired accuracy $\epsilon_v$ on the validation set.
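To make the idea concrete, here is a minimal Python sketch of an $\epsilon$-approximate regularization path for the Lasso: after solving at one grid point, the duality gap certifies how far $\lambda$ can be decreased before the current solution stops being $\epsilon$-suboptimal, which determines the next grid point. This is an illustration, not the speaker's exact algorithm: the function names (`lasso_duality_gap`, `next_lambda`, `eps_path`) are hypothetical, and the step size is located here by simple bisection on the gap rather than by a closed-form rule.

```python
import numpy as np
from sklearn.linear_model import Lasso


def lasso_duality_gap(X, y, beta, lam):
    """Duality gap of P(beta) = 0.5 * ||y - X beta||^2 + lam * ||beta||_1."""
    r = y - X @ beta
    # Rescale the residual so the dual point satisfies ||X^T theta||_inf <= 1.
    theta = r / max(lam, np.linalg.norm(X.T @ r, ord=np.inf))
    primal = 0.5 * r @ r + lam * np.abs(beta).sum()
    dual = 0.5 * y @ y - 0.5 * (y - lam * theta) @ (y - lam * theta)
    return primal - dual


def next_lambda(X, y, beta, lam, lam_min, eps, n_bisect=50):
    """Smallest lambda in [lam_min, lam] at which the current beta is still
    eps-suboptimal, located by bisection on the duality gap."""
    lo, hi = lam_min, lam
    if lasso_duality_gap(X, y, beta, lo) <= eps:
        return lo
    for _ in range(n_bisect):
        mid = 0.5 * (lo + hi)
        if lasso_duality_gap(X, y, beta, mid) <= eps:
            hi = mid
        else:
            lo = mid
    return hi  # a practical version would also enforce a minimal step here


def eps_path(X, y, eps, lam_min):
    """Adaptive grid such that every lambda in [lam_min, lambda_max] admits
    an eps-suboptimal solution among the computed grid points."""
    n, p = X.shape
    lam = np.linalg.norm(X.T @ y, ord=np.inf)  # lambda_max: beta = 0 optimal
    beta = np.zeros(p)
    path = [(lam, beta)]
    while lam > lam_min:
        # Largest admissible step: the previous solution certifies every
        # lambda in (lam_next, lam], so no grid point is needed in between.
        lam = next_lambda(X, y, beta, lam, lam_min, eps)
        # Solve at the new grid point; sklearn's Lasso minimizes
        # ||y - X b||^2 / (2n) + alpha * ||b||_1, hence alpha = lam / n.
        beta = Lasso(alpha=lam / n, fit_intercept=False,
                     tol=1e-10, max_iter=100_000).fit(X, y).coef_
        path.append((lam, beta))
    return path


# Example usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
y = X @ rng.standard_normal(20) + 0.1 * rng.standard_normal(100)
lam_max = np.linalg.norm(X.T @ y, ord=np.inf)
grid = eps_path(X, y, eps=1e-2, lam_min=1e-2 * lam_max)
print(f"{len(grid)} grid points for eps = 1e-2")
```

Since the least-squares loss is uniformly convex of order $d=2$, the number of grid points produced this way should grow like $O(1/\sqrt{\epsilon})$ as the tolerance shrinks, in line with the bound stated in the abstract.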