Title: On Bandwidth Selection in Empirical Risk Minimization
Abstract: I will begin with a short summary of bandwidth selection methods, such as cross-validation and Lepski-type procedures, explaining their strengths and weaknesses. Our contributions, joint with J. Lederer and S. Loustau, consist of a data-driven selection of the bandwidth for empirical risk minimization. One typically faces this issue in pointwise estimation in the setting of heavy-tailed noise with local M-estimators. To this end, we apply Lepski's method to these estimators to obtain optimal (minimax) results in pointwise estimation. We will also explain how to choose the robustness of M-estimators. In learning theory, many authors have recently investigated supervised and unsupervised learning with errors in variables. As a rule, such problems (viewed as inverse problems) require plugging deconvolution kernels into the empirical risk (and then selecting a bandwidth), as in Hall and Lahiri in quantile and moment estimation, Loustau and Marteau in noisy discriminant analysis, Loustau in noisy learning, Chichignoud and Loustau in noisy clustering, and Dattner, Reiß and Trabs in quantile estimation. We will especially present the result of Chichignoud and Loustau on noisy clustering, where a new version of Lepski's method is provided to obtain excess risk bounds.
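To make the bandwidth-selection idea concrete, here is a minimal sketch of a Lepski-type rule for pointwise kernel density estimation. It is not the procedure from the talk: the Gaussian kernel, the grid, the threshold of order sqrt(log n / (n h)), and the constant `c` are all illustrative assumptions. The rule keeps the largest bandwidth whose estimate stays within the stochastic threshold of every smaller-bandwidth estimate.

```python
import numpy as np

def kernel_density_at(x0, data, h):
    # Gaussian kernel density estimate at the point x0 with bandwidth h
    return np.mean(np.exp(-0.5 * ((x0 - data) / h) ** 2)) / (h * np.sqrt(2 * np.pi))

def lepski_bandwidth(x0, data, bandwidths, c=1.0):
    """Lepski-type rule (illustrative): pick the largest h on the grid whose
    estimate is consistent with all smaller-h estimates up to the noise level."""
    n = len(data)
    hs = np.sort(np.asarray(bandwidths))  # increasing bandwidth grid
    est = {h: kernel_density_at(x0, data, h) for h in hs}
    # Assumed stochastic threshold of order sqrt(log n / (n h))
    thresh = {h: c * np.sqrt(np.log(n) / (n * h)) for h in hs}
    chosen = hs[0]
    for i, h in enumerate(hs[1:], start=1):
        # compare the candidate against every smaller bandwidth on the grid
        if all(abs(est[h] - est[hp]) <= thresh[hp] + thresh[h] for hp in hs[:i]):
            chosen = h
        else:
            break
    return chosen, est[chosen]

rng = np.random.default_rng(0)
sample = rng.normal(0.0, 1.0, size=2000)   # toy data; the talk concerns heavy tails
grid = np.geomspace(0.05, 1.0, 12)
h_star, f_hat = lepski_bandwidth(0.0, sample, grid)
```

In this toy Gaussian setting the rule tends to accept large bandwidths, since the density is smooth at 0; with heavy-tailed data or a rough target, the comparison fails earlier and a smaller bandwidth is selected.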
Refreshments will be served after the seminar in 1181 Comstock Hall.