The Statistics Seminar speaker for Wednesday, November 7, 2018, is Qiyang Han, an assistant professor of Statistics and Biostatistics at Rutgers University. He obtained a Ph.D. in Statistics from the University of Washington in 2018. He is broadly interested in mathematical statistics and high-dimensional probability. His current research concentrates on abstract empirical process theory and its applications to nonparametric function estimation (with a special focus on shape-restricted problems), Bayesian nonparametrics, and high-dimensional statistics.
Talk: Least squares estimation: beyond Gaussian regression models
Abstract: We study the convergence rate of the least squares estimator (LSE) in a regression model with possibly heavy-tailed errors. Despite its importance in practical applications, theoretical understanding of this problem has been limited. We first show that, from a worst-case perspective, the convergence rate of the LSE in a general nonparametric regression model is given by the maximum of the Gaussian regression rate and the noise rate induced by the errors. In the more difficult statistical model where the errors have only a second moment, we further show that the sizes of the 'localized envelopes' of the model give a sharp interpolation for the convergence rate of the LSE between the worst-case rate and the (optimal) parametric rate. These results reveal both positive and negative aspects of the LSE as an estimation procedure in a heavy-tailed regression setting. The key technical innovation is a new multiplier inequality that sharply controls the size of the multiplier empirical process associated with the LSE, which also finds applications in shape-restricted and sparse linear regression problems.
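As an informal illustration of the setting the abstract describes, the sketch below compares the estimation error of the LSE in a simple parametric regression under Gaussian errors versus heavy-tailed errors with a finite second moment. All names, the model y = 2x + eps, and the choice of a Student-t(2.5) noise distribution are illustrative assumptions, not taken from the talk; the simulation only shows the LSE remaining consistent (error shrinking with n) in both regimes, not the sharp rates the talk establishes.

```python
import numpy as np

rng = np.random.default_rng(0)

def lse_error(n, noise, trials=200):
    """Average squared error of the least squares slope estimate
    for y = 2*x + eps, under the given noise generator."""
    errs = []
    for _ in range(trials):
        x = rng.uniform(-1, 1, n)
        y = 2.0 * x + noise(n)
        # one-dimensional least squares: beta_hat = <x, y> / <x, x>
        beta_hat = x @ y / (x @ x)
        errs.append((beta_hat - 2.0) ** 2)
    return float(np.mean(errs))

gaussian = lambda n: rng.normal(0.0, 1.0, n)
# Student-t with 2.5 degrees of freedom: finite second moment but heavy
# tails -- an illustrative stand-in for the "errors with only a second
# moment" regime discussed in the abstract.
heavy = lambda n: rng.standard_t(2.5, n)

for n in (100, 1000):
    print(n, lse_error(n, gaussian), lse_error(n, heavy))
```

Under heavy-tailed noise the errors are noticeably larger at a given sample size, but still decrease as n grows; the talk's results make the trade-off between these two regimes precise.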