Statistics Seminar Speaker: Wei Biao Wu, 5/2/2018

Wednesday May 02 2018

4:15pm @ G01 Biotechnology

The Statistics Seminar speaker for Wednesday, May 2, 2018 is Wei Biao Wu, a professor in the Department of Statistics at the University of Chicago. According to his website: "I am establishing a framework and a systematic theory for high-dimensional inference under dependence, and developing necessary tools so that the dependence can be accounted for. In particular, I am working on model selection, covariance matrix estimation, regression, mean vector estimation, and multiple testing problems for data with dependence. On the probabilistic side, I am interested in deviation and concentration inequalities for dependent random variables which may not have exponential moments. I am also investigating the deep Gaussian approximation problem."

Talk: Error bounds for statistical learning for time dependent data

Abstract: Classical statistical learning theory primarily concerns independent data. In comparison, it has been much less investigated for time-dependent data, which are commonly encountered in economics, engineering, finance, geography, physics, and other fields. In this talk, we focus on concentration inequalities for suprema of empirical processes, which play a fundamental role in statistical learning theory. We derive a Gaussian approximation and an upper bound for the tail probability of the suprema under conditions on the size of the function class, the sample size, the temporal dependence, and the moment conditions of the underlying time series. Due to the dependence and heavy-tailedness, our tail probability bound differs substantially from the classical exponential bounds obtained under the independence assumption in that it involves an extra polynomially decaying term. We allow both short- and long-range dependent processes, where the long-range dependence case has not been previously explored. We show that our tail probability inequality is sharp up to a multiplicative constant. These bounds serve as theoretical guarantees for statistical learning applications under dependence. This work is joint with Likai Chen.
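As a rough illustration of the objects the abstract describes (the precise statement is not given in the abstract, so the symbols and the exact form below are assumptions, not the speaker's result): for a stationary time series \(\{X_t\}\) and a function class \(\mathcal{F}\), the quantity of interest is the supremum of the empirical process,

```latex
\[
S_n(\mathcal{F}) \;=\; \sup_{f \in \mathcal{F}}
\Big| \frac{1}{n} \sum_{t=1}^{n} \big( f(X_t) - \mathbb{E} f(X_t) \big) \Big|,
\]
% A tail bound of the general shape discussed in the abstract mixes a
% Gaussian-type term with an extra polynomially decaying term:
\[
\mathbb{P}\big( S_n(\mathcal{F}) \ge x \big)
\;\le\;
C_1 \exp\!\Big( -\frac{c\, n x^2}{\sigma^2} \Big)
\;+\;
C_2\, n\, x^{-q},
\]
% where q reflects the finite polynomial moment order of the series, and the
% constants depend on the size of the function class and the dependence measure.
```

Under independence and exponential moments, classical results (e.g. Talagrand-type inequalities) give only the exponential term; the polynomial term \(C_2\, n\, x^{-q}\) is a schematic stand-in for the "extra polynomially decaying term" that the abstract says arises from heavy tails and temporal dependence.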

PDF: Wu - Flyer.pdf