Sabyasachi is an Associate Professor in the Department of Statistics at the University of Illinois Urbana-Champaign. His research interests are in Nonparametric Statistics, Statistical Signal Processing, Applied Probability, and Statistical Information Theory. He obtained his Ph.D. from Yale University in 2014 and was then a Kruskal Instructor at the University of Chicago until 2017.
Talk: Revisiting Total Variation Denoising: New Perspectives and Generalizations
Abstract: Total Variation Denoising (TVD) is a fundamental denoising/smoothing method. We will present a new local minmax/maxmin formula producing two estimators which sandwich the univariate TVD estimator at every point. Operationally, this formula gives a local definition of TVD as a minmax/maxmin of a simple function of local averages. We will show that this minmax/maxmin formula is generalizable and can be used to define other TVD-like estimators. In particular, we will present higher order polynomial versions of TVD which are defined pointwise and lie between minmax and maxmin optimizations of penalized local polynomial regressions over intervals of different scales. These appear to be new nonparametric regression methods, different from the usual Trend Filtering and from any other existing method in the nonparametric regression toolbox. We call these estimators Minmax Trend Filtering (MTF). We will show how the proposed local definition of the TVD/MTF estimator makes it tractable to bound pointwise estimation errors in terms of a local bias-variance type tradeoff. This type of local analysis of TVD/MTF is new and arguably simpler than existing analyses of TVD/Trend Filtering. In particular, apart from minimax rate optimality over bounded variation and piecewise polynomial classes, our pointwise estimation error bounds also enable us to derive local rates of convergence for (locally) Hölder smooth signals. These local rates offer a new pointwise explanation of the local adaptivity of TVD/MTF, in contrast to existing global (MSE-based) justifications.
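To make the objects in the abstract concrete, below is a minimal Python sketch contrasting the usual global definition of univariate TVD (the fused lasso objective, solved here with cvxpy) with a schematic minmax/maxmin of penalized local averages. The penalty lam / (interval length) and the sign conventions in the sandwich function are illustrative assumptions standing in for the talk's actual formula; the sketch only conveys the shape of the construction.

import numpy as np
import cvxpy as cp

def tvd(y, lam):
    # Global definition of univariate TVD (fused lasso):
    # minimize 0.5 * ||y - theta||^2 + lam * sum_i |theta[i+1] - theta[i]|
    theta = cp.Variable(len(y))
    objective = cp.Minimize(0.5 * cp.sum_squares(y - theta)
                            + lam * cp.norm1(cp.diff(theta)))
    cp.Problem(objective).solve()
    return theta.value

def sandwich(y, lam):
    # Schematic local definition: at each point i, take a maxmin and a minmax
    # of penalized averages over intervals [a, b] containing i. The penalty
    # lam / (b - a + 1) is an assumed stand-in, not the talk's exact formula.
    n = len(y)
    cs = np.concatenate([[0.0], np.cumsum(y)])
    avg = lambda a, b: (cs[b + 1] - cs[a]) / (b - a + 1)  # mean of y[a..b]
    lower, upper = np.empty(n), np.empty(n)
    for i in range(n):
        lower[i] = max(min(avg(a, b) - lam / (b - a + 1) for b in range(i, n))
                       for a in range(i + 1))
        upper[i] = min(max(avg(a, b) + lam / (b - a + 1) for a in range(i + 1))
                       for b in range(i, n))
    return lower, upper

# Example usage on a noisy step function.
rng = np.random.default_rng(0)
y = np.r_[np.zeros(50), np.ones(50)] + 0.1 * rng.standard_normal(100)
fit = tvd(y, lam=1.0)
lo, hi = sandwich(y, lam=1.0)

A point of the local formulation, as the abstract notes, is that each coordinate of the estimator can be reasoned about directly through optimizations of penalized local averages over intervals at different scales, which is what makes a pointwise bias-variance style error analysis possible.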