Theodor Misiakiewicz is an Assistant Professor in the Department of Statistics and Data Science at Yale University. He obtained his Ph.D. from Stanford in June 2023 and was a Research Assistant Professor at TTIC from September 2023 to June 2024. His research lies broadly at the intersection of machine learning and statistics, with a particular focus on deep learning theory.
Talk: Deterministic Equivalents and Scaling Laws for Random Feature Regression
Abstract: In this talk, we revisit random feature ridge regression (RFRR), a model that shares a number of phenomena with deep learning—such as double descent, benign overfitting, and scaling laws. Our main contribution is the derivation of a general deterministic equivalent for the test error of RFRR. Specifically, under certain concentration properties, we show that the test error is well approximated by a closed-form expression that depends only on the feature map eigenvalues. Notably, our approximation guarantees are non-asymptotic, multiplicative, and independent of the feature map dimension.
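For readers unfamiliar with the model, here is a minimal sketch of random feature ridge regression. All choices below (dimensions, the ReLU feature map, the target function, the ridge level) are hypothetical illustrations, not the setting of the talk:

```python
import numpy as np

# Minimal illustrative sketch of random feature ridge regression (RFRR).
# The dimensions, ReLU feature map, and target below are hypothetical
# demonstration choices, not taken from the talk.

rng = np.random.default_rng(0)
d, N, n, lam = 20, 300, 1000, 1e-3  # input dim, num. features, samples, ridge

W = rng.standard_normal((N, d)) / np.sqrt(d)  # frozen random first-layer weights

def features(X):
    """Random feature map phi(x) = ReLU(W x)."""
    return np.maximum(X @ W.T, 0.0)

def target(X):
    """Illustrative noiseless target: the first input coordinate."""
    return X[:, 0]

# Ridge regression on the random features:
#   a = argmin_a ||Phi a - y||^2 / n + lam ||a||^2
X = rng.standard_normal((n, d))
Phi, y = features(X), target(X)
a = np.linalg.solve(Phi.T @ Phi / n + lam * np.eye(N), Phi.T @ y / n)

# Monte Carlo estimate of the test error on fresh samples
Xtest = rng.standard_normal((5000, d))
test_err = np.mean((features(Xtest) @ a - target(Xtest)) ** 2)
print(f"estimated test error: {test_err:.4f}")
```

Only the weights W are random and frozen; training fits the second-layer coefficients a by ridge regression, which is the object whose test error the deterministic equivalent characterizes.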
These guarantees depart from the usual random matrix theory results, which largely focus on proportional asymptotics and additive error. In contrast, our theory applies to features drawn from infinite-dimensional Hilbert spaces—the typical setting in machine learning—and to models whose test error is itself polynomially vanishing.
To illustrate this precise characterization, we derive tight benign overfitting guarantees and sharp decay rates for RFRR under standard power-law assumptions on the spectrum and target decay.
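Deterministic equivalents of this style are often expressed through an effective regularization that solves a fixed-point equation involving the eigenvalues. The following is a hedged numerical sketch of that general recipe under power-law assumptions; the exponents, normalizations, and the exact form of the fixed-point equation are illustrative assumptions, not the paper's theorem:

```python
import numpy as np

# Illustrative sketch of a deterministic-equivalent-style risk formula under
# power-law assumptions: eigenvalues lam_k ~ k^(-alpha), squared target
# coefficients theta_k^2 ~ k^(-2*beta). All exponents and the particular
# fixed-point equation below are assumptions for demonstration.

K, alpha, beta, ridge = 100_000, 1.5, 1.0, 1e-8
k = np.arange(1, K + 1)
lam_k = k ** (-alpha)       # power-law eigenvalue decay
theta2 = k ** (-2 * beta)   # power-law target-coefficient decay

def effective_reg(n, iters=200):
    """Fixed-point iteration for kappa = ridge + (kappa/n) * sum_k lam_k/(lam_k + kappa)."""
    kappa = ridge + lam_k.sum() / n  # crude initialization from above
    for _ in range(iters):
        kappa = ridge + (kappa / n) * np.sum(lam_k / (lam_k + kappa))
    return kappa

def det_equiv_bias(n):
    """Closed-form bias term predicted by the deterministic equivalent (illustrative form)."""
    kappa = effective_reg(n)
    return np.sum(theta2 * kappa ** 2 / (lam_k + kappa) ** 2)

for n in (100, 1000, 10000):
    print(f"n={n:6d}  predicted test error: {det_equiv_bias(n):.3e}")
```

Plotting the predicted error against n on a log-log scale makes the power-law decay rate—the "scaling law"—visible as a straight line whose slope is governed by the spectrum and target exponents.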
This is based on joint work with Basil Saeed (Stanford), Leonardo Defilippis (ENS), and Bruno Loureiro (ENS).