The Statistics Seminar speaker for Wednesday, May 5, 2021, is Emmanuel Abbé, chair and professor of mathematical data science at EPFL (École polytechnique fédérale de Lausanne) in Switzerland. Abbé obtained his M.Sc. in Mathematics at EPFL and his Ph.D. in EECS at MIT. He was an assistant and then associate professor at Princeton University until 2018. His research interests are in information theory, machine learning, and related mathematical fields. He is the recipient of the Foundation Latsis International Prize, the Bell Labs Prize, the von Neumann Fellowship, and the IEEE Information Theory Society Paper Award. He is part of the NSF-Simons Collaboration on the Theoretical Foundations of Deep Learning.
Talk: Fundamental limits of differentiable learning
The Zoom link to this virtual talk will be sent through the SDS listservs.
Abstract: We consider the statistical learning paradigm consisting of training general neural networks (NN) with (S)GD. What is the class of functions that can be learned with this paradigm? How does it compare to known classes of learning such as statistical query (SQ) learning? Is depth/overparametrization needed to learn certain classes? We show that SGD on NN is equivalent to PAC while GD on NN is equivalent to SQ, obtaining a separation between the classes learned by SGD and GD. We further give a strong separation with kernel methods, exhibiting function classes that kernels cannot learn with non-trivial edge but that GD on a differentiable model can learn with perfect edge. Based on joint works with C. Sandon and with E. Malach, P. Kamath, and N. Srebro.
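To make the distinction drawn in the abstract concrete, the sketch below (not taken from the talk; all names, the toy parity task, and the hyperparameters are illustrative assumptions) contrasts the two training regimes on a tiny one-hidden-layer network: per-sample SGD, which updates on one random example at a time, versus full-batch GD, which updates on the averaged gradient over the whole dataset.

    # Minimal illustrative sketch: per-sample SGD vs. full-batch GD on a
    # one-hidden-layer network, toy 3-bit parity task. Hypothetical setup,
    # not the construction from the talk.
    import numpy as np

    rng = np.random.default_rng(0)

    # All 3-bit inputs in {-1,+1}^3; label = parity (product) of the bits.
    X = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)], float)
    y = np.prod(X, axis=1)

    def init_params(width=16, d=3):
        return {"W": rng.normal(size=(width, d)) / np.sqrt(d),
                "v": rng.normal(size=width) / np.sqrt(width)}

    def forward(p, x):
        h = np.tanh(p["W"] @ x)      # hidden-layer activations
        return p["v"] @ h, h         # scalar output and activations

    def grad(p, x, t):
        # Gradient of the squared loss 0.5*(f(x)-t)^2 for one example.
        out, h = forward(p, x)
        err = out - t
        return {"W": err * np.outer(p["v"] * (1 - h**2), x),
                "v": err * h}

    def train(p, mode="sgd", lr=0.05, steps=2000):
        p = {k: v.copy() for k, v in p.items()}
        for _ in range(steps):
            if mode == "sgd":        # SGD: one randomly drawn sample per step
                i = rng.integers(len(X))
                g = grad(p, X[i], y[i])
            else:                    # GD: average gradient over the full batch
                gs = [grad(p, X[i], y[i]) for i in range(len(X))]
                g = {k: np.mean([gi[k] for gi in gs], axis=0) for k in p}
            for k in p:
                p[k] -= lr * g[k]
        return p

    p0 = init_params()
    for mode in ("sgd", "gd"):
        p = train(p0, mode=mode)
        preds = np.sign([forward(p, x)[0] for x in X])
        print(mode, "accuracy:", np.mean(preds == y))

On this toy problem both regimes fit the data; the separations in the talk concern what each paradigm can learn efficiently in general, e.g. with limited gradient precision, not the behavior of this particular example.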