Yuling Yan joined the University of Wisconsin-Madison as an assistant professor of statistics in Fall 2024. Prior to that, he spent one year as a postdoc at MIT. He received his Ph.D. from Princeton University in 2023 and his bachelor's degree from Peking University in 2018. His research interests include statistics, optimization, and their applications to AI and the social sciences. He has received the IMS Lawrence D. Brown Award, the Norbert Wiener Postdoctoral Fellowship from MIT, and the Charlotte Elizabeth Procter Honorific Fellowship from Princeton University.
Talk: Fast Convergence Theory and Acceleration Algorithms for Diffusion Models
Abstract: Diffusion models have revolutionized the field of generative modeling, achieving unprecedented success in generating realistic and diverse content. Despite these empirical advances, the theoretical foundations remain underdeveloped. In this talk, I will present a fast convergence theory for a popular SDE-based sampler under minimal assumptions. Specifically, we demonstrate that with L2-accurate score function estimates, the total variation (TV) distance between the target and generated distributions is bounded by O(d/T) (up to log factors), where d is the data dimensionality and T is the number of steps. When the target distribution is concentrated on or near low-dimensional manifolds within the higher-dimensional ambient space (a common characteristic of natural image distributions), our analysis shows that the SDE-based sampler can adapt to this unknown low-dimensional structure. If time permits, I will also introduce a fast solver for the diffusion ODE, which enables high-quality image generation in approximately five steps.
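For readers less familiar with the setup, the sketch below illustrates the kind of SDE-based sampler the abstract refers to: a DDPM-style Euler-Maruyama discretization of the reverse-time SDE that plugs in a (possibly learned, L2-accurate) score estimate at each of the T steps. It is a minimal illustrative example, not the speaker's algorithm or analysis; the function name sde_sampler, the score_fn interface, and the linear beta noise schedule are all hypothetical choices made here for demonstration.

```python
import numpy as np

def sde_sampler(score_fn, d, T, seed=0):
    """Minimal DDPM-style reverse-SDE sampler (illustrative sketch only).

    score_fn(x, t) should approximate the score (gradient of log p_t) of the
    noised data distribution at step t; in practice this is a learned estimate.
    """
    rng = np.random.default_rng(seed)
    # A simple variance-preserving noise schedule (hypothetical choice).
    betas = np.linspace(1e-4, 0.02, T)
    x = rng.standard_normal(d)  # initialize from pure Gaussian noise
    for t in reversed(range(T)):
        beta = betas[t]
        s = score_fn(x, t)  # plug-in score estimate at step t
        # One Euler-Maruyama step of the reverse-time (variance-preserving) SDE.
        x = x + beta * (0.5 * x + s)
        if t > 0:  # no noise is injected at the final step
            x = x + np.sqrt(beta) * rng.standard_normal(d)
    return x

# Toy usage: for a standard Gaussian target, the exact score is -x at every t,
# so the sampler should return an approximately standard Gaussian draw.
if __name__ == "__main__":
    sample = sde_sampler(lambda x, t: -x, d=10, T=500)
    print(sample.shape)
```

In this toy setting the score is known in closed form; the convergence theory described in the abstract concerns the realistic case where score_fn is only accurate in an L2 sense and quantifies how the discretization error scales with the dimension d and the number of steps T.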