Time: 4:30-5:30 p.m.
Date: Wednesday, February 18, 2026
Speaker: Florentin Guth, Faculty Fellow, Center for Data Science, NYU
Title: Learning normalized probability models with dual score matching
 

Abstract: Learning probability models from data is at the heart of many learning tasks. We introduce a new framework for learning normalized energy (log probability) models inspired by diffusion generative models. The energy model is fitted to data by two “score matching” objectives: the first constrains the gradient of the energy (the “score”, as in diffusion models), while the second constrains its *time derivative* along the diffusion. We validate the approach on both synthetic and natural image data: in particular, we show that the estimated log probabilities do not depend on the specific images used during training. Using our learned energy model, we demonstrate that both image probability and local dimensionality vary significantly with image content, challenging simple interpretations of the manifold hypothesis.
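To give a flavor of the first of the two objectives mentioned in the abstract, here is a minimal NumPy sketch of standard denoising score matching for an energy model: the model's score (the negative gradient of the energy) at a noisy point is regressed against the rescaled noise. A closed-form Gaussian energy stands in for the learned network, and the data distribution, noise level, and variable names are illustrative assumptions, not the talk's actual setup; the second objective (constraining the time derivative of the energy along the diffusion) is specific to the presented framework and is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, sigma = 2, 100_000, 0.5  # dimension, sample count, noise level (all illustrative)

# "Clean" data from a standard normal (a stand-in for real data),
# diffused by adding Gaussian noise: y = x + sigma * eps.
x = rng.standard_normal((n, d))
eps = rng.standard_normal((n, d))
y = x + sigma * eps

def energy(y, sigma):
    # Normalized energy (negative log density) of the diffused distribution.
    # Here p_sigma = N(0, (1 + sigma^2) I), so the energy is known in closed form:
    # E(y) = ||y||^2 / (2 (1 + sigma^2)) + (d/2) log(2 pi (1 + sigma^2)).
    var = 1.0 + sigma**2
    return (y**2).sum(-1) / (2 * var) + 0.5 * d * np.log(2 * np.pi * var)

def score(y, sigma):
    # Score = -grad_y energy (closed form here; a neural energy model
    # would obtain this gradient by automatic differentiation).
    return -y / (1.0 + sigma**2)

def dsm_loss(s):
    # Denoising score matching: regress the model score at the noisy
    # point against the target -eps / sigma, averaged over samples.
    return ((s + eps / sigma) ** 2).sum(-1).mean()

loss_true = dsm_loss(score(y, sigma))       # loss of the correct score
loss_zero = dsm_loss(np.zeros_like(y))      # loss of a trivial (zero) score
```

With enough samples, `loss_true` falls below `loss_zero`, reflecting that the denoising score matching objective is minimized by the true score of the diffused distribution.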

Bio: Florentin Guth is a Faculty Fellow in the Center for Data Science at NYU and a Research Fellow in the Center for Computational Neuroscience at the Flatiron Institute. He previously completed his PhD at École Normale Supérieure in Paris, advised by Stéphane Mallat. He is interested in improving our scientific understanding of deep learning: answering why neural networks generalize, what their inductive biases are, and which properties of natural data underlie their success.