Cornell Department of Statistics and Data Science

Statistics Seminar Speaker: Bodhi Sen 4/10/2024

Wednesday, April 10, 2024

4:15pm @ G01 Biotech
In Statistics Seminars

Bodhi Sen is a Professor of Statistics at Columbia University, New York. He completed his Ph.D. in Statistics at the University of Michigan, Ann Arbor, in 2008. Prior to that, he was a student at the Indian Statistical Institute, Kolkata, where he received his Bachelor's (2002) and Master's (2004) degrees in Statistics. His core statistical research centers on nonparametrics: function estimation (with special emphasis on shape-constrained estimation), the theory of optimal transport and its applications to statistics, empirical Bayes procedures, kernel methods, and likelihood- and bootstrap-based inference. He is also actively involved in interdisciplinary research, especially in astronomy.

His honors include the NSF CAREER Award (2012) and the Young Statistical Scientist Award (YSSA) in the Theory and Methods category from the International Indian Statistical Association (IISA). He is an elected Fellow of the Institute of Mathematical Statistics (IMS).

Talk: Extending the Scope of Nonparametric Empirical Bayes

Abstract: In this talk we will describe two applications of empirical Bayes (EB) methodology. EB procedures estimate the prior distribution in a latent variable or Bayesian model from the data. In the first part we study the (Gaussian) signal-plus-noise model with multivariate, heteroscedastic errors. This model arises in many large-scale denoising problems (e.g., in astronomy). We consider the nonparametric maximum likelihood estimator (NPMLE) in this setting and study its characterization, uniqueness, and computation; the NPMLE estimates the unknown (arbitrary) prior by solving an infinite-dimensional convex optimization problem. The EB posterior means based on the NPMLE have low regret, meaning they closely target the oracle posterior means one would compute with the true prior in hand. We demonstrate the adaptive and near-optimal properties of the NPMLE for density estimation, denoising, and deconvolution.
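
For readers who want a concrete picture of the NPMLE step, the sketch below (not taken from the talk) assumes the heteroscedastic Gaussian model x_i ~ N(theta_i, sigma_i^2) with the theta_i drawn from an unknown prior. It restricts the prior to a fixed grid of candidate atoms and fits the mixing weights by EM, a common finite-dimensional approximation to the infinite-dimensional convex program; the resulting EB posterior means are the denoised estimates referred to above. The simulated prior, grid size, and variable names are all illustrative choices, not details from the talk.

import numpy as np

rng = np.random.default_rng(0)

# Simulated heteroscedastic data: x_i ~ N(theta_i, sigma_i^2), with the
# theta_i drawn from an unknown prior (here, illustratively, two points).
n = 2000
theta = rng.choice([-2.0, 2.0], size=n)
sigma = rng.uniform(0.5, 1.5, size=n)
x = theta + sigma * rng.normal(size=n)

# Grid approximation of the NPMLE: restrict the prior to atoms on a grid
# and estimate the mixing weights by EM.
atoms = np.linspace(x.min(), x.max(), 300)      # candidate support points
lik = (np.exp(-0.5 * ((x[:, None] - atoms[None, :]) / sigma[:, None]) ** 2)
       / (np.sqrt(2.0 * np.pi) * sigma[:, None]))   # n-by-m likelihood matrix
w = np.full(atoms.size, 1.0 / atoms.size)       # start from uniform weights

for _ in range(500):
    post = lik * w                               # unnormalized posterior over atoms
    post /= post.sum(axis=1, keepdims=True)
    w = post.mean(axis=0)                        # EM update of the prior weights

# EB posterior means under the estimated prior: the denoised estimates
# whose regret (relative to the oracle prior) the abstract refers to.
post = lik * w
post /= post.sum(axis=1, keepdims=True)
theta_hat = post @ atoms
print("MSE of EB posterior means:", np.mean((theta_hat - theta) ** 2))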

In the second half of the talk, we consider Bayesian high-dimensional regression in which the regression coefficients are drawn i.i.d. from an unknown prior. To estimate this prior distribution, we propose and study a "variational empirical Bayes" approach, which combines EB inference with a variational approximation (VA). The idea is to approximate the intractable marginal log-likelihood of the response vector, also known as the "evidence," by the evidence lower bound (ELBO) obtained from a naive mean-field (NMF) approximation, and then to maximize this lower bound over a suitable class of prior distributions in a computationally feasible way. We show that the marginal log-likelihood function can be (uniformly) approximated by its mean-field counterpart. More importantly, under suitable conditions, we establish that this strategy leads to a consistent approximation of the true posterior and provides asymptotically valid posterior inference for the regression coefficients.
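
As a point of reference for the quantities named here (with notation assumed for this write-up, not quoted from the talk): write y for the response vector, beta for the p coefficients drawn i.i.d. from a prior g, and q for a fully factorized (naive mean-field) variational distribution. The evidence and its lower bound can then be written as

\log p_g(y) \;=\; \log \int p(y \mid \beta) \prod_{j=1}^{p} g(d\beta_j)
\;\ge\; \mathbb{E}_{q}\!\left[\log p(y \mid \beta)\right] - \mathrm{KL}\!\Big(q \,\Big\|\, \prod_{j=1}^{p} g\Big)
\;=:\; \mathrm{ELBO}(q, g),
\qquad q(\beta) = \prod_{j=1}^{p} q_j(\beta_j).

Under this reading, the variational EB estimate of the prior maximizes sup_q ELBO(q, g) over a suitable class of priors g, and the optimizing q is the surrogate posterior whose consistency and inferential validity the abstract describes.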

Event Categories

  • Statistics Seminars
  • Special Events
