Title: Extending the Scope of Nonparametric Empirical Bayes

 

Abstract: 

In this talk we will describe two applications of empirical Bayes (EB) methodology. EB procedures estimate the prior probability distribution (in a Bayesian statistical model) from the data. In the first part we study the (Gaussian) signal plus noise model with multivariate, heteroscedastic errors. This model arises in many large-scale denoising problems (e.g., in astronomy). We consider the nonparametric maximum likelihood estimator (NPMLE) in this setting. We study the characterization, uniqueness, and computation of the NPMLE, which estimates the unknown (arbitrary) prior by solving an infinite-dimensional convex optimization problem. The EB posterior means based on the NPMLE have low regret, meaning they closely target the oracle posterior means one would compute with the true prior in hand. We demonstrate the adaptive and near-optimal properties of the NPMLE for density estimation, denoising, and deconvolution.
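As a rough illustration of this pipeline (not the solvers or theory from the talk), the following sketch fits a grid-based approximation to the NPMLE in the heteroscedastic Gaussian model by EM, a simple surrogate for the infinite-dimensional convex program, and then plugs the fitted prior in to form the EB posterior means. The simulated data, three-atom "true" prior, and grid size are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Heteroscedastic signal-plus-noise data: X_i = theta_i + sigma_i * Z_i,
# with theta_i ~ G (unknown prior) and Z_i ~ N(0, 1), sigma_i known.
n = 500
theta = rng.choice([-2.0, 0.0, 2.0], size=n)   # hypothetical 3-atom true prior
sigma = rng.uniform(0.5, 1.5, size=n)          # observation-specific noise levels
x = theta + sigma * rng.standard_normal(n)

# Restrict the prior to atoms on a fixed grid and estimate the mixing
# weights by EM (a stand-in for solving the convex NPMLE problem exactly).
grid = np.linspace(x.min(), x.max(), 100)
w = np.full(grid.size, 1.0 / grid.size)

# Likelihood matrix: L[i, j] proportional to N(x_i; grid_j, sigma_i^2).
L = np.exp(-0.5 * ((x[:, None] - grid[None, :]) / sigma[:, None]) ** 2) / sigma[:, None]

for _ in range(200):
    post = w * L                                # E-step: posterior over atoms
    post /= post.sum(axis=1, keepdims=True)
    w = post.mean(axis=0)                       # M-step: update mixing weights

# Empirical Bayes posterior means: plug the estimated prior into Bayes' rule.
post = w * L
post /= post.sum(axis=1, keepdims=True)
denoised = post @ grid
```

On data like this, the plug-in posterior means shrink each observation toward the estimated prior's mass and typically have lower mean squared error than the raw observations, which is the "low regret" behavior described above.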

In the second half of the talk, we consider the problem of Bayesian high-dimensional regression where the regression coefficients are drawn i.i.d. from an unknown prior. To estimate this prior distribution, we propose and study a "variational empirical Bayes" approach --- it combines EB inference with a variational approximation (VA). The idea is to approximate the intractable marginal log-likelihood of the response vector --- also known as the "evidence" --- by the evidence lower bound (ELBO) obtained from a naive mean field (NMF) approximation. We then maximize this lower bound over a suitable class of prior distributions in a computationally feasible way. We show that the marginal log-likelihood function can be (uniformly) approximated by its mean field counterpart. More importantly, under suitable conditions, we establish that this strategy leads to consistent approximation of the true posterior and provides asymptotically valid posterior inference for the regression coefficients.
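To make the NMF/ELBO idea concrete, here is a hypothetical toy sketch: coordinate ascent variational inference (CAVI) on a Gaussian linear model with a fixed N(0, tau^2) prior, a tractable stand-in for the class of priors optimized in the talk. The factorized q(beta) = prod_j N(mu_j, s2_j) is updated one coordinate at a time; each update maximizes the ELBO in that coordinate, so the ELBO is non-decreasing across sweeps. All model dimensions and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy high-dimensional-style regression: y = X @ beta + eps, eps ~ N(0, I),
# with coefficients drawn i.i.d. from a N(0, tau2) prior (illustrative choice).
n, p, tau2 = 200, 20, 1.0
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p)
y = X @ beta + rng.standard_normal(n)

mu = np.zeros(p)          # mean-field means
s2 = np.ones(p)           # mean-field variances
col_norm2 = (X ** 2).sum(axis=0)

def elbo(mu, s2):
    # ELBO = E_q[log p(y | beta)] + E_q[log p(beta)] - E_q[log q(beta)]
    resid = y - X @ mu
    fit = -0.5 * (resid @ resid + s2 @ col_norm2) - 0.5 * n * np.log(2 * np.pi)
    prior = -0.5 * (mu @ mu + s2.sum()) / tau2 - 0.5 * p * np.log(2 * np.pi * tau2)
    entropy = 0.5 * np.log(2 * np.pi * np.e * s2).sum()
    return fit + prior + entropy

history = []
r = y - X @ mu            # running residual for cheap coordinate updates
for _ in range(50):
    for j in range(p):
        r += X[:, j] * mu[j]                       # remove coordinate j
        s2[j] = 1.0 / (col_norm2[j] + 1.0 / tau2)  # optimal q_j variance
        mu[j] = s2[j] * (X[:, j] @ r)              # optimal q_j mean
        r -= X[:, j] * mu[j]                       # restore residual
    history.append(elbo(mu, s2))
```

The final ELBO value is the mean-field surrogate for the evidence; in the variational EB approach sketched above, one would further maximize this quantity over the prior (here, over tau2) rather than holding it fixed.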

 

Bio: 

Bodhi Sen is a Professor of Statistics at Columbia University, New York. He completed his Ph.D. in Statistics at the University of Michigan, Ann Arbor, in 2008. Prior to that, he was a student at the Indian Statistical Institute, Kolkata, where he received his Bachelor's (2002) and Master's (2004) degrees in Statistics. His core statistical research centers on nonparametrics --- function estimation (with special emphasis on shape constrained estimation), theory of optimal transport and its applications to statistics, empirical Bayes procedures, kernel methods, likelihood and bootstrap based inference, etc. He is also actively involved in interdisciplinary research, especially in astronomy.

His honors include the NSF CAREER award (2012) and the Young Statistical Scientist Award (YSSA) in the Theory and Methods category from the International Indian Statistical Association (IISA). He is an elected fellow of the Institute of Mathematical Statistics (IMS).