Gradient flows for empirical Bayes in high-dimensional linear models

Bibliographic Details
Title: Gradient flows for empirical Bayes in high-dimensional linear models
Authors: Fan, Zhou; Guan, Leying; Shen, Yandi; Wu, Yihong
Publication Year: 2023
Collection: Mathematics; Statistics
Subject Terms: Mathematics - Statistics Theory, Statistics - Methodology
More Details: Empirical Bayes provides a powerful approach to learning and adapting to latent structure in data. Theory and algorithms for empirical Bayes have a rich literature for sequence models, but are less understood in settings where latent variables and data interact through more complex designs. In this work, we study empirical Bayes estimation of an i.i.d. prior in Bayesian linear models, via the nonparametric maximum likelihood estimator (NPMLE). We introduce and study a system of gradient flow equations for optimizing the marginal log-likelihood, jointly over the prior and posterior measures in its Gibbs variational representation, using a smoothed reparametrization of the regression coefficients. A diffusion-based implementation yields a Langevin dynamics MCEM algorithm, where the prior law evolves continuously over time to optimize a sequence-model log-likelihood defined by the coordinates of the current Langevin iterate. We show consistency of the NPMLE as $n, p \rightarrow \infty$ under mild conditions, including settings of random sub-Gaussian designs when $n \asymp p$. In high noise, we prove a uniform log-Sobolev inequality for the mixing of Langevin dynamics, for possibly misspecified priors and non-log-concave posteriors. We then establish polynomial-time convergence of the joint gradient flow to a near-NPMLE if the marginal negative log-likelihood is convex in a sub-level set of the initialization.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2312.12708
Accession Number: edsarx.2312.12708
Database: arXiv
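
The abstract above describes a Langevin dynamics MCEM scheme in which the prior law evolves alongside the Langevin iterate. The following is a minimal illustrative sketch of that idea, not the authors' implementation: it models the prior as a Gaussian-smoothed mixture on a fixed grid (a crude stand-in for the paper's smoothed reparametrization of a discrete NPMLE prior) and alternates an unadjusted Langevin step on the coefficients with a damped EM update of the mixture weights. The grid `atoms`, bandwidth `tau`, step size `eta`, damping factor, and iteration count are all assumed tuning choices for illustration.

```python
# Minimal illustrative sketch of a Langevin-dynamics MCEM loop for NPMLE
# in a Bayesian linear model. NOT the authors' implementation: the grid
# `atoms`, bandwidth `tau`, step size `eta`, and damping are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear model: y = X @ theta_true + noise
n, p, sigma = 200, 100, 1.0
X = rng.standard_normal((n, p)) / np.sqrt(n)
theta_true = rng.choice([-2.0, 0.0, 2.0], size=p)
y = X @ theta_true + sigma * rng.standard_normal(n)

# Prior modeled as a Gaussian mixture on a fixed grid -- a smoothed
# stand-in for a discrete NPMLE prior; the weights w are the unknowns.
atoms = np.linspace(-4.0, 4.0, 41)
tau = 0.25                                   # smoothing bandwidth (assumed)
w = np.full(atoms.size, 1.0 / atoms.size)    # start from the uniform prior

def prior_terms(theta, w):
    """Return log-density, score, and mixture kernels of pi at theta."""
    diffs = theta[:, None] - atoms[None, :]                        # (p, K)
    kernels = np.exp(-0.5 * (diffs / tau) ** 2) / (tau * np.sqrt(2 * np.pi))
    dens = kernels @ w                                             # (p,)
    score = (kernels * (-diffs / tau ** 2)) @ w / dens             # d log pi
    return np.log(dens), score, kernels, dens

eta = 1e-3          # Langevin step size (assumed)
theta = np.zeros(p)

for t in range(2000):
    # Unadjusted Langevin step targeting the posterior under the current prior.
    _, score, _, _ = prior_terms(theta, w)
    grad_lik = X.T @ (y - X @ theta) / sigma ** 2
    theta = (theta + eta * (grad_lik + score)
             + np.sqrt(2.0 * eta) * rng.standard_normal(p))

    # Damped EM update of the mixture weights against the sequence-model
    # log-likelihood defined by the coordinates of the current iterate,
    # so the prior law evolves gradually over time.
    _, _, kernels, dens = prior_terms(theta, w)
    resp = kernels * w / dens[:, None]        # posterior responsibilities
    w = 0.9 * w + 0.1 * resp.mean(axis=0)     # stays a probability vector

top = np.argsort(w)[-5:][::-1]
print("heaviest prior atoms:", [(round(atoms[k], 2), round(w[k], 3)) for k in top])
```

In the paper the prior is optimized by a gradient flow in continuous time, jointly with the posterior measure; the damped multiplicative weight update above is only a discrete-time analogue of that evolution.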