Variational Inference for Latent Variable Models in High Dimensions

📅 2025-06-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates the statistical approximation accuracy of mean-field variational inference (MFVI) for posterior distributions in high-dimensional Bayesian latent variable models, specifically latent Dirichlet allocation (LDA) and the mixed membership stochastic blockmodel (MMSB). To address gaps in existing MFVI theory (lack of rigorous statistical guarantees, unclear applicability boundaries, and suboptimality in MMSB), we develop a general asymptotic analysis framework that characterizes the posterior approximation capability of MFVI. We establish, for the first time, the exact phase-transition threshold for "successful" MFVI in LDA. We further prove that standard fully factorized MFVI fails to achieve the information-theoretically optimal convergence rate in MMSB, and propose a theoretically grounded partially grouped variational algorithm. We rigorously show that this algorithm attains tight information-theoretic optimality in both LDA and MMSB, substantially improving posterior inference accuracy and robustness.
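To make the object of study concrete: mean-field VI approximates an intractable posterior by the closest fully factorized distribution, found by coordinate ascent (CAVI). The following is a minimal toy sketch on a 2-D Gaussian target, not the paper's LDA/MMSB setting; all variable names are ours. It illustrates a known accuracy property of MFVI: the factorized optimum recovers the posterior mean but understates the marginal variances.

```python
import numpy as np

# Toy target posterior: N(mu, Sigma) in 2 dimensions (illustrative only).
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 1.2],
                  [1.2, 1.5]])
Lam = np.linalg.inv(Sigma)  # precision matrix

# Optimal factorized q(x) = q1(x1) q2(x2) has Gaussian factors with
# variances 1 / Lam[i, i]; CAVI iterates each factor's mean given the other.
m = np.zeros(2)  # variational means, arbitrary initialization
for _ in range(50):
    m[0] = mu[0] - Lam[0, 1] / Lam[0, 0] * (m[1] - mu[1])
    m[1] = mu[1] - Lam[1, 0] / Lam[1, 1] * (m[0] - mu[0])

var_q = 1.0 / np.diag(Lam)  # factor variances (fixed throughout CAVI)
print(m)      # converges to the true mean [1, -2]
print(var_q)  # strictly smaller than the true marginal variances diag(Sigma)
```

The gap between `var_q` and `diag(Sigma)` is the kind of approximation error whose high-dimensional behavior the paper's framework quantifies.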

📝 Abstract
Variational inference (VI) is a popular method for approximating intractable posterior distributions in Bayesian inference and probabilistic machine learning. In this paper, we introduce a general framework for quantifying the statistical accuracy of mean-field variational inference (MFVI) for posterior approximation in Bayesian latent variable models with categorical local latent variables. Utilizing our general framework, we capture the exact asymptotic regime where MFVI 'works' for the celebrated latent Dirichlet allocation (LDA) model. Focusing on the mixed membership stochastic blockmodel (MMSB), we show that the vanilla fully factorized MFVI, often used in the literature, is suboptimal. We propose a partially grouped VI algorithm for this model, show that it works, and derive its exact asymptotic performance. We further illustrate that our bounds are tight for both the above models.
Problem

Research questions and friction points this paper is trying to address.

Quantify accuracy of mean-field VI for Bayesian latent variable models
Analyze MFVI performance in latent Dirichlet allocation (LDA)
Improve suboptimal MFVI in mixed membership stochastic blockmodel (MMSB)
Innovation

Methods, ideas, or system contributions that make the work stand out.

General framework for MFVI statistical accuracy
Partially grouped VI algorithm for MMSB
Exact asymptotic performance derivation
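The contrast between the vanilla and partially grouped variational families can be sketched schematically (notation ours, not the paper's):

```latex
% Vanilla fully factorized mean-field family over local latents z_1,\dots,z_n:
q(z_1,\dots,z_n) \;=\; \prod_{i=1}^{n} q_i(z_i)
% Partially grouped family: latents within each block G_j of a partition
% \{G_1,\dots,G_m\} of \{1,\dots,n\} may depend on one another:
q(z_1,\dots,z_n) \;=\; \prod_{j=1}^{m} q_j\!\left(z_{G_j}\right)
```

Enlarging the family from products over single latents to products over groups can only tighten the variational approximation, which is consistent with the claimed improvement over vanilla MFVI in MMSB.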
Chenyang Zhong
Department of Statistics, Columbia University
S. Mukherjee
Department of Statistics, Columbia University
Bodhisattva Sen
Columbia University, New York
Nonparametric statistics · Empirical Bayes · Optimal Transport · Astro-statistics