Predictive variational inference for flexible regression models

πŸ“… 2026-02-25
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Traditional Bayesian prediction relies on approximation methods when the posterior is analytically intractable, often struggling to balance predictive accuracy with model diagnostics. This work proposes Gaussian Mixture Predictive Variational Inference (GM-PVI), which unifies prediction and diagnostics through a Gaussian mixture variational posterior, substantially enhancing the flexibility and interpretability of regression models. We establish an equivalence between GM-PVI and plug-in predictions from a mixture-of-experts model with covariate-dependent weights, and demonstrate its efficacy across generalized linear models, linear mixed models, and latent Gaussian process models. Our analysis reveals how model parameters adaptively adjust with covariates to optimize predictive performance.

πŸ“ Abstract
A conventional Bayesian approach to prediction uses the posterior distribution to integrate out parameters in a density for unobserved data conditional on the observed data and parameters. When the true posterior is intractable, it is replaced by an approximation; here we focus on variational approximations. Recent work has explored methods that learn posteriors optimized for predictive accuracy under a chosen scoring rule, while regularizing toward the prior or conventional posterior. Our work builds on an existing predictive variational inference (PVI) framework that improves prediction, but also diagnoses model deficiencies through implicit model expansion. In models where the sampling density depends on the parameters through a linear predictor, we improve the interpretability of existing PVI methods as a diagnostic tool. This is achieved by adopting PVI posteriors of Gaussian mixture form (GM-PVI) and establishing connections with plug-in prediction for mixture-of-experts models. We make three main contributions. First, we show that GM-PVI prediction is equivalent to plug-in prediction for certain mixture-of-experts models with covariate-independent weights in generalized linear models and their hierarchical extensions. Second, we extend standard PVI by allowing GM-PVI posteriors to vary with the prediction covariate, and establish an equivalence to plug-in prediction for mixtures of experts with covariate-dependent weights in this setting. Third, we illustrate the diagnostic value of the approach across several examples, including generalized linear models, linear mixed models, and latent Gaussian process models, showing how the parameters of the original model must vary across the covariate space to achieve improvements in prediction.
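The first equivalence can be made concrete with a small numerical sketch (the weights, component means, covariate, and the near-degenerate mixture covariance below are all hypothetical choices for illustration, not the paper's construction): when the components of a Gaussian mixture variational posterior are tightly concentrated around their means, the posterior-predictive probability for a logistic GLM is numerically indistinguishable from a plug-in mixture-of-experts prediction with covariate-independent weights, i.e. a weighted sum of per-component plug-in predictions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical Gaussian-mixture posterior over a 2-d coefficient beta:
# weights w_k, means mu_k, and a shared tiny isotropic covariance s2*I
# so that each component behaves almost like a point mass at mu_k.
w = np.array([0.3, 0.7])
mu = np.array([[1.0, -2.0], [-0.5, 1.5]])
s2 = 1e-6

x_star = np.array([0.8, -0.3])  # prediction covariate (hypothetical)

def gm_predictive(n_samples=500_000):
    """Monte Carlo estimate of P(y*=1 | x*) under the mixture posterior."""
    k = rng.choice(len(w), size=n_samples, p=w)          # sample component
    beta = mu[k] + np.sqrt(s2) * rng.standard_normal((n_samples, 2))
    return float(sigmoid(beta @ x_star).mean())

def moe_plugin():
    """Plug-in mixture-of-experts prediction: sum_k w_k * p(y* | x*, mu_k)."""
    return float(w @ sigmoid(mu @ x_star))

print(gm_predictive(), moe_plugin())  # nearly identical for small s2
```

With covariate-dependent weights w_k(x_star) in place of the fixed w, the same sum gives the covariate-dependent mixture-of-experts form that the paper's second contribution connects to prediction-covariate-varying GM-PVI posteriors.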
Problem

Research questions and friction points this paper is trying to address.

predictive variational inference
model diagnostics
flexible regression models
mixture-of-experts
Bayesian prediction
Innovation

Methods, ideas, or system contributions that make the work stand out.

Predictive Variational Inference
Gaussian Mixture Posterior
Mixture-of-Experts
Model Diagnostics
Covariate-Dependent Posterior
πŸ”Ž Similar Papers
No similar papers found.
Lucas Kock
National University of Singapore
Scott A. Sisson
School of Mathematics and Statistics, University of New South Wales, Australia
G. S. Rodrigues
Department of Statistics, University of Brasília, Brazil
David J. Nott
Department of Statistics and Data Science, National University of Singapore, Singapore