Bayesian Compressed Mixed-Effects Models

📅 2025-07-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Bayesian inference for high-dimensional linear mixed-effects models faces a computational bottleneck in the random-effects covariance matrix. To address it, this paper proposes the Compressed Mixed-Effects (CME) model. CME uses a random projection to map the high-dimensional covariance structure into a lower-dimensional space and pairs it with a global-local shrinkage prior, enabling simultaneous fixed-effects selection and efficient prediction. Theoretically, CME achieves asymptotically negligible prediction risk when the compression dimension grows slowly relative to the number of fixed effects and observations. Algorithmically, it couples a quasi-likelihood framework with a collapsed Gibbs sampler, substantially reducing computational cost. Extensive simulations and real-data experiments demonstrate that CME consistently outperforms existing methods in predictive accuracy, coverage of credible intervals, and variable selection. CME thus offers a scalable, theoretically grounded, and empirically effective approach to Bayesian inference in high-dimensional mixed-effects models.
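The core compression step can be sketched as follows. The sizes, the Gaussian projection matrix, and the variable names here are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: n observations, q random effects, m << q compressed dims.
n, q, m = 200, 100, 10

Z = rng.normal(size=(n, q))                 # random-effects design matrix
Phi = rng.normal(size=(m, q)) / np.sqrt(m)  # random projection (assumed Gaussian)
Z_c = Z @ Phi.T                             # compressed design, shape (n, m)

# The random-effects covariance to be estimated is now m x m instead of q x q.
assert Z_c.shape == (n, m)
```

After compression, the model only needs to handle an m x m covariance, which is where the dimension reduction pays off.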

📝 Abstract
Penalized likelihood and quasi-likelihood methods dominate inference in high-dimensional linear mixed-effects models. Sampling-based Bayesian inference is less explored due to the computational bottlenecks introduced by the random effects covariance matrix. To address this gap, we propose the compressed mixed-effects (CME) model, which defines a quasi-likelihood using low-dimensional covariance parameters obtained via random projections of the random effects covariance. This dimension reduction, combined with a global-local shrinkage prior on the fixed effects, yields an efficient collapsed Gibbs sampler for prediction and fixed effects selection. Theoretically, when the compression dimension grows slowly relative to the number of fixed effects and observations, the Bayes risk for prediction is asymptotically negligible, ensuring accurate prediction using the CME model. Empirically, the CME model outperforms existing approaches in terms of predictive accuracy, interval coverage, and fixed-effects selection across varied simulation settings and a real-world dataset.
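One way to see the computational gain is that, under the compressed quasi-likelihood, the marginal covariance of the response has a low-rank-plus-diagonal form, so it can be inverted via the Woodbury identity at the cost of an m x m solve. This is a sketch under an assumed Gaussian projection and an illustrative compressed covariance `Psi`, not the paper's exact sampler:

```python
import numpy as np

rng = np.random.default_rng(2)
n, q, m = 100, 60, 8

Z = rng.normal(size=(n, q))
Phi = rng.normal(size=(m, q)) / np.sqrt(m)
Zc = Z @ Phi.T                    # compressed random-effects design

sigma2 = 1.0                      # residual variance (illustrative)
Psi = 0.5 * np.eye(m)             # compressed covariance, m x m (illustrative)

# Marginal covariance under the compressed model: V = sigma2*I + Zc Psi Zc'.
V = sigma2 * np.eye(n) + Zc @ Psi @ Zc.T

# Woodbury identity: only an m x m matrix is inverted, not n x n (or q x q).
A = np.linalg.inv(Psi) + (Zc.T @ Zc) / sigma2
V_inv = np.eye(n) / sigma2 - (Zc @ np.linalg.inv(A) @ Zc.T) / sigma2**2

assert np.allclose(V @ V_inv, np.eye(n), atol=1e-8)
```

Every likelihood evaluation inside the Gibbs sampler benefits from this structure, which is consistent with the abstract's claim of an efficient collapsed sampler.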
Problem

Research questions and friction points this paper is trying to address.

Address computational bottlenecks in Bayesian mixed-effects models
Improve predictive accuracy and fixed-effects selection
Reduce dimensionality via compressed random effects covariance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses low-dimensional covariance via random projections
Employs global-local shrinkage prior for fixed effects
Efficient collapsed Gibbs sampler for prediction
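The global-local shrinkage prior on the fixed effects can be illustrated with a horseshoe-type prior draw. The specific half-Cauchy choice below is an assumption for illustration; the paper's exact prior may differ:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 50  # illustrative number of fixed effects

# Global-local shrinkage: beta_j ~ N(0, tau^2 * lambda_j^2),
# with a half-Cauchy global scale tau and local scales lambda_j.
tau = np.abs(rng.standard_cauchy())
lam = np.abs(rng.standard_cauchy(size=p))
beta = rng.normal(0.0, tau * lam)

# Most coefficients are shrunk toward zero, while occasional large local
# scales let strong signals escape shrinkage -- this drives selection.
assert beta.shape == (p,)
```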