🤖 AI Summary
To address the high computational complexity, degraded predictive performance, and difficulty in variable selection inherent in high-dimensional Gaussian process regression (GPR), this paper proposes Bayesian Bridge Gaussian Process Regression (B²GPR). B²GPR is the first framework to integrate the Bayesian bridge prior—characterized by an ℓ<sub>q</sub> norm penalty (0 < q ≤ 2)—into GPR, enabling automatic sparse variable selection and interpretable modeling. To handle the nonstandard posterior distribution, we develop a hybrid Bayesian inference algorithm combining spherical Hamiltonian Monte Carlo (Spherical HMC) with Gibbs sampling. Theoretical analysis and extensive experiments on both synthetic and real-world datasets demonstrate that B²GPR significantly outperforms state-of-the-art methods: it achieves superior prediction accuracy while simultaneously enhancing identification of relevant variables and promoting model sparsity. Thus, B²GPR establishes a new paradigm for high-dimensional GPR that jointly ensures computational efficiency, predictive fidelity, and interpretability.
📝 Abstract
The performance of Gaussian Process (GP) regression is often hampered by the curse of dimensionality, which inflates computational cost and reduces predictive power in high-dimensional problems. Variable selection is thus crucial for building efficient and accurate GP models. Inspired by Bayesian bridge regression, we propose the Bayesian Bridge Gaussian Process Regression (B²GPR) model. This framework places $\ell_q$-norm constraints on key GP parameters to automatically induce sparsity and identify active variables. We formulate two distinct versions: one for $q=2$ using conjugate Gaussian priors, and another for $0<q<2$ that employs constrained flat priors, leading to non-standard, norm-constrained posterior distributions. To enable posterior inference, we design a Gibbs sampling algorithm that integrates Spherical Hamiltonian Monte Carlo (SphHMC) to efficiently sample from the constrained posteriors when $0<q<2$. Simulations and a real-data application confirm that B²GPR offers superior variable selection and prediction compared to alternative approaches.
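To make the sparsity mechanism concrete, here is a minimal sketch (not from the paper; the function name and parameter values are illustrative) of the bridge log-prior, $\log p(\beta) \propto -\lambda \sum_i |\beta_i|^q$, showing why smaller $q$ rewards shrinking a coefficient exactly to zero more strongly than $q=2$:

```python
import math

def bridge_log_prior(beta, lam=1.0, q=0.5):
    """Log-density (up to an additive constant) of the Bayesian bridge prior:
    log p(beta) = -lam * sum_i |beta_i|^q.
    q=2 recovers a Gaussian (ridge-type) prior, q=1 a Laplace (lasso-type)
    prior, and 0<q<1 concentrates more mass near exact zeros."""
    return -lam * sum(abs(b) ** q for b in beta)

# Log-prior gain from shrinking a single coefficient 0.1 -> 0 exactly:
# the gain is 0.1**q, which grows as q decreases, so smaller q favours
# exact zeros (sparse variable selection) more aggressively.
for q in (0.5, 1.0, 2.0):
    gain = bridge_log_prior([0.0], q=q) - bridge_log_prior([0.1], q=q)
    print(f"q={q}: gain = {gain:.4f}")
```

In B²GPR this penalty acts on GP parameters (e.g. relevance weights) rather than linear coefficients, and for $0<q<2$ the resulting norm-constrained posterior is the reason SphHMC is needed.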