Randomised Postiterations for Calibrated BayesCG

📅 2025-04-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
Bayesian conjugate gradient (BayesCG) provides probabilistic solutions to linear systems but suffers from severely miscalibrated posterior uncertainty, limiting its utility in uncertainty quantification. To address this, we propose a randomised postiteration strategy, the first to incorporate stochastic perturbations into postiteration construction, which preserves the original convergence properties while correcting systematic biases in the posterior error distribution. This overcomes a fundamental limitation of deterministic postiteration methods, which cannot improve calibration. The approach unifies Bayesian inference, randomised numerical linear algebra, and posterior calibration analysis. Experiments on synthetic data and inverse problems demonstrate substantial improvements in confidence-interval coverage, approaching nominal levels, and robust propagation of uncertainty through multi-step computations. The method establishes a new pathway toward reliability in probabilistic numerical methods.

📝 Abstract
The Bayesian conjugate gradient method offers probabilistic solutions to linear systems but suffers from poor calibration, limiting its utility in uncertainty quantification tasks. Recent approaches leveraging postiterations to construct priors have improved computational properties but failed to correct calibration issues. In this work, we propose a novel randomised postiteration strategy that enhances the calibration of the BayesCG posterior while preserving its favourable convergence characteristics. We present theoretical guarantees for the improved calibration, supported by results on the distribution of posterior errors. Numerical experiments demonstrate the efficacy of the method in both synthetic and inverse problem settings, showing enhanced uncertainty quantification and better propagation of uncertainties through computational pipelines.
Problem

Research questions and friction points this paper is trying to address.

BayesCG posteriors are poorly calibrated, limiting their use in uncertainty quantification tasks
Existing postiteration-based priors improve computational properties but do not correct calibration
Improve posterior calibration without sacrificing BayesCG's convergence properties
Innovation

Methods, ideas, or system contributions that make the work stand out.

Randomised postiteration strategy that improves posterior calibration
Preserves BayesCG's favourable convergence characteristics
Theoretical guarantees for the improved calibration, via results on the distribution of posterior errors
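The idea described above can be illustrated with a toy sketch. This is not the paper's algorithm: it runs plain conjugate gradient and then adds a hypothetical Gaussian perturbation whose scale is tied to the residual norm, standing in for the paper's calibrated posterior sampling, so that the sample spread shrinks as the iterate converges.

```python
import numpy as np

def bayescg_randomised_sketch(A, b, x0, m, rng):
    """Toy stand-in for randomised postiteration: run m CG iterations
    from x0, then return a randomly perturbed iterate. The perturbation
    scale (residual norm / sqrt(n)) is a hypothetical choice, not the
    paper's posterior covariance."""
    x = x0.copy()
    r = b - A @ x
    p = r.copy()
    for _ in range(m):
        rr = r @ r
        if rr == 0.0:  # already converged exactly
            break
        Ap = A @ p
        alpha = rr / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        p = r + ((r @ r) / rr) * p
    # Randomised perturbation: spread shrinks as CG converges.
    scale = np.linalg.norm(r) / np.sqrt(len(b))
    return x + scale * rng.standard_normal(len(b))

# Usage on a small symmetric positive definite system.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)  # SPD and well conditioned
x_true = rng.standard_normal(n)
b = A @ x_true
x_few = bayescg_randomised_sketch(A, b, np.zeros(n), 3, rng)
x_many = bayescg_randomised_sketch(A, b, np.zeros(n), n, rng)
```

Because the perturbation is scaled by the residual, samples drawn after many iterations concentrate tightly around the true solution, while early-iteration samples remain spread out, a crude analogue of uncertainty that contracts with convergence.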