Deep Gaussian Process Priors for Bayesian Image Reconstruction

📅 2024-12-13
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
To address the challenge of uncertainty quantification in image reconstruction arising from non-stationarity and multi-scale structure, this paper proposes a Bayesian prior modelling framework based on deep Gaussian processes (deep GPs). Methodologically, it combines rational approximations of fractional stochastic partial differential equations (SPDEs) with determinant-free Markov chain Monte Carlo (MCMC) to enable efficient high-dimensional posterior sampling for deep GPs. A key contribution is an interpretable, tunable regularity parameter that explicitly governs prior smoothness, yielding a controllable family of non-stationary priors. Evaluated on upsampling, edge detection, and computed tomography (CT) reconstruction tasks, the method achieves better reconstruction accuracy and uncertainty calibration than conventional stationary GP priors. It also enables systematic comparison across regularity parameter values, supporting principled model selection and physical interpretability.

📝 Abstract
In image reconstruction, an accurate quantification of uncertainty is of great importance for informed decision making. Here, the Bayesian approach to inverse problems can be used: the image is represented through a random function that incorporates prior information which is then updated through Bayes' formula. However, finding a prior is difficult, as images often exhibit non-stationary effects and multi-scale behaviour. Thus, usual Gaussian process priors are not suitable. Deep Gaussian processes, on the other hand, encode non-stationary behaviour in a natural way through their hierarchical structure. To apply Bayes' formula, one commonly employs a Markov chain Monte Carlo (MCMC) method. In the case of deep Gaussian processes, sampling is especially challenging in high dimensions: the associated covariance matrices are large, dense, and changing from sample to sample. A popular strategy towards decreasing computational complexity is to view Gaussian processes as the solutions to a fractional stochastic partial differential equation (SPDE). In this work, we investigate efficient computational strategies to solve the fractional SPDEs occurring in deep Gaussian process sampling, as well as MCMC algorithms to sample from the posterior. Namely, we combine rational approximation and a determinant-free sampling approach to achieve sampling via the fractional SPDE. We test our techniques in standard Bayesian image reconstruction problems: upsampling, edge detection, and computed tomography. In these examples, we show that choosing a non-stationary prior such as the deep GP over a stationary GP can improve the reconstruction. Moreover, our approach enables us to compare results for a range of fractional and non-fractional regularity parameter values.
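As background for the SPDE view described in the abstract, here is a minimal illustrative sketch (not the paper's implementation): for integer regularity parameter `alpha`, a Whittle–Matérn-type Gaussian field on a 1D grid can be sampled by discretising the operator `(kappa^2 - Laplacian)^(alpha/2)` and factorising the resulting precision matrix. All grid sizes and parameter values below are arbitrary; the non-integer (fractional) case, which the paper handles via rational approximation, is not covered here.

```python
import numpy as np

def matern_spde_sample(n=200, kappa=5.0, alpha=2, seed=0):
    """Draw a sample of a Whittle-Matern-type field on a 1D grid by
    discretising (kappa^2 - Laplacian)^(alpha/2) u = white noise.
    For integer alpha the precision matrix Q = A^alpha is explicit."""
    rng = np.random.default_rng(seed)
    h = 1.0 / (n - 1)
    # Discrete negative Laplacian (Dirichlet boundary) on the grid.
    L = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    A = kappa**2 * np.eye(n) + L          # (kappa^2 - Delta), discretised
    Q = np.linalg.matrix_power(A, alpha)  # precision matrix Q = A^alpha
    # Sample u ~ N(0, Q^{-1}) via the Cholesky factor of Q = C C^T:
    # if u solves C^T u = z with z standard normal, then Cov(u) = Q^{-1}.
    C = np.linalg.cholesky(Q)
    z = rng.standard_normal(n)
    u = np.linalg.solve(C.T, z)
    return u

u = matern_spde_sample()
print(u.shape)  # (200,)
```

In practice the precision matrix is sparse, so sparse Cholesky factorisation makes this scale far beyond what the dense sketch above suggests; for fractional `alpha`, the fractional power of the operator is replaced by a rational approximation.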
Problem

Research questions and friction points this paper is trying to address.

Quantifies uncertainty in image reconstruction under non-stationary priors
Overcomes high-dimensional posterior sampling challenges for deep Gaussian processes
Improves Bayesian image reconstruction via fractional SPDE techniques
Innovation

Methods, ideas, or system contributions that make the work stand out.

Deep Gaussian processes for non-stationary image priors
Fractional SPDEs for efficient sampling
Determinant-free MCMC for posterior sampling
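One widely used determinant-free sampler in function-space Bayesian inverse problems is preconditioned Crank–Nicolson (pCN): its proposal preserves the Gaussian prior, so the accept/reject step involves only the likelihood and no covariance determinants. The sketch below is illustrative, not necessarily the paper's exact algorithm; `log_like` and `prior_sample` are hypothetical placeholders supplied by the user.

```python
import numpy as np

def pcn_mcmc(log_like, prior_sample, n_iters=1000, beta=0.2, seed=0):
    """Preconditioned Crank-Nicolson MCMC: proposals preserve the Gaussian
    prior, so the acceptance ratio needs no covariance determinants."""
    rng = np.random.default_rng(seed)
    u = prior_sample(rng)                 # current state, drawn from the prior
    ll = log_like(u)
    chain, accepts = [], 0
    for _ in range(n_iters):
        xi = prior_sample(rng)            # fresh, independent prior draw
        v = np.sqrt(1.0 - beta**2) * u + beta * xi   # pCN proposal
        ll_v = log_like(v)
        # Acceptance probability depends only on the likelihood ratio.
        if np.log(rng.uniform()) < ll_v - ll:
            u, ll = v, ll_v
            accepts += 1
        chain.append(u.copy())
    return np.array(chain), accepts / n_iters

# Toy usage: standard normal prior in 2D, Gaussian likelihood around y = [1, 1].
y = np.array([1.0, 1.0])
chain, rate = pcn_mcmc(lambda u: -0.5 * np.sum((u - y)**2) / 0.5**2,
                       lambda rng: rng.standard_normal(2))
print(chain.shape)  # (1000, 2)
```

The step size `beta` trades off proposal boldness against acceptance rate; a key property of pCN is that this trade-off is robust to refining the discretisation of the underlying function space.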
Jonas Latz
University of Manchester
Bayesian Inference, Numerical Analysis, Data Science, Uncertainty Quantification, Machine Learning
A. Teckentrup
School of Mathematics and Maxwell Institute of Mathematical Sciences, University of Edinburgh, UK
Simon Urbainczyk
School of Mathematical and Computer Sciences and Maxwell Institute of Mathematical Sciences, Heriot-Watt University, Edinburgh, UK