Piecewise Deterministic Markov Processes for Bayesian Neural Networks

📅 2023-02-17
🏛️ Conference on Uncertainty in Artificial Intelligence
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the limitations of variational inference (its restrictive independence assumptions) and of traditional MCMC (its incompatibility with mini-batch subsampling) in Bayesian neural network (BNN) posterior inference, this paper proposes a subsampling-compatible piecewise deterministic Markov process (PDMP) framework. The key contribution is a generic adaptive thinning algorithm that enables efficient simulation of the required inhomogeneous Poisson process (IPP) without model-specific constructions, a first in PDMP-based BNN inference, substantially improving generalizability and ease of use. Evaluation on benchmark tasks shows that the method is computationally feasible and outperforms state-of-the-art approximate inference methods in predictive accuracy, MCMC mixing speed, and uncertainty calibration.
📝 Abstract
Inference on modern Bayesian Neural Networks (BNNs) often relies on a variational treatment, imposing frequently violated assumptions of independence and on the form of the posterior. Traditional MCMC approaches avoid these assumptions at the cost of increased computation due to their incompatibility with subsampling of the likelihood. New Piecewise Deterministic Markov Process (PDMP) samplers permit subsampling, though they introduce a model-specific inhomogeneous Poisson Process (IPP) which is difficult to sample from. This work introduces a new generic and adaptive thinning scheme for sampling from these IPPs, and demonstrates how this approach can accelerate the application of PDMPs for inference in BNNs. Experiments illustrate that inference with these methods is computationally feasible, can improve predictive accuracy and MCMC mixing performance, and provides informative uncertainty measurements when compared against other approximate inference schemes.
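The generic thinning idea the abstract refers to can be illustrated with classic Lewis-Shedler thinning: propose candidate events from a homogeneous Poisson process whose rate upper-bounds the true rate, then accept each candidate with probability equal to the rate ratio. This is a minimal sketch under the assumption of a known constant bound; the paper's contribution is an adaptive bound that works without model-specific derivations, which is not reproduced here.

```python
import numpy as np

def thin_ipp(rate, rate_bound, t_max, rng=None):
    """Sample event times of an inhomogeneous Poisson process on [0, t_max]
    by Lewis-Shedler thinning.

    rate:       callable t -> intensity lambda(t) >= 0
    rate_bound: constant upper bound with rate(t) <= rate_bound on [0, t_max]
    """
    rng = np.random.default_rng() if rng is None else rng
    events, t = [], 0.0
    while True:
        # Propose the next candidate from a homogeneous process at rate_bound.
        t += rng.exponential(1.0 / rate_bound)
        if t > t_max:
            break
        # Thinning step: keep the candidate with probability lambda(t) / bound.
        if rng.uniform() < rate(t) / rate_bound:
            events.append(t)
    return np.array(events)
```

With a tight bound most candidates are accepted; with a loose bound the scheme stays correct but wastes proposals, which is why an adaptive bound matters for high-dimensional BNN posteriors.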
Problem

Research questions and friction points this paper is trying to address.

Variational inference imposes restrictive, often violated assumptions on the posterior
Traditional MCMC is incompatible with likelihood subsampling, making BNN inference costly
PDMP samplers require simulating model-specific inhomogeneous Poisson processes that are difficult to sample from
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive thinning for inhomogeneous Poisson sampling
Generic scheme accelerating PDMP inference
Subsampling-compatible MCMC for neural networks
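For context on where the inhomogeneous Poisson process arises, here is a hypothetical minimal sketch of a one-dimensional Zig-Zag process (a standard PDMP sampler) targeting a standard Gaussian. In this toy case the event rate lambda(t) = max(0, theta * (x + theta * t)) can be inverted in closed form, so no thinning is needed; for BNN posteriors no such closed form exists, which is why a generic thinning scheme is needed. Names and structure here are illustrative, not the paper's implementation.

```python
import numpy as np

def zigzag_1d(n_events, x0=0.0, rng=None):
    """1D Zig-Zag sampler for a standard Gaussian target N(0, 1).

    The state moves deterministically with velocity theta in {-1, +1};
    at random event times, drawn from an IPP with rate
    lambda(t) = max(0, a + t) where a = theta * x, the velocity flips.
    Returns the positions recorded at each event.
    """
    rng = np.random.default_rng() if rng is None else rng
    x, theta = x0, 1.0
    xs = [x]
    for _ in range(n_events):
        a = theta * x            # lambda(t) = max(0, a + t)
        e = rng.exponential(1.0)  # Exp(1) used for time-change inversion
        if a >= 0:
            # Solve a*tau + tau^2/2 = e for the event time.
            tau = -a + np.sqrt(a * a + 2.0 * e)
        else:
            # Rate is zero until t = -a, then (tau + a)^2 / 2 = e.
            tau = -a + np.sqrt(2.0 * e)
        x += theta * tau  # deterministic drift up to the event
        theta = -theta    # flip velocity at the event
        xs.append(x)
    return np.array(xs)
```

The trajectory oscillates around the mode; averaging along the full piecewise-linear path (not just the event points) gives unbiased posterior expectations.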
Ethan Goan
Queensland University of Technology
Dimitri Perrin
Queensland University of Technology
K. Mengersen
Queensland University of Technology
C. Fookes
Queensland University of Technology