Neural Surrogate HMC: Accelerated Hamiltonian Monte Carlo with a Neural Network Surrogate Likelihood

📅 2024-07-29
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
Hamiltonian Monte Carlo (HMC) sampling is infeasible for expensive Bayesian inference problems in which the likelihood is implicitly defined by partial differential equations (PDEs), because each gradient evaluation carries a prohibitive computational cost. Method: We propose a neural surrogate likelihood approach that models the intractable likelihood with a differentiable neural network integrated into the HMC framework. The surrogate is trained on outputs of the PDE solver and, being differentiable, supplies exact gradients of the approximate likelihood via automatic differentiation. Contribution/Results: The method offers three key advantages: fast gradient computation, robustness to numerical noise, and amortization of the simulation cost. Evaluated on the inverse problem of galactic cosmic-ray propagation governed by the Parker equation, it improves posterior sampling efficiency by over an order of magnitude compared with standard simulation-based inference (SBI), thereby overcoming SBI's scalability bottleneck in high-fidelity, computationally intensive physical models.

📝 Abstract
Bayesian Inference with Markov Chain Monte Carlo requires efficient computation of the likelihood function. In some scientific applications, the likelihood must be computed by numerically solving a partial differential equation, which can be prohibitively expensive. We demonstrate that some such problems can be made tractable by amortizing the computation with a surrogate likelihood function implemented by a neural network. We show that this has two additional benefits: reducing noise in the likelihood evaluations and providing fast gradient calculations. In experiments, the approach is applied to a model of heliospheric transport of galactic cosmic rays, where it enables efficient sampling from the posterior of latent parameters in the Parker equation.
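As a concrete illustration of the idea, the sketch below replaces the PDE-based likelihood with a cheap analytic stand-in, fits a small tanh network to precomputed (parameter, log-likelihood) pairs, and runs HMC driven by the surrogate's analytic gradient. For simplicity the network's output weights are solved by least squares over random features, a simplification of the paper's trained surrogate; all names and hyperparameters here are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an expensive log-likelihood (in the paper, each evaluation
# would require numerically solving the Parker transport equation).
def log_lik(theta):
    return -0.5 * (theta - 2.0) ** 2

# --- 1. Fit a surrogate: a one-hidden-layer tanh network whose output
# weights are solved by least squares over fixed random features.
H = 64
w1 = rng.normal(scale=0.5, size=H)
b1 = rng.normal(scale=1.5, size=H)

theta_train = np.linspace(-2.0, 6.0, 400)        # offline "simulator" runs
features = np.tanh(np.outer(theta_train, w1) + b1)
A = np.column_stack([features, np.ones(len(theta_train))])
coef, *_ = np.linalg.lstsq(A, log_lik(theta_train), rcond=None)
w2, b2 = coef[:-1], coef[-1]

def surrogate(theta):                            # cheap surrogate log-likelihood
    return np.tanh(theta * w1 + b1) @ w2 + b2

def surrogate_grad(theta):                       # exact gradient of the surrogate
    h = np.tanh(theta * w1 + b1)
    return ((1.0 - h ** 2) * w1) @ w2

# --- 2. HMC whose leapfrog steps use the surrogate gradient instead of
# repeated PDE solves.
def hmc(theta0, n_samples=3000, eps=0.2, n_leap=10):
    theta, samples = theta0, []
    for _ in range(n_samples):
        p0 = rng.normal()
        th, p = theta, p0
        for _ in range(n_leap):                  # velocity-Verlet / leapfrog
            p += 0.5 * eps * surrogate_grad(th)
            th += eps * p
            p += 0.5 * eps * surrogate_grad(th)
        # Metropolis correction using the surrogate log-density
        log_acc = (surrogate(th) - 0.5 * p ** 2) - (surrogate(theta) - 0.5 * p0 ** 2)
        if np.log(rng.uniform()) < log_acc:
            theta = th
        samples.append(theta)
    return np.array(samples)

samples = hmc(0.0)
print(f"posterior mean ~ {samples.mean():.2f}, std ~ {samples.std():.2f}")
```

Because the true log-likelihood here is a unit Gaussian centered at 2, the surrogate-driven chain should recover a posterior mean near 2 with standard deviation near 1; in the paper's setting the same structure amortizes the PDE solves done during training.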
Problem

Research questions and friction points this paper is trying to address.

Amortizes likelihood computations for MCMC using neural networks.
Provides gradients for Hamiltonian Monte Carlo via neural approximations.
Smooths likelihood evaluations made noisy by numerical instabilities in the simulation.
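The noise-smoothing point can be illustrated directly: fitting a surrogate across many noisy likelihood evaluations averages out the numerical noise, so the surrogate is closer to the true log-likelihood than any single evaluation. The noise model and random-feature surrogate below are illustrative stand-ins, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy "simulator": the true log-likelihood plus numerical noise,
# mimicking solver instabilities (illustrative stand-in).
def noisy_log_lik(theta):
    return -0.5 * (theta - 2.0) ** 2 + rng.normal(scale=0.2, size=np.shape(theta))

# Random-feature surrogate fitted by least squares over many noisy runs.
H = 64
w1 = rng.normal(scale=0.5, size=H)
b1 = rng.normal(scale=1.5, size=H)
theta_train = np.linspace(-2.0, 6.0, 2000)
A = np.column_stack([np.tanh(np.outer(theta_train, w1) + b1),
                     np.ones(len(theta_train))])
coef, *_ = np.linalg.lstsq(A, noisy_log_lik(theta_train), rcond=None)

def surrogate(theta):
    return np.tanh(np.multiply.outer(theta, w1) + b1) @ coef[:-1] + coef[-1]

# The least-squares fit averages out the evaluation noise:
grid = np.linspace(-1.0, 5.0, 200)
true_vals = -0.5 * (grid - 2.0) ** 2
rmse_noisy = np.sqrt(np.mean((noisy_log_lik(grid) - true_vals) ** 2))
rmse_surr = np.sqrt(np.mean((surrogate(grid) - true_vals) ** 2))
print(rmse_surr < rmse_noisy)  # surrogate error sits well below the raw noise level
```

The same mechanism explains why surrogate gradients are usable for HMC even when raw finite-difference gradients of a noisy solver would not be.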
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural networks approximate likelihood for Hamiltonian Monte Carlo
Amortizes computations and provides gradients for MCMC
Smooths likelihood evaluations made noisy by numerical instabilities