🤖 AI Summary
For Bayesian inference with computationally expensive likelihood functions, this paper introduces Normalizing Flow Regression (NFR): a method that frames posterior approximation as supervised regression of the log-posterior density—rather than conventional density estimation—using a normalizing flow. NFR is trained offline on existing log-density evaluations and, unlike traditional surrogate approaches, requires no additional sampling or inference step: the fitted flow directly yields a tractable posterior approximation that is easy to sample from, together with an estimate of the model evidence. Training techniques tailored to flow regression, including specialized priors and likelihood functions, make the fit robust. Evaluated on synthetic benchmarks and real-world neuroscience and biology applications, NFR matches or surpasses existing methods. By recycling precomputed likelihood evaluations, NFR offers an efficient and robust option for Bayesian inference when standard methods are computationally prohibitive.
📝 Abstract
Bayesian inference with computationally expensive likelihood evaluations remains a significant challenge in many scientific domains. We propose normalizing flow regression (NFR), a novel offline inference method for approximating posterior distributions. Unlike traditional surrogate approaches that require additional sampling or inference steps, NFR directly yields a tractable posterior approximation through regression on existing log-density evaluations. We introduce training techniques specifically for flow regression, such as tailored priors and likelihood functions, to achieve robust posterior and model evidence estimation. We demonstrate NFR's effectiveness on synthetic benchmarks and real-world applications from neuroscience and biology, showing superior or comparable performance to existing methods. NFR represents a promising approach for Bayesian inference when standard methods are computationally prohibitive or existing model evaluations can be recycled.
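The core regression idea can be illustrated with a toy sketch. This is not the paper's implementation (which uses a full normalizing flow with tailored priors and noise models); here the "flow" is just a 1-D affine (Gaussian) transform, and we fit its parameters plus a log-evidence offset by least-squares regression against precomputed unnormalized log-posterior values. All names and the `scipy` optimizer choice are my own assumptions for illustration:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Offline "expensive" evaluations: points and their unnormalized
# log-posterior values (toy 1-D Gaussian target with evidence Z = 3).
true_mu, true_sigma, true_logZ = 2.0, 0.5, np.log(3.0)
X = rng.uniform(0.0, 4.0, size=40)
y = (true_logZ
     - 0.5 * np.log(2 * np.pi * true_sigma**2)
     - 0.5 * ((X - true_mu) / true_sigma) ** 2)

def flow_logpdf(x, mu, log_sigma):
    """Log-density of a toy affine 'flow': z = (x - mu) / sigma, z ~ N(0, 1)."""
    sigma = np.exp(log_sigma)
    return -0.5 * np.log(2 * np.pi) - log_sigma - 0.5 * ((x - mu) / sigma) ** 2

def loss(params):
    mu, log_sigma, logZ = params
    # Regression objective: the flow's log-density plus the log-evidence
    # offset should match the observed unnormalized log-posterior values.
    resid = flow_logpdf(X, mu, log_sigma) + logZ - y
    return np.sum(resid**2)

fit = minimize(loss, x0=np.zeros(3), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-9, "fatol": 1e-9})
mu_hat, log_sigma_hat, logZ_hat = fit.x
print(mu_hat, np.exp(log_sigma_hat), logZ_hat)  # recovers ~2.0, ~0.5, ~log(3)
```

Because the fitted object is itself a normalized density, sampling from the approximate posterior is immediate (draw z ~ N(0, 1), map through the inverse transform), and the offset `logZ_hat` is the model-evidence estimate — the two byproducts the summary highlights.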