Prior- and likelihood-free probabilistic inference with finite-sample calibration guarantees

📅 2026-03-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of conducting calibrated Bayesian inference for parametric models whose likelihood functions are intractable, numerically unstable, or computationally prohibitive. Existing approaches lack finite-sample calibration guarantees under such conditions. The authors propose a fully probabilistic inference framework that requires neither a prior nor a likelihood, relying solely on the model’s simulation capability. By leveraging permutation-invariant functions—such as depth functions—to rank parameters and introducing a closed-form rescaling procedure, the method achieves finite-sample frequentist calibration. To the best of the authors’ knowledge, this is the first approach to provide theoretical calibration guarantees in a setting devoid of both likelihood and prior specifications. Empirical evaluations across four benchmark tasks—including differential privacy and the Ising model—as well as a spatial analysis of the 2025 U.S. measles outbreak demonstrate the method’s strong practical utility and robustness.

📝 Abstract
Motivated by parametric models for which the likelihood is analytically unavailable, numerically unstable, or prohibitively expensive to compute or optimize, we develop a prior- and likelihood-free framework for fully probabilistic (Bayesian-like) uncertainty quantification with finite-sample calibration guarantees. Our method, a type of inferential model, produces data-dependent degrees of belief about claims concerning the unknown parameter while controlling the frequency with which high belief is assigned to false claims, even in finite-sample settings. Our procedure is general in that it requires only the ability to simulate from the model. We first rank candidate parameter values according to how well data simulated from the model agree with the observed data, and then rescale these rankings in a way that yields the desired finite-sample calibration guarantees. The key idea is to employ a permutation-invariant function, such as a depth function, to rank parameter values. We show that such a choice yields closed-form calibration rescaling calculations, making the procedure computationally simple. We illustrate our method's broad appeal with four examples, including differential privacy and Ising models. An analysis of the spatial configuration of 2025 measles outbreaks in the U.S. showcases our method's practical advantages.
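The abstract's pipeline (simulate from the model at each candidate parameter, rank the observed data against the simulations with a permutation-invariant function, then rescale ranks to get finite-sample calibration) can be illustrated with a toy sketch. Everything below is an assumption for illustration, not the paper's actual construction: the Gaussian simulator, the `depth` score, and the Monte Carlo rank rescaling `(r + 1) / (m + 1)` are simple stand-ins chosen because the rank of an exchangeable statistic is uniform in finite samples.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n, rng):
    # Toy simulator: n i.i.d. Normal(theta, 1) draws. A stand-in for any
    # black-box model; the framework only needs the ability to sample.
    return rng.normal(theta, 1.0, size=n)

def depth(data, theta):
    # Illustrative permutation-invariant agreement score: negative
    # distance between the sample mean and theta. The paper uses depth
    # functions; this simple choice is an assumption for the sketch.
    return -abs(data.mean() - theta)

def plausibility(theta, observed, m=199, rng=rng):
    # Rank the observed data's score among m simulated datasets at theta.
    # Under theta the (m + 1) scores are exchangeable, so the rescaled
    # rank (r + 1) / (m + 1) is calibrated in finite samples.
    d_obs = depth(observed, theta)
    d_sim = np.array([depth(simulate(theta, observed.size, rng), theta)
                      for _ in range(m)])
    r = np.sum(d_sim <= d_obs)
    return (r + 1) / (m + 1)

observed = simulate(1.0, n=50, rng=rng)
grid = np.linspace(0.0, 2.0, 21)
pl = np.array([plausibility(t, observed) for t in grid])
# Candidate values with plausibility above alpha form a confidence region.
region = grid[pl > 0.05]
```

The attraction of the permutation-invariant score, per the abstract, is that the calibration rescaling has a closed form, so the only cost is running the simulator.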
Problem

Research questions and friction points this paper is trying to address.

likelihood-free inference
prior-free inference
finite-sample calibration
uncertainty quantification
probabilistic inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

likelihood-free inference
finite-sample calibration
inferential models
permutation-invariant ranking
simulation-based inference