🤖 AI Summary
Efficient, unbiased sampling from probability distributions constrained to convex sets remains challenging, especially due to projection overhead, numerical instability, and poor scalability in existing methods.
Method: This paper introduces a novel class of algorithms that integrate mirror mappings—rooted in convex optimization—with piecewise-deterministic Markov processes (PDMPs). The proposed framework leverages event-driven dynamics and geometric adaptivity to perform exact, projection-free, and unbiased sampling within convex constraint sets, while naturally supporting stochastic (mini-batch) gradients.
Contribution/Results: To our knowledge, this is the first work to incorporate mirror mappings into PDMP-based sampling. It overcomes fundamental limitations of mainstream SDE-based approaches in handling constraints, computational cost, and numerical robustness. Empirical evaluations across diverse convex-constrained sampling tasks demonstrate substantial improvements in both sampling efficiency and statistical accuracy. The method establishes a new paradigm for constrained Bayesian inference and optimization under convex constraints.
📝 Abstract
In this paper, we propose a novel class of Piecewise Deterministic Markov Processes (PDMP) that are designed to sample from constrained probability distributions $π$ supported on a convex set $mathcal{M}$. This class of PDMPs adapts the concept of a mirror map from convex optimisation to address sampling problems. Such samplers provides unbiased algorithms that respect the constraints and, moreover, allow for exact subsampling. We demonstrate the advantages of these algorithms on a range of constrained sampling problems where the proposed algorithm outperforms state of the art stochastic differential equation-based methods.