🤖 AI Summary
This paper addresses the problem of efficiently approximating $f$-divergences between Ising models—i.e., estimating, within arbitrary relative error, the divergence between two Gibbs distributions defined by distinct interaction matrices and external fields. The authors propose the first polynomial-time approximation algorithm for $\chi^\alpha$-divergences and establish computational hardness results, with the algorithmic regime matching the hardness bound. Their method unifies the treatment of key divergences including the Kullback–Leibler, Rényi, and Jensen–Shannon divergences. The core contribution lies in a novel integration of statistical-physics structure intrinsic to the Ising model—such as Gaussian approximations and critical phase-transition behavior—with fine-grained computational complexity analysis, yielding a parameter-adaptive framework for sampling and moment estimation. The algorithm achieves the prescribed relative-error guarantee efficiently over broad regimes of temperature and field strength. This work provides a general algorithmic paradigm for quantifying divergence between high-dimensional Gibbs distributions, accompanied by rigorous theoretical guarantees.
📝 Abstract
The $f$-divergence is a fundamental notion that measures the difference between two distributions. In this paper, we study the problem of approximating the $f$-divergence between two Ising models, which is a generalization of recent work on approximating the TV-distance. Given two Ising models $\nu$ and $\mu$, which are specified by their interaction matrices and external fields, the problem is to approximate the $f$-divergence $D_f(\nu \,\|\, \mu)$ within an arbitrary relative error $\mathrm{e}^{\pm \varepsilon}$. For the $\chi^\alpha$-divergence with a constant integer $\alpha$, we establish both algorithmic and hardness results. The algorithm works in a parameter regime that matches the hardness result. Our algorithm can be extended to other $f$-divergences such as the $\alpha$-divergence, Kullback–Leibler divergence, Rényi divergence, Jensen–Shannon divergence, and squared Hellinger distance.
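To make the quantity being approximated concrete, here is a minimal brute-force sketch (not the paper's algorithm, which avoids exponential enumeration) that computes the $\chi^2$-divergence, i.e., the $\chi^\alpha$ case with $\alpha = 2$, between two small Ising models. The Gibbs-measure convention $\mu(x) \propto \exp(x^{\top} J x + h^{\top} x)$ over $x \in \{-1,+1\}^n$ and all parameter values are illustrative assumptions.

```python
import itertools
import numpy as np

def gibbs(J, h):
    """Exact Gibbs distribution of an Ising model over {-1,+1}^n (brute force)."""
    n = len(h)
    states = [np.array(x) for x in itertools.product([-1, 1], repeat=n)]
    weights = np.array([np.exp(x @ J @ x + h @ x) for x in states])
    return weights / weights.sum()  # normalize by the partition function

def chi2(nu, mu):
    """chi^2-divergence: sum_x (nu(x) - mu(x))^2 / mu(x) = sum_x nu(x)^2/mu(x) - 1."""
    return float(np.sum(nu**2 / mu) - 1.0)

# Two small illustrative Ising models with random symmetric interactions.
n = 3
rng = np.random.default_rng(0)
J1 = rng.normal(scale=0.1, size=(n, n)); J1 = (J1 + J1.T) / 2
J2 = rng.normal(scale=0.1, size=(n, n)); J2 = (J2 + J2.T) / 2
h1, h2 = rng.normal(size=n), rng.normal(size=n)

nu, mu = gibbs(J1, h1), gibbs(J2, h2)
print(chi2(nu, mu))  # nonnegative; zero iff the two distributions coincide
```

The paper's contribution is to approximate this quantity within relative error $\mathrm{e}^{\pm\varepsilon}$ in polynomial time, whereas this sketch costs $2^n$ and is feasible only for tiny $n$.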