On approximating the $f$-divergence between two Ising models

📅 2025-09-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the problem of efficiently approximating $f$-divergences between Ising models: estimating, within arbitrary relative error, the divergence between two Gibbs distributions specified by distinct interaction matrices and external fields. The authors give the first polynomial-time approximation algorithm for $\chi^\alpha$-divergences and establish computational-hardness lower bounds that match the algorithm's parameter regime. The method unifies the treatment of key divergences, including the Kullback-Leibler, Rényi, and Jensen–Shannon divergences. The core contribution is a novel integration of statistical-physics structure intrinsic to the Ising model (such as Gaussian approximations and critical phase-transition behavior) with fine-grained computational-complexity analysis, yielding a parameter-adaptive framework for sampling and moment estimation. The algorithm achieves provably tight accuracy and efficiency over broad regimes of temperature and field strength, giving a rigorously analyzed algorithmic paradigm for quantifying divergence between high-dimensional distributions of this class.
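For reference, the "Gibbs distributions defined by interaction matrices and external fields" mentioned above are the standard Ising measures; the following convention is assumed here, since the page itself does not spell it out. An interaction matrix $J \in \mathbb{R}^{n \times n}$ and external field $h \in \mathbb{R}^n$ define a distribution over spin configurations $\sigma \in \{-1,+1\}^n$:

$$\mu_{J,h}(\sigma) \;=\; \frac{1}{Z_{J,h}} \exp\Big(\sum_{i<j} J_{ij}\,\sigma_i\sigma_j + \sum_{i} h_i\,\sigma_i\Big), \qquad Z_{J,h} \;=\; \sum_{\sigma \in \{-1,+1\}^n} \exp\Big(\sum_{i<j} J_{ij}\,\sigma_i\sigma_j + \sum_{i} h_i\,\sigma_i\Big).$$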

📝 Abstract
The $f$-divergence is a fundamental notion that measures the difference between two distributions. In this paper, we study the problem of approximating the $f$-divergence between two Ising models, which is a generalization of recent work on approximating the TV-distance. Given two Ising models $\nu$ and $\mu$, which are specified by their interaction matrices and external fields, the problem is to approximate the $f$-divergence $D_f(\nu \,\|\, \mu)$ within an arbitrary relative error $\mathrm{e}^{\pm\varepsilon}$. For the $\chi^\alpha$-divergence with a constant integer $\alpha$, we establish both algorithmic and hardness results. The algorithm works in a parameter regime that matches the hardness result. Our algorithm can be extended to other $f$-divergences such as the $\alpha$-divergence, Kullback-Leibler divergence, Rényi divergence, Jensen-Shannon divergence, and squared Hellinger distance.
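The abstract uses $D_f$ and the guarantee $\mathrm{e}^{\pm\varepsilon}$ without defining them; under the standard conventions (assumed here, since the paper's exact normalization is not shown on this page), for a convex $f$ with $f(1) = 0$,

$$D_f(\nu \,\|\, \mu) \;=\; \sum_{\sigma} \mu(\sigma)\, f\!\left(\frac{\nu(\sigma)}{\mu(\sigma)}\right),$$

the $\chi^\alpha$-divergence is the case $f(t) = |t-1|^\alpha$ (with $\alpha = 2$ recovering the $\chi^2$-divergence), and an estimate $\widehat{D}$ achieves relative error $\mathrm{e}^{\pm\varepsilon}$ when

$$\mathrm{e}^{-\varepsilon}\, D_f(\nu \,\|\, \mu) \;\le\; \widehat{D} \;\le\; \mathrm{e}^{\varepsilon}\, D_f(\nu \,\|\, \mu).$$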
Problem

Research questions and friction points this paper is trying to address.

Approximating the $f$-divergence between two Ising models
Handling a range of divergence types within arbitrary relative error
Establishing matching algorithmic and hardness results for the $\chi^\alpha$-divergence
Innovation

Methods, ideas, or system contributions that make the work stand out.

Approximates $f$-divergences between Ising models in polynomial time
Handles multiple divergence types within one algorithmic framework
Algorithm's parameter regime matches the hardness lower bound
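To make the task concrete, here is a minimal brute-force sketch of the quantity being approximated: it computes the $\chi^\alpha$-divergence between two small Ising models by enumerating all $2^n$ configurations. This is only a reference baseline for tiny $n$, not the paper's polynomial-time algorithm, and the function names and the convention $f(t) = |t-1|^\alpha$ are assumptions made for illustration.

```python
import itertools

import numpy as np

def ising_probs(J, h):
    """Exact Gibbs probabilities of an Ising model with symmetric
    interaction matrix J (zero diagonal) and external field h, by
    brute-force enumeration over all 2^n configurations.
    Feasible only for tiny n; this is NOT the paper's algorithm."""
    n = len(h)
    configs = np.array(list(itertools.product([-1, 1], repeat=n)), dtype=float)
    # Energy sum_{i<j} J_ij s_i s_j + sum_i h_i s_i; the einsum sums over
    # all ordered pairs, so the 1/2 factor corrects the double counting.
    log_w = 0.5 * np.einsum('ci,ij,cj->c', configs, J, configs) + configs @ h
    w = np.exp(log_w - log_w.max())  # stabilize before normalizing
    return w / w.sum()

def chi_alpha_divergence(J_nu, h_nu, J_mu, h_mu, alpha=2):
    """Brute-force chi^alpha-divergence D_f(nu || mu) with
    f(t) = |t - 1|^alpha (assumed convention), where
    nu = Ising(J_nu, h_nu) and mu = Ising(J_mu, h_mu)."""
    nu = ising_probs(J_nu, h_nu)
    mu = ising_probs(J_mu, h_mu)
    return float(np.sum(mu * np.abs(nu / mu - 1.0) ** alpha))

# Tiny example: two 3-spin ferromagnets with different coupling strengths.
J1 = np.array([[0.0, 0.3, 0.1],
               [0.3, 0.0, 0.2],
               [0.1, 0.2, 0.0]])
J2 = 0.5 * J1
h = np.zeros(3)
print(chi_alpha_divergence(J1, h, J2, h, alpha=2))
```

For $\alpha = 2$ the returned value agrees with the familiar $\chi^2$ formula $\sum_\sigma (\nu(\sigma)-\mu(\sigma))^2/\mu(\sigma)$.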
Weiming Feng
The University of Hong Kong
randomized algorithms
Yucheng Fu
School of Computing and Data Science, The University of Hong Kong