Simple parallel estimation of the partition ratio for Gibbs distributions

📅 2025-05-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper studies efficient parallel estimation of the log-ratio $q = \ln(Z(\eta_{\max})/Z(\eta_{\min}))$ of Gibbs partition functions, where the Hamiltonian takes values in $\{0\}\cup[1,n]$. It proposes a concise analytical framework relying on a *single* estimator, departing from conventional two-estimator piecewise strategies. The methods include a non-adaptive parallel algorithm and a two-round adaptive parallel algorithm, achieving sample complexities of $O(q \log^2 n / \varepsilon^2)$ and $O(q \log n / \varepsilon^2)$, respectively; the latter matches the complexity of the sequential algorithm. Key techniques combine importance sampling with temperature-interval partitioning and establish a unified single-estimator error analysis. For accuracy $\varepsilon$, the two-round algorithm improves on the best prior non-adaptive bound's $\varepsilon$-dependence and saves a factor of $\Theta(\log n)$ over the new non-adaptive algorithm.

📝 Abstract
We consider the problem of estimating the partition function $Z(\eta)=\sum_x \exp(\eta H(x))$ of a Gibbs distribution with Hamiltonian $H:\Omega \rightarrow \{0\}\cup[1,n]$. As shown in [Harris & Kolmogorov 2024], the log-ratio $q=\ln(Z(\eta_{\max})/Z(\eta_{\min}))$ can be estimated with accuracy $\epsilon$ using $O(\frac{q \log n}{\epsilon^2})$ calls to an oracle that produces a sample from the Gibbs distribution for a parameter $\eta\in[\eta_{\min},\eta_{\max}]$. That algorithm is inherently sequential, or *adaptive*: the queried values of $\eta$ depend on previous samples. Recently, [Liu, Yin & Zhang 2024] developed a non-adaptive version that needs $O(q (\log^2 n)(\log q + \log\log n + \epsilon^{-2}))$ samples. We improve the number of samples to $O(\frac{q \log^2 n}{\epsilon^2})$ for a non-adaptive algorithm, and to $O(\frac{q \log n}{\epsilon^2})$ for an algorithm that uses just two rounds of adaptivity (matching the complexity of the sequential version). Furthermore, our algorithm simplifies previous techniques. In particular, we use just a single estimator, whereas the methods in [Harris & Kolmogorov 2024; Liu, Yin & Zhang 2024] employ two different estimators for different regimes.
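To make the quantity being estimated concrete, here is a minimal sketch of the classic telescoping-product idea underlying this line of work: split $[\eta_{\min},\eta_{\max}]$ into intervals and estimate each ratio $Z(\eta_{i+1})/Z(\eta_i) = \mathbb{E}_{x\sim\pi_{\eta_i}}[\exp((\eta_{i+1}-\eta_i)H(x))]$ from oracle samples. This is an illustration only, not the paper's algorithm: `gibbs_sample` is a toy exact-sampling oracle over a small finite $\Omega$, and the uniform temperature schedule stands in for the paper's temperature-interval partitioning.

```python
import math
import random

def gibbs_sample(energies, eta, rng):
    """Toy oracle: exact sample of H(x) under pi_eta(x) ~ exp(eta*H(x))
    on a small finite state space given by its list of energy values."""
    weights = [math.exp(eta * h) for h in energies]
    r = rng.random() * sum(weights)
    acc = 0.0
    for h, w in zip(energies, weights):
        acc += w
        if acc >= r:
            return h
    return energies[-1]

def estimate_log_ratio(energies, eta_min, eta_max,
                       num_intervals, samples_per_interval, seed=0):
    """Telescoping product estimator for q = ln(Z(eta_max)/Z(eta_min)):
    q = sum_i ln E_{pi_{eta_i}}[exp((eta_{i+1}-eta_i) * H(x))]."""
    rng = random.Random(seed)
    etas = [eta_min + (eta_max - eta_min) * i / num_intervals
            for i in range(num_intervals + 1)]
    q = 0.0
    for eta_lo, eta_hi in zip(etas, etas[1:]):
        d = eta_hi - eta_lo
        # Monte Carlo estimate of the per-interval ratio via importance sampling.
        mean = sum(math.exp(d * gibbs_sample(energies, eta_lo, rng))
                   for _ in range(samples_per_interval)) / samples_per_interval
        q += math.log(mean)
    return q
```

With small interval widths each per-interval estimator has low variance, which is why the schedule (and, in the paper, how it is chosen non-adaptively or in two rounds) drives the sample complexity.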
Problem

Research questions and friction points this paper is trying to address.

Estimating partition function for Gibbs distributions
Improving non-adaptive algorithm sample efficiency
Reducing adaptivity rounds while maintaining accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Non-adaptive algorithm reduces sample complexity
Two-round adaptivity matches sequential complexity
Simplifies techniques with single estimator
David G. Harris
University of Maryland, Department of Computer Science
Vladimir Kolmogorov
IST Austria