Deviation Inequalities for Rényi Divergence Estimators via Variational Expression

📅 2025-08-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the lack of sharp probabilistic error analysis for Rényi divergence estimation. It establishes **non-asymptotic exponential deviation inequalities** for both smoothed plug-in estimators and neural estimators, under **general conditions that require neither compact support nor densities bounded away from zero**. Methodologically, it relates the estimation error to a suitable empirical process, combining variational representations of the divergence with tools from empirical process theory to derive one-sided concentration bounds. The main contributions are: (1) removing restrictive regularity assumptions on the underlying distributions imposed by prior work; (2) providing rigorous probabilistic error control over continuous alphabets, relevant to random-coding arguments and physical-layer security; and (3) delivering non-asymptotic performance guarantees for a hypothesis test that audits Rényi differential privacy with the neural estimator as its test statistic. These results strengthen the statistical foundations of information-theoretic estimation and privacy-preserving machine learning.
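To make the variational route concrete, below is the definition of the Rényi divergence of order α together with one common variational representation, stated here for α > 1 (a representative form in the style of the Rényi–Donsker–Varadhan formulas; the paper's exact parametrization may differ):

```latex
% Rényi divergence of order \alpha between P and Q with densities p, q:
%   R_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \int p^{\alpha} q^{1 - \alpha} \, d\mu .
% One variational representation for \alpha > 1; the supremum runs over
% bounded measurable g, and the optimizer is g^* = \log(dP/dQ) up to a constant:
\[
  R_\alpha(P \,\|\, Q)
  = \sup_{g} \left\{ \frac{\alpha}{\alpha - 1}
      \log \mathbb{E}_P\!\left[ e^{(\alpha - 1)\, g} \right]
      - \log \mathbb{E}_Q\!\left[ e^{\alpha\, g} \right] \right\}.
\]
```

Replacing the expectations with empirical averages and restricting g to a neural network class gives a neural estimator; the estimation error then decomposes into suprema of empirical processes, which is where the empirical-process machinery applies.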

📝 Abstract
Rényi divergences play a pivotal role in information theory, statistics, and machine learning. While several estimators of these divergences have been proposed in the literature with their consistency properties established and minimax convergence rates quantified, existing accounts of probabilistic bounds governing the estimation error are premature. Here, we make progress in this regard by establishing exponential deviation inequalities for smoothed plug-in estimators and neural estimators by relating the error to an appropriate empirical process and leveraging tools from empirical process theory. In particular, our approach does not require the underlying distributions to be compactly supported or have densities bounded away from zero, an assumption prevalent in existing results. The deviation inequality also leads to a one-sided concentration bound from the expectation, which is useful in random-coding arguments over continuous alphabets in information theory with potential applications to physical-layer security. As another concrete application, we consider a hypothesis testing framework for auditing Rényi differential privacy using the neural estimator as a test statistic and obtain non-asymptotic performance guarantees for such a test.
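As a minimal sketch of how such a neural estimator can be implemented, assuming the variational form shown earlier (the class and function names, network size, and training settings below are illustrative choices, not the paper's):

```python
# Minimal sketch of a neural Rényi divergence estimator for alpha > 1,
# maximizing the empirical variational objective over a small critic network.
# Names, architecture, and hyperparameters are illustrative, not the paper's.
import math
import torch
import torch.nn as nn

class RenyiCritic(nn.Module):
    """Critic g: R^d -> R meant to approximate g* = log(dP/dQ)."""
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)

def renyi_objective(g_p: torch.Tensor, g_q: torch.Tensor, alpha: float) -> torch.Tensor:
    """Empirical version of the variational objective for alpha > 1:
    (alpha/(alpha-1)) * log E_P[e^{(alpha-1) g}] - log E_Q[e^{alpha g}]."""
    log_ep = torch.logsumexp((alpha - 1.0) * g_p, dim=0) - math.log(g_p.shape[0])
    log_eq = torch.logsumexp(alpha * g_q, dim=0) - math.log(g_q.shape[0])
    return (alpha / (alpha - 1.0)) * log_ep - log_eq

def neural_renyi_estimate(x_p: torch.Tensor, x_q: torch.Tensor,
                          alpha: float = 2.0, steps: int = 2000,
                          lr: float = 1e-3) -> float:
    """Fit the critic on samples x_p ~ P, x_q ~ Q (each of shape [n, d])
    and return the resulting divergence estimate."""
    critic = RenyiCritic(x_p.shape[1])
    opt = torch.optim.Adam(critic.parameters(), lr=lr)
    for _ in range(steps):
        loss = -renyi_objective(critic(x_p), critic(x_q), alpha)
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        return renyi_objective(critic(x_p), critic(x_q), alpha).item()
```

In practice one would fit the critic on a training split and evaluate the objective on held-out samples to limit optimization bias; the paper's deviation inequalities are what control the gap between such an estimate and the true divergence.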
Problem

Research questions and friction points this paper is trying to address.

Estimate Rényi divergences with non-asymptotic probabilistic error bounds (a schematic form of such a bound is sketched after this list)
Remove restrictive assumptions (compact support, densities bounded away from zero) imposed on the underlying distributions by prior work
Apply the results to Rényi differential privacy auditing and random-coding arguments in information theory
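Schematically, the kind of bound sought here has the following shape (the constants and the exact dependence on ε are assumption-dependent placeholders, not the paper's expressions):

```latex
% Schematic exponential deviation inequality for an estimator \widehat{R}_{\alpha,n}
% built from n samples; C, c and the \epsilon-rate are placeholders.
\[
  \Pr\!\left( \left| \widehat{R}_{\alpha,n} - R_\alpha(P \,\|\, Q) \right| \ge \epsilon \right)
  \le C\, e^{-c\, n\, \epsilon^{2}}, \qquad \epsilon > 0 .
\]
```

Per the abstract, such a deviation inequality also yields a one-sided concentration bound around the expectation of the estimator.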
Innovation

Methods, ideas, or system contributions that make the work stand out.

Exponential deviation inequalities for smoothed plug-in and neural Rényi divergence estimators
Error analysis via variational representations and empirical process theory
Neural estimator used as a test statistic for auditing Rényi differential privacy (a toy version of this test is sketched after this list)
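A toy version of the auditing idea, under the assumption that an (α, ε)-RDP mechanism keeps the order-α Rényi divergence between output distributions on adjacent datasets at most ε (the decision rule, slack term, and names below are hypothetical, not the paper's calibrated test):

```python
# Toy Rényi-DP audit built on the neural estimator sketched earlier.
# The fixed `slack` is a hypothetical stand-in for the threshold that a
# deviation inequality would justify; the paper derives a calibrated test.
import torch

def audit_rdp(mechanism, data, data_adjacent, alpha=2.0,
              eps_claimed=1.0, n_samples=5000, slack=0.1):
    """Flag a claimed (alpha, eps)-RDP guarantee as violated when the
    estimated divergence between the mechanism's output distributions
    on adjacent datasets exceeds eps_claimed + slack.
    Assumes `mechanism` is randomized and returns a 1-D torch.Tensor."""
    x_p = torch.stack([mechanism(data) for _ in range(n_samples)])
    x_q = torch.stack([mechanism(data_adjacent) for _ in range(n_samples)])
    est = neural_renyi_estimate(x_p, x_q, alpha=alpha)  # from the sketch above
    return est > eps_claimed + slack, est
```

The soundness of such a rule hinges on the deviation inequality: it bounds the probability that the estimate overshoots the true divergence by more than the slack.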