Strong Data Processing Properties of Rényi Divergences via Pinsker-type Inequalities

📅 2025-01-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
The paper studies discrete distributions processed by a fixed channel and uses Rényi-divergence and f-divergence inequalities to improve Pinsker's inequality for specific input distributions and channels, strengthening Rényi local differential privacy.

📝 Abstract
We investigate strong data processing inequalities (SDPIs) of the Rényi divergence between two discrete distributions when both distributions are passed through a fixed channel. We provide a condition on the channel for which the DPI holds with equality given two arbitrary distributions in the probability simplex. Motivated by this, we examine the contraction behavior for restricted sets of prior distributions via $f$-divergence inequalities: We provide an alternative proof of the optimal reverse Pinsker's inequality for Rényi divergences first shown by Binette. We further present an improved Pinsker's inequality for Rényi divergence based on the joint range technique by Harremoës and Vajda. The presented bound is tight whenever the value of the total variation distance is larger than $\frac{1}{\alpha}$. By framing these inequalities in a cross-channel setting, we arrive at SDPIs that can be adapted to use-case specific restrictions of input distribution and channel. We apply these results to the Rényi local differential privacy amplification through post-processing by channels that satisfy no local differential privacy guarantee.
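The data processing inequality discussed in the abstract states that passing two distributions through the same channel can only shrink (never grow) the Rényi divergence between them. A minimal numerical sketch, using the standard Rényi-divergence formula for discrete distributions and an arbitrary illustrative channel matrix (not taken from the paper):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p || q) for discrete distributions, alpha != 1."""
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

# Two input distributions on a 3-symbol alphabet (illustrative values).
p = np.array([0.6, 0.3, 0.1])
q = np.array([0.2, 0.5, 0.3])

# A fixed channel as a row-stochastic matrix K; the output distribution is p @ K.
K = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])

alpha = 2.0
d_in = renyi_divergence(p, q, alpha)
d_out = renyi_divergence(p @ K, q @ K, alpha)

# DPI: divergence at the channel output never exceeds divergence at the input.
assert d_out <= d_in
```

A *strong* DPI, as studied in the paper, quantifies how much smaller `d_out` must be than `d_in` (a contraction coefficient strictly below 1) for restricted input sets or channels; the check above only verifies the ordinary inequality.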
Problem

Research questions and friction points this paper is trying to address.

Rényi Divergence
Data Processing Inequality
Rényi Local Differential Privacy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Rényi divergence
Pinsker inequality
Rényi differential privacy