Nonparametric Variational Differential Privacy via Embedding Parameter Clipping

📅 2026-03-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
In nonparametric variational differentially private models, latent representations tend to drift toward high-information regions, resulting in weakened privacy guarantees, unstable training, and degraded utility. This work proposes a theoretically grounded embedding parameter clipping strategy, derived from a Rényi divergence optimization perspective. By imposing constraints on the posterior mean, variance, and mixture weights, the method minimizes an upper bound on the Rényi divergence. This approach unifies privacy enhancement with performance improvement: it tightens the Rényi divergence bound while improving the privacy-utility trade-off across multiple downstream tasks.
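For context, the Rényi divergence referenced throughout is the standard order-\(\alpha\) divergence, and its role in privacy accounting follows the usual Rényi differential privacy (RDP) definition; the exact bound used in this paper is not reproduced here, only the standard definitions it builds on:

```latex
% Rényi divergence of order \alpha > 1 between distributions P and Q:
D_{\alpha}(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}
  \log \mathbb{E}_{x \sim Q}\!\left[\left(\frac{p(x)}{q(x)}\right)^{\alpha}\right].

% A randomized mechanism M satisfies (\alpha, \varepsilon)-RDP if, for all
% adjacent datasets D and D':
D_{\alpha}\!\bigl(M(D) \,\|\, M(D')\bigr) \;\le\; \varepsilon .
```

Under this definition, any clipping scheme that bounds \(D_{\alpha}\) between the posteriors induced by adjacent inputs directly bounds the achievable privacy level \(\varepsilon\).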

📝 Abstract
The nonparametric variational information bottleneck (NVIB) provides the foundation for nonparametric variational differential privacy (NVDP), a framework for building privacy-preserving language models. However, the learned latent representations can drift into regions with high information content, leading not only to weak privacy guarantees but also to low utility caused by numerical instability during training. In this work, we introduce a principled parameter clipping strategy that directly addresses this issue. Our method is mathematically derived from the objective of minimizing the Rényi Divergence (RD) upper bound, yielding specific, theoretically grounded constraints on the posterior mean, variance, and mixture weight parameters. We apply our technique to an NVIB-based model and empirically compare it against an unconstrained baseline. Our findings demonstrate that the clipped model consistently achieves tighter RD bounds, implying stronger privacy, while simultaneously attaining higher performance on several downstream tasks. This work presents a simple yet effective method for improving the privacy-utility trade-off in variational models, making them more robust and practical.
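The paper does not publish its exact constraint values, but the general shape of such a clipping step can be sketched. The sketch below is a minimal NumPy illustration, not the authors' implementation: the function name `clip_posterior_params` and all bounds (`mu_max`, the log-variance range, `weight_floor`) are hypothetical placeholders for the theoretically derived constraints on the posterior mean, variance, and mixture weights.

```python
import numpy as np

def clip_posterior_params(mu, log_var, weights,
                          mu_max=1.0, log_var_min=-4.0, log_var_max=2.0,
                          weight_floor=1e-3):
    """Clip mixture-posterior parameters to a bounded region.

    mu:      (K, d) component means
    log_var: (K, d) component log-variances
    weights: (K,)   mixture weights (nonnegative, summing to 1)
    All bounds are illustrative, not the paper's derived constants.
    """
    # Rescale each mean so its L2 norm is at most mu_max
    # (bounds how far a representation can drift from the origin).
    norms = np.linalg.norm(mu, axis=-1, keepdims=True)
    mu_c = mu * np.minimum(1.0, mu_max / np.maximum(norms, 1e-12))

    # Clamp log-variances to keep variances away from 0 and infinity,
    # which also avoids numerical instability during training.
    log_var_c = np.clip(log_var, log_var_min, log_var_max)

    # Floor the mixture weights and renormalize so no component
    # collapses to zero mass.
    w = np.maximum(weights, weight_floor)
    w_c = w / w.sum()

    return mu_c, log_var_c, w_c
```

Keeping all three parameter groups in a compact set is what makes a finite Rényi divergence bound between posteriors possible; an unconstrained mean or a vanishing variance would let the divergence grow without limit.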
Problem

Research questions and friction points this paper is trying to address.

nonparametric variational differential privacy
latent representation drift
privacy-utility trade-off
numerical instability
Rényi Divergence
Innovation

Methods, ideas, or system contributions that make the work stand out.

Nonparametric Variational Differential Privacy
Parameter Clipping
Rényi Divergence
Variational Information Bottleneck
Privacy-Utility Trade-off