Monitoring Violations of Differential Privacy over Time

📅 2025-09-24
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
To address the challenges of sustaining long-term audits of differential privacy (DP) mechanisms, the low sampling efficiency of static auditing methods, and the diminishing reliability of audits over time, this paper proposes the first sustainable monitoring framework specifically designed for DP. The framework dynamically accumulates audit evidence from historical audit logs and integrates statistical hypothesis testing with DP theory to establish a formally verifiable correctness guarantee over time. It employs adaptive sampling and online parameter updating to substantially reduce sampling overhead while preserving high-accuracy estimation of privacy parameters. Theoretical analysis establishes the formal soundness of the approach. Extensive experiments across multiple canonical DP mechanisms demonstrate its effectiveness: compared to static auditing, it reduces sampling requirements by over 60%, consumes the sampling budget more efficiently, and exhibits strong robustness and practicality.
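The core idea of pooling audit evidence across deployment history can be illustrated with a toy sketch. This is not the paper's algorithm: it uses a standard empirical-epsilon audit (distinguishing outputs of the Laplace mechanism on neighbouring inputs) and simply pools the distinguishing counts across epochs, drawing fewer fresh samples each round as evidence accumulates. All function names and the sampling schedule are illustrative assumptions.

```python
import math
import random

def laplace_mechanism(value, epsilon, sensitivity=1.0):
    """Laplace mechanism: add noise with scale sensitivity/epsilon (inverse-CDF sampling)."""
    u = random.random() - 0.5
    b = sensitivity / epsilon
    return value - b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def audit_epoch(mechanism, n_samples, threshold=0.5):
    """One static audit round on the neighbouring inputs 0 and 1, using the
    naive distinguishing attack 'output > threshold'. Returns hit counts."""
    tp = sum(mechanism(1.0) > threshold for _ in range(n_samples))
    fp = sum(mechanism(0.0) > threshold for _ in range(n_samples))
    return tp, fp

def monitor(mechanism, epochs=10, base_samples=2000):
    """Toy monitor: pool counts from the whole audit history, so later epochs
    draw fewer fresh samples while the pooled epsilon estimate stays stable."""
    tp_total = fp_total = n_total = 0
    estimates = []
    for epoch in range(epochs):
        n = max(base_samples // (epoch + 1), 200)  # shrinking per-epoch sampling schedule
        tp, fp = audit_epoch(mechanism, n)
        tp_total += tp; fp_total += fp; n_total += n
        # Empirical epsilon lower bound from pooled rates (add-one smoothing).
        tpr = (tp_total + 1) / (n_total + 2)
        fpr = (fp_total + 1) / (n_total + 2)
        estimates.append(math.log(tpr / fpr))
    return estimates

random.seed(0)
estimates = monitor(lambda x: laplace_mechanism(x, epsilon=1.0))
```

For the Laplace mechanism with epsilon = 1 and this attack, the pooled estimate settles near the analytic value ln((1 - 0.5e^{-0.5}) / (0.5e^{-0.5})) ≈ 0.83, even though later epochs draw far fewer fresh samples than the first. The paper's actual framework replaces this naive pooling with formal sequential hypothesis testing and adaptive parameter updates.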

πŸ“ Abstract
Auditing differential privacy has emerged as an important area of research that supports the design of privacy-preserving mechanisms. Privacy audits help to obtain empirical estimates of the privacy parameter, to expose flawed implementations of algorithms and to compare practical with theoretical privacy guarantees. In this work, we investigate an unexplored facet of privacy auditing: the sustained auditing of a mechanism that can go through changes during its development or deployment. Monitoring the privacy of algorithms over time comes with specific challenges. Running state-of-the-art (static) auditors repeatedly requires excessive sampling efforts, while the reliability of such methods deteriorates over time without proper adjustments. To overcome these obstacles, we present a new monitoring procedure that extracts information from the entire deployment history of the algorithm. This allows us to reduce sampling efforts, while sustaining reliable outcomes of our auditor. We derive formal guarantees with regard to the soundness of our methods and evaluate their performance for important mechanisms from the literature. Our theoretical findings and experiments demonstrate the efficacy of our approach.
Problem

Research questions and friction points this paper is trying to address.

Auditing differential privacy violations over time with changing mechanisms
Reducing sampling effort for sustained privacy monitoring of algorithms
Ensuring reliable privacy guarantees throughout algorithm deployment history
Innovation

Methods, ideas, or system contributions that make the work stand out.

Monitors privacy violations across algorithm development timeline
Reduces sampling efforts by leveraging deployment history data
Provides formal guarantees for sustained auditing reliability