Differential privacy from axioms

📅 2025-11-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates whether privacy definitions weaker than differential privacy (DP) can retain practical utility. Specifically, it addresses the fundamental question of whether relaxing worst-case privacy guarantees can improve efficiency or broaden applicability. Method: The authors identify and formalize four minimal, intuitively necessary axioms for any reasonable privacy measure: pre-processing invariance, prohibition of blatant non-privacy, strong composition, and linear scalability. Contribution/Results: They prove that, even in the statistical setting, any privacy measure satisfying this axiom set is equivalent to DP up to polynomial factors in sample complexity; conversely, dropping any single axiom admits an ill-behaved privacy measure that is not equivalent to DP. This constitutes the first axiomatic characterization establishing DP's indispensability: not an arbitrarily conservative design choice, but the unique measure (up to this equivalence) satisfying natural privacy intuitions and compositional requirements. The result formally confirms DP's foundational role in privacy-preserving data analysis.
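
For reference, the standard (ε, δ)-DP guarantee and the composition behavior that the axioms abstract can be sketched as follows; this is standard background, not the paper's own formalization, which may differ in details. A mechanism $M$ is $(\varepsilon, \delta)$-differentially private if for all neighboring datasets $D, D'$ (differing in a single record) and all events $S$,
\[
\Pr[M(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[M(D') \in S] + \delta .
\]
Basic composition states that if $M_1$ is $(\varepsilon_1, \delta_1)$-DP and $M_2$ is $(\varepsilon_2, \delta_2)$-DP, then the joint release $(M_1, M_2)$ is $(\varepsilon_1 + \varepsilon_2,\, \delta_1 + \delta_2)$-DP, and post-processing a DP output cannot weaken the guarantee; this is, roughly, the kind of behavior the composition axiom asks any privacy measure to respect.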

📝 Abstract
Differential privacy (DP) is the de facto notion of privacy both in theory and in practice. However, despite its popularity, DP imposes strict requirements which guard against strong worst-case scenarios. For example, it guards against seemingly unrealistic scenarios where an attacker has full information about all but one point in the data set, and still nothing can be learned about the remaining point. While preventing such a strong attack is desirable, many works have explored whether average-case relaxations of DP are easier to satisfy [HWR13,WLF16,BF16,LWX23]. In this work, we are motivated by the question of whether alternate, weaker notions of privacy are possible: can a weakened privacy notion still guarantee some basic level of privacy, and on the other hand, achieve privacy more efficiently and/or for a substantially broader set of tasks? Our main result shows the answer is no: even in the statistical setting, any reasonable measure of privacy satisfying nontrivial composition is equivalent to DP. To prove this, we identify a core set of four axioms or desiderata: pre-processing invariance, prohibition of blatant non-privacy, strong composition, and linear scalability. Our main theorem shows that any privacy measure satisfying our axioms is equivalent to DP, up to polynomial factors in sample complexity. We complement this result by showing our axioms are minimal: removing any one of our axioms enables ill-behaved measures of privacy.
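
As a concrete illustration of the worst-case guarantee the abstract describes, the Laplace mechanism for a counting query is the textbook example of an ε-DP mechanism. The Python sketch below is standard background, not a mechanism from this paper; the function name and parameters are illustrative only.

import numpy as np

def laplace_count(data, predicate, epsilon, rng=None):
    # Counting queries have sensitivity 1: adding or removing one record
    # changes the count by at most 1, so Laplace noise with scale 1/epsilon
    # yields an epsilon-DP release (standard fact, not specific to this paper).
    rng = rng or np.random.default_rng()
    true_count = sum(1 for x in data if predicate(x))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Example: release how many records are at least 40, with epsilon = 0.5.
ages = [23, 35, 41, 52, 29, 67, 44]
print(laplace_count(ages, lambda a: a >= 40, epsilon=0.5))

Even an attacker who knows every other record learns little about the remaining one from such a release; this is the strong worst-case scenario the abstract refers to.
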
Problem

Research questions and friction points this paper is trying to address.

Asks whether privacy notions weaker than differential privacy can still guarantee a basic level of privacy.
Investigates whether such alternatives could achieve privacy more efficiently or for a substantially broader set of tasks.
Shows the answer is negative: any reasonable privacy measure satisfying nontrivial composition is equivalent to differential privacy.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proves that any weakened privacy notion satisfying the axioms is equivalent to differential privacy
Identifies four core axioms: pre-processing invariance, prohibition of blatant non-privacy, strong composition, and linear scalability
Shows equivalence to DP up to polynomial factors in sample complexity, and that the axiom set is minimal: removing any one axiom admits ill-behaved privacy measures
Authors
Guy Blanc, Stanford University
William Pires, Columbia University
Toniann Pitassi, Columbia University