A Tight Context-aware Privacy Bound for Histogram Publication

📅 2025-08-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the privacy guarantees of the Laplace mechanism for histogram publication. It proposes a context-aware privacy analysis based on pointwise maximal leakage (PML), departing from the distribution-agnostic assumptions of standard differential privacy. By incorporating prior knowledge of the data distribution, the framework derives tighter privacy bounds: when every histogram bin probability is bounded away from zero, strictly stronger privacy holds at the same noise level. Theoretical analysis and empirical evaluation demonstrate that the tighter bounds come without sacrificing utility, yielding a better privacy–utility trade-off. The core contribution is the systematic integration of data-distribution knowledge into PML-based analysis, a paradigm for context-sensitive privacy quantification.

📝 Abstract
We analyze the privacy guarantees of the Laplace mechanism releasing the histogram of a dataset through the lens of pointwise maximal leakage (PML). While differential privacy is commonly used to quantify the privacy loss, it is a context-free definition that does not depend on the data distribution. In contrast, PML enables a more refined analysis by incorporating assumptions about the data distribution. We show that when the probability of each histogram bin is bounded away from zero, stronger privacy protection can be achieved for a fixed level of noise. Our results demonstrate the advantage of context-aware privacy measures and show that incorporating assumptions about the data can improve privacy-utility tradeoffs.
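
For reference, the object of study is the standard Laplace mechanism for histogram release. Below is a minimal sketch of that baseline (illustrative, not code from the paper), assuming the add/remove-one-record neighbouring relation under which a histogram has L1 sensitivity 1:

```python
import numpy as np

def laplace_histogram(data, bins, epsilon, sensitivity=1.0, rng=None):
    """Release a histogram of `data` with i.i.d. Laplace noise on each bin.

    Under add/remove-one-record neighbouring, changing one record changes
    a single bin count by 1, so the histogram has L1 sensitivity 1 and the
    release satisfies epsilon-differential privacy.
    """
    rng = np.random.default_rng() if rng is None else rng
    counts, edges = np.histogram(data, bins=bins)
    scale = sensitivity / epsilon   # Laplace scale b = sensitivity / epsilon
    noisy = counts + rng.laplace(0.0, scale, size=counts.shape)
    return noisy, edges

# Example: a 10-bin histogram of 1,000 samples released at epsilon = 1.
rng = np.random.default_rng(0)
noisy_counts, edges = laplace_histogram(rng.normal(size=1_000), bins=10,
                                        epsilon=1.0, rng=rng)
```
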
Problem

Research questions and friction points this paper is trying to address.

Analyzing the privacy guarantees of the Laplace mechanism for histogram publication
Incorporating data-distribution assumptions via pointwise maximal leakage (defined below)
Improving the privacy–utility trade-off through context-aware privacy measures
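
For context, pointwise maximal leakage quantifies the privacy loss incurred by each released outcome y, rather than a single worst case over all outputs. A standard form of the definition from the PML literature (written in our own notation, not a formula quoted from this paper) is:

```latex
% Pointwise maximal leakage (PML) of an outcome y about the data X,
% given a prior P_X and a mechanism P_{Y|X}:
\ell(X \to y) = \log \max_{x \,:\, P_X(x) > 0} \frac{P_{Y|X}(y \mid x)}{P_Y(y)},
\qquad
P_Y(y) = \sum_{x} P_X(x)\, P_{Y|X}(y \mid x)
```

Because the output distribution P_Y in the denominator depends on the prior P_X, assumptions such as bin probabilities bounded away from zero directly tighten the leakage; this is the lever the paper exploits.
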
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses pointwise maximal leakage for the privacy analysis
Incorporates data-distribution assumptions for a refined analysis
Achieves strictly stronger protection when bin probabilities are bounded away from zero (illustrated in the sketch below)
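
To see the effect numerically, the toy computation below (an illustration under simplifying assumptions, not the paper's analysis) evaluates the worst-case PML of a single Laplace-noised bin whose true count takes one of two neighbouring values, with both prior probabilities bounded away from zero. The function name and the two-point prior are our own constructions:

```python
import numpy as np

def pml_two_point_laplace(eps, p0, grid=None):
    """Worst-case PML (in nats) of Y = X + Laplace(1/eps) when the bin
    count X takes the neighbouring values {0, 1} with prior (p0, 1 - p0).

    PML(X -> y) = log( max_x f(y|x) / f_Y(y) ),  f_Y(y) = sum_x P(x) f(y|x).
    """
    grid = np.linspace(-10.0, 11.0, 200_001) if grid is None else grid
    b = 1.0 / eps                                    # Laplace scale for sensitivity 1
    f0 = np.exp(-np.abs(grid - 0.0) / b) / (2 * b)   # density of Y given X = 0
    f1 = np.exp(-np.abs(grid - 1.0) / b) / (2 * b)   # density of Y given X = 1
    fy = p0 * f0 + (1.0 - p0) * f1                   # output density under the prior
    return float(np.log(np.maximum(f0, f1) / fy).max())

eps, p_min = 1.0, 0.2
pml = pml_two_point_laplace(eps, p0=p_min)
# Closed form for this toy: log(1 / (p_min + (1 - p_min) * exp(-eps))) < eps.
print(f"context-free DP guarantee: {eps:.3f} nats, context-aware PML: {pml:.3f} nats")
```

With eps = 1 and p_min = 0.2, the numerical maximum is about 0.70 nats, matching the closed form log(1/(p_min + (1 - p_min)e^{-eps})) and strictly below the context-free guarantee of 1 nat. This is the flavour of improvement the paper formalizes for full histograms.
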
Sara Saeidian
Postdoc, KTH Royal Institute of Technology
Privacy-preserving learning, information-theoretic privacy
Ata Yavuzyılmaz
Division of Information Science and Engineering (ISE), School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, 100 44 Stockholm, Sweden
Leonhard Grosse
Division of Information Science and Engineering (ISE), School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, 100 44 Stockholm, Sweden
Georg Schuppe
SEBx, SEB
Tobias J. Oechtering
Division of Information Science and Engineering, KTH Royal Institute of Technology
Information Theory, Privacy and Security, Learning and Decision Theory, Statistical Signal Processing, Communications