Particle Filter for Bayesian Inference on Privatized Data

📅 2025-05-01
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Bayesian inference under differential privacy (DP) faces several challenges: posterior distortion from noise perturbation, slow convergence, poor mixing, and low acceptance rates in existing methods. To address these, this paper introduces particle filtering into the private Bayesian inference framework for the first time, proposing a novel DP-aware particle filtering algorithm. The method directly approximates the posterior distribution given noisy, privatized summary statistics without requiring strong model assumptions or restrictive prior specifications. It achieves consistent estimation, quantifies Monte Carlo error, and enables construction of asymptotically valid confidence intervals. Theoretical analysis, extensive simulations, and empirical validation on 2021 Canadian Census microdata demonstrate that the algorithm significantly improves sampling efficiency and convergence speed, overcoming key performance bottlenecks of current MCMC-based private inference approaches.

📝 Abstract
Differential Privacy (DP) is a probabilistic framework that protects privacy while preserving data utility. To protect the privacy of the individuals in the dataset, DP requires adding a precise amount of noise to a statistic of interest; however, this noise addition alters the resulting sampling distribution, making statistical inference challenging. One of the main DP goals in Bayesian analysis is to make statistical inference based on the private posterior distribution. While existing methods have strengths in specific conditions, they can be limited by poor mixing, strict assumptions, or low acceptance rates. We propose a novel particle filtering algorithm, which features (i) consistent estimates, (ii) Monte Carlo error estimates and asymptotic confidence intervals, (iii) computational efficiency, and (iv) accommodation to a wide variety of priors, models, and privacy mechanisms with minimal assumptions. We empirically evaluate our algorithm through a variety of simulation settings as well as an application to a 2021 Canadian census dataset, demonstrating the efficacy and adaptability of the proposed sampler.
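The abstract's premise is that DP requires adding a calibrated amount of noise to a released statistic. A minimal sketch of this idea, using the standard Laplace mechanism (not specific to this paper), is shown below; the function name, clamping bounds, and parameter values are illustrative assumptions:

```python
# Hedged sketch of the Laplace mechanism: to release a mean under
# epsilon-DP, add Laplace noise scaled to the statistic's sensitivity.
# (Illustrative example, not code from the paper.)
import numpy as np

def laplace_mechanism(data, epsilon, lower=0.0, upper=1.0, rng=None):
    """Release an epsilon-DP estimate of the mean of data clamped to [lower, upper]."""
    rng = np.random.default_rng() if rng is None else rng
    clamped = np.clip(data, lower, upper)
    true_mean = clamped.mean()
    # Sensitivity of the mean of n values bounded in [lower, upper]:
    # changing one record moves the mean by at most (upper - lower) / n.
    sensitivity = (upper - lower) / len(clamped)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_mean + noise

rng = np.random.default_rng(0)
data = rng.uniform(size=1000)
private_mean = laplace_mechanism(data, epsilon=1.0, rng=rng)
```

The added noise is exactly what alters the sampling distribution of the released statistic, which is the inferential difficulty the paper targets.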
Problem

Research questions and friction points this paper is trying to address.

Performing Bayesian inference on differentially private data
Addressing challenges from noise-altered sampling distributions
Overcoming limitations of existing DP inference methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Particle filtering for private Bayesian inference
Monte Carlo error estimates with confidence intervals
Efficient computation with minimal assumptions
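To make the particle-filtering idea concrete, here is a generic importance-sampling sketch (not the paper's actual algorithm): particles are drawn from the prior, synthetic summary statistics are simulated for each, and weights come from the known privacy noise density evaluated at the observed privatized statistic. All distributions, parameter values, and the function name are illustrative assumptions:

```python
# Illustrative particle approximation to a private posterior
# (a simplified sketch, not the algorithm proposed in the paper).
# Model: data ~ N(theta, 1); released statistic is the sample mean
# plus Laplace noise with scale sensitivity / epsilon.
import numpy as np

def private_posterior_particles(s_priv, n, epsilon, sensitivity,
                                n_particles=5000, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    noise_scale = sensitivity / epsilon          # Laplace noise scale
    theta = rng.normal(0.0, 2.0, n_particles)    # particles from a N(0, 2^2) prior
    # Simulate a synthetic (non-private) sample mean for each particle.
    s_sim = rng.normal(theta, 1.0 / np.sqrt(n))
    # Weight by the Laplace density of the observed private statistic
    # given each simulated statistic (log scale for stability).
    log_w = -np.abs(s_priv - s_sim) / noise_scale
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # Resample to obtain an approximately unweighted particle set.
    idx = rng.choice(n_particles, size=n_particles, p=w)
    return theta[idx]

rng = np.random.default_rng(1)
n, epsilon = 500, 1.0
data = rng.normal(1.0, 1.0, n)
sensitivity = 8.0 / n                            # assumes data clamped to [-4, 4]
s_priv = data.mean() + rng.laplace(scale=sensitivity / epsilon)
particles = private_posterior_particles(s_priv, n, epsilon, sensitivity, rng=rng)
```

Because the privacy mechanism's noise density is known, particles are weighted rather than accepted or rejected, which is one intuition for why such samplers can avoid the low acceptance rates of rejection-based MCMC approaches.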
Yu-Wei Chen
Department of Statistics, Purdue University, West Lafayette, IN 47907
Pranav Sanghi
Department of Computer Science, Purdue University, West Lafayette, IN 47907
Jordan Awan
Assistant Professor, University of Pittsburgh
Differential Privacy · Statistics · Voice Analysis · Discrete Mathematics