Differentially Private Two-Stage Gradient Descent for Instrumental Variable Regression

📅 2025-09-26
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the instrumental variable regression (IVAR) problem under differential privacy constraints, where conventional methods suffer from privacy leakage and statistical inefficiency due to direct use of sensitive covariates and instruments. We propose a noisy two-stage gradient descent algorithm that injects calibrated noise—designed via the ρ-zCDP mechanism—into gradient updates. To our knowledge, this is the first method for linear IV models that simultaneously provides rigorous privacy guarantees and provable finite-sample convergence rates. Theoretical analysis precisely characterizes the trade-off among privacy budget ε, sample size n, and iteration complexity. Empirical evaluation on both synthetic and real-world datasets demonstrates superior privacy-accuracy trade-offs compared to existing approaches.

Technology Category

Application Category

📝 Abstract
We study instrumental variable regression (IVaR) under differential privacy constraints. Classical IVaR methods (like two-stage least squares regression) rely on solving moment equations that directly use sensitive covariates and instruments, creating significant risks of privacy leakage and posing challenges in designing algorithms that are both statistically efficient and differentially private. We propose a noisy two-state gradient descent algorithm that ensures $ρ$-zero-concentrated differential privacy by injecting carefully calibrated noise into the gradient updates. Our analysis establishes finite-sample convergence rates for the proposed method, showing that the algorithm achieves consistency while preserving privacy. In particular, we derive precise bounds quantifying the trade-off among privacy parameters, sample size, and iteration-complexity. To the best of our knowledge, this is the first work to provide both privacy guarantees and provable convergence rates for instrumental variable regression in linear models. We further validate our theoretical findings with experiments on both synthetic and real datasets, demonstrating that our method offers practical accuracy-privacy trade-offs.
Problem

Research questions and friction points this paper is trying to address.

Achieving differential privacy in instrumental variable regression methods
Balancing statistical efficiency with privacy protection in sensitive data
Developing private gradient descent with provable convergence and privacy guarantees
Innovation

Methods, ideas, or system contributions that make the work stand out.

Noisy two-stage gradient descent algorithm
Calibrated noise injection for privacy
Finite-sample convergence rate guarantees
🔎 Similar Papers
No similar papers found.