Federated Learning of Quantile Inference under Local Differential Privacy

📅 2025-09-25
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses quantile inference in federated learning under local differential privacy (LDP). To handle data heterogeneity and personalized privacy budgets, we propose a communication-efficient and statistically optimal framework: a robust estimator based on local stochastic gradient descent (Local SGD) for the nonsmooth quantile loss; the first asymptotic normality and functional central limit theorem for such estimators under LDP constraints; and a self-normalized inference procedure that constructs confidence intervals without auxiliary parameter estimation. Theoretically, the estimator achieves the minimax optimal statistical efficiency under LDP. Empirically, it attains high estimation accuracy, strong privacy protection (with ε ≤ 2), and low communication overhead on both synthetic and real-world datasets.

Technology Category

Application Category

📝 Abstract
In this paper, we investigate federated learning for quantile inference under local differential privacy (LDP). We propose an estimator based on local stochastic gradient descent (SGD), whose local gradients are perturbed via a randomized mechanism with global parameters, making the procedure tolerant of communication and storage constraints without compromising statistical efficiency. Although the quantile loss and its corresponding gradient do not satisfy standard smoothness conditions typically assumed in existing literature, we establish asymptotic normality for our estimator as well as a functional central limit theorem. The proposed method accommodates data heterogeneity and allows each server to operate with an individual privacy budget. Furthermore, we construct confidence intervals for the target value through a self-normalization approach, thereby circumventing the need to estimate additional nuisance parameters. Extensive numerical experiments and real data application validate the theoretical guarantees of the proposed methodology.
Problem

Research questions and friction points this paper is trying to address.

Federated quantile inference under local differential privacy
Handling non-smooth quantile loss with asymptotic normality
Constructing confidence intervals without nuisance parameter estimation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated learning with local differential privacy
Local SGD with perturbed gradients mechanism
Self-normalization for confidence intervals construction
🔎 Similar Papers
L
Leheng Cai
Department of Statistics and Data Science, Tsinghua University
Q
Qirui Hu
School of Statistics and Data Science, Shanghai University of Finance and Economics
Shuyuan Wu
Shuyuan Wu
School of Statistics and Data Science, Shanghai University of Finance and Economics
Large Dataset AnalysisSubsamplingDistributed Computing