Enhancing Gradient Variance and Differential Privacy in Quantum Federated Learning

📅 2025-09-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
In quantum federated learning (QFL), deploying quantum neural networks (QNNs) as local models faces three key challenges: insufficient gradient variance leading to premature convergence at local minima, weak differential privacy (DP) guarantees risking gradient leakage, and severe intermediate quantum noise degrading both model accuracy and convergence. To address these, we propose a DP-compliant QFL framework integrating adaptive noise generation. Our approach introduces (i) the first intermediate quantum noise estimation and suppression strategy; (ii) a gradient-aware adaptive noise injection mechanism that mitigates vanishing gradient variance while enhancing robustness; and (iii) secure aggregation under strict $(\varepsilon,\delta)$-differential privacy. Experiments on MNIST and CIFAR-10 achieve 98.47% and 83.85% test accuracy, respectively—outperforming state-of-the-art QFL methods—with 32% faster convergence and 41% reduced communication overhead.
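The gradient-aware adaptive noise injection described above can be sketched as DP-style per-sample gradient clipping followed by Gaussian noise whose scale grows when the empirical gradient variance collapses. This is a minimal illustration, not the paper's algorithm: the function name, the inverse-square-root scaling rule, and all parameter values are assumptions.

```python
import numpy as np

def adaptive_dp_update(per_sample_grads, clip_norm=1.0, sigma_base=1.0,
                       var_floor=1e-6, max_boost=10.0, rng=None):
    """Hedged sketch of gradient-aware adaptive noise injection.

    Clips each per-sample gradient to L2 norm <= clip_norm (bounding
    sensitivity), then adds Gaussian noise to the averaged gradient with a
    scale that increases as the empirical gradient variance shrinks."""
    rng = rng or np.random.default_rng(0)
    clipped = np.stack([
        g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
        for g in per_sample_grads
    ])
    mean_grad = clipped.mean(axis=0)
    # Empirical gradient variance across the batch, averaged over coordinates.
    grad_var = float(clipped.var(axis=0).mean())
    # Assumed scaling rule: more noise when variance vanishes, which both
    # hides the (now highly predictable) update and re-injects exploration.
    boost = min(max_boost, 1.0 / np.sqrt(grad_var + var_floor))
    noise_scale = sigma_base * clip_norm / len(clipped) * boost
    noisy = mean_grad + rng.normal(0.0, noise_scale, size=mean_grad.shape)
    return noisy, noise_scale
```

With identical per-sample gradients (zero variance) the boost saturates at `max_boost`, so the injected noise is strictly larger than for a high-variance batch, matching the vanishing-gradient-variance motivation.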

📝 Abstract
With Quantum Neural Networks (QNNs) integrated as local models, Quantum Federated Learning (QFL) faces notable challenges. First, exploration is hindered around sharp minima, degrading learning performance. Second, steady gradient descent produces stable, predictable model transmissions over wireless channels, making the model more susceptible to attacks from adversarial entities. Additionally, because training relies on quantum gates and circuits, the local QFL model is vulnerable to noise produced by the quantum device's intermediate noise states. This local noise becomes intertwined with the learning parameters during training, impairing model precision and convergence rate. To address these issues, we propose a new QFL technique that incorporates differential privacy and introduces a dedicated noise estimation strategy to quantify and mitigate the impact of intermediate quantum noise. Furthermore, we design an adaptive noise generation scheme that alleviates the privacy threats associated with the vanishing gradient variance of QNNs and enhances robustness against device noise. Experimental results demonstrate that our algorithm effectively balances convergence, reduces communication costs, and mitigates the adverse effects of intermediate quantum noise while maintaining strong privacy protection. On real-world datasets, we achieved test accuracy of up to 98.47% on MNIST and 83.85% on CIFAR-10 while maintaining fast execution times.
Problem

Research questions and friction points this paper is trying to address.

Addressing sharp minima hindrance in quantum federated learning
Mitigating vulnerability to adversarial attacks from stable gradients
Reducing quantum device noise impact on model precision
Innovation

Methods, ideas, or system contributions that make the work stand out.

Incorporates differential privacy for enhanced security
Introduces noise estimation to mitigate quantum noise
Uses adaptive noise generation to improve robustness
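The first innovation, secure aggregation under $(\varepsilon,\delta)$-differential privacy, is typically achieved via the classical Gaussian mechanism. As a reference point (this calibration is the standard textbook result, not necessarily the paper's exact choice), the noise standard deviation for $\varepsilon \in (0,1)$ is $\sigma \ge \sqrt{2\ln(1.25/\delta)}\,\Delta_2/\varepsilon$:

```python
import math

def gaussian_sigma(epsilon, delta, sensitivity):
    """Standard Gaussian-mechanism calibration for (epsilon, delta)-DP,
    valid for epsilon in (0, 1): sigma = sqrt(2 ln(1.25/delta)) * Delta2 / epsilon."""
    if not 0.0 < epsilon < 1.0:
        raise ValueError("this closed form assumes epsilon in (0, 1)")
    return math.sqrt(2.0 * math.log(1.25 / delta)) * sensitivity / epsilon
```

Tighter privacy (smaller $\varepsilon$ or $\delta$) monotonically increases the required noise, which is the trade-off the adaptive scheme above navigates against convergence speed.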
Duc-Thien Phan
College of Computer Science and Electronic Engineering, Hunan University, Hunan 410082, China
Minh-Duong Nguyen
College of Engineering and Computer Science, VinUniversity
Federated Learning, Continual Learning, MTL, Domain Generalization, Information Theory
Quoc-Viet Pham
School of Computer Science and Statistics, Trinity College Dublin, Dublin 2, D02 PN40, Ireland
Huilong Pi
College of Computer Science and Electronic Engineering, Hunan University, Hunan 410082, China