FedDTG: Federated Data-Free Knowledge Distillation via Three-Player Generative Adversarial Networks

📅 2022-01-10
🏛️ arXiv.org
📈 Citations: 15
Influential: 1
🤖 AI Summary
To address the challenge in federated learning where clients refuse to share model parameters due to privacy concerns—and where existing knowledge distillation methods rely on inaccessible proxy datasets—this paper proposes a parameter- and data-free federated mutual distillation framework. The method introduces a distributed three-player GAN architecture that enables implicit collaborative generation and cross-client knowledge alignment, achieving bidirectional knowledge transfer without any access to raw or proxy data. By integrating data-free knowledge distillation with federated cooperative optimization, the framework enhances global model generalization and robustness while preserving client-level personalization. Extensive experiments on standard vision benchmarks demonstrate that the proposed approach consistently outperforms state-of-the-art federated distillation methods across multiple metrics.
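To make the three-player architecture concrete, here is a minimal sketch assuming conditional MLP models in PyTorch; the class names (Generator, Discriminator, Classifier), layer sizes, and label-conditioning scheme are illustrative assumptions, not the paper's exact design.

```python
# Minimal sketch of the three players; shapes and conditioning are
# illustrative assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn

LATENT_DIM, NUM_CLASSES, IMG_DIM = 100, 10, 28 * 28

class Generator(nn.Module):
    """Player 1: maps noise (plus a class label) to a fake sample."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + NUM_CLASSES, 256), nn.ReLU(),
            nn.Linear(256, IMG_DIM), nn.Tanh(),
        )
    def forward(self, z, y):
        y_onehot = nn.functional.one_hot(y, NUM_CLASSES).float()
        return self.net(torch.cat([z, y_onehot], dim=1))

class Discriminator(nn.Module):
    """Player 2: scores samples as real or fake."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(IMG_DIM, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1),
        )
    def forward(self, x):
        return self.net(x)

class Classifier(nn.Module):
    """Player 3: the task model that is mutually distilled across clients."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(IMG_DIM, 256), nn.ReLU(),
            nn.Linear(256, NUM_CLASSES),
        )
    def forward(self, x):
        return self.net(x)
```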
📝 Abstract
While existing federated learning approaches primarily focus on aggregating local models to construct a global model, in realistic settings some clients may be reluctant to share their private models due to the inclusion of privacy-sensitive information. Knowledge distillation, which can extract model knowledge without accessing model parameters, is well-suited to this federated scenario. However, most distillation methods in federated learning (federated distillation) require a proxy dataset, which is difficult to obtain in the real world. Therefore, in this paper, we introduce a distributed three-player Generative Adversarial Network (GAN) to implement data-free mutual distillation and propose an effective method called FedDTG. We confirmed that the fake samples generated by the GAN can make federated distillation more efficient and robust. Additionally, the distillation process between clients can deliver good individual client performance while simultaneously acquiring global knowledge and protecting data privacy. Our extensive experiments on benchmark vision datasets demonstrate that our method outperforms other federated distillation algorithms in terms of generalization.
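The abstract's "mutual distillation" can be read as each client matching its predictions to its peers' predictions on GAN-generated samples. Below is a hedged sketch of such a loss; the leave-one-out peer-logit averaging and the softening temperature T are common federated-distillation choices assumed here, not details stated in the abstract.

```python
# Sketch of a data-free mutual distillation loss on synthetic samples.
# Peer-logit averaging and the temperature T are assumptions, not
# details confirmed by the abstract.
import torch
import torch.nn.functional as F

def mutual_distillation_loss(local_logits, peer_logits_list, T=2.0):
    """KL(local || peer ensemble), computed on GAN-generated samples."""
    # Soften and average the peers' predictions into one ensemble target.
    peer_probs = torch.stack(
        [F.softmax(p / T, dim=1) for p in peer_logits_list]
    ).mean(dim=0)
    local_log_probs = F.log_softmax(local_logits / T, dim=1)
    # T^2 rescales gradients to the usual distillation magnitude.
    return F.kl_div(local_log_probs, peer_probs, reduction="batchmean") * (T * T)
```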
Problem

Research questions and friction points this paper is trying to address.

How can clients participate in federated learning without sharing private model parameters?
How can knowledge distillation be made data-free when no proxy dataset is available?
How can global model performance be improved while still preserving data privacy?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated data-free knowledge distillation with no proxy dataset
Three-player GAN enabling mutual distillation between clients
GAN-generated fake samples as the transfer set for robust distillation (see the round sketch after this list)
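Tying the pieces together, the following hypothetical round shows how the sketches above could interact: only generator outputs and classifier logits cross client boundaries, consistent with the abstract's claim that neither raw data nor private model parameters are shared. The orchestration (a shared noise batch, leave-one-out peer ensembles) is an assumption for illustration, not a verified reproduction of FedDTG.

```python
# Hypothetical federated round reusing Generator/Classifier and
# mutual_distillation_loss from the sketches above. Only synthetic
# samples and logits are exchanged; classifier weights stay local.
def federated_distillation_round(generator, classifiers, optimizers, batch=64):
    z = torch.randn(batch, LATENT_DIM)
    y = torch.randint(0, NUM_CLASSES, (batch,))
    fake = generator(z, y).detach()          # shared synthetic transfer set
    with torch.no_grad():
        logits = [clf(fake) for clf in classifiers]  # clients publish logits only
    for i, (clf, opt) in enumerate(zip(classifiers, optimizers)):
        peers = [l for j, l in enumerate(logits) if j != i]
        loss = mutual_distillation_loss(clf(fake), peers)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

A driver would construct one Classifier and optimizer per client and call this once per communication round; the adversarial updates to the generator and discriminator (omitted here) would interleave with these distillation steps.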
👤 Zhenyuan Zhang (Zhejiang Rural Commercial United Bank Co., Ltd., Hangzhou, China)