Efficient Federated Learning with Encrypted Data Sharing for Data-Heterogeneous Edge Devices

📅 2025-06-25
📈 Citations: 0
Influential citations: 0
🤖 AI Summary
To address high training latency, slow convergence, and degraded model performance in edge federated learning—caused by dynamic network topologies, physical distances among devices, and severe data heterogeneity—this paper proposes FedEDS, a novel framework. FedEDS introduces a lightweight, client-side data encryptor dynamically generated from stochastic layers of local models, enabling secure, low-overhead cross-device encrypted data sharing. This mechanism mitigates data heterogeneity while enhancing local training efficacy. Integrating edge-distributed architecture, federated collaborative optimization, and encrypted data augmentation, FedEDS improves model consistency without increasing communication overhead. Extensive experiments on multiple heterogeneous edge datasets demonstrate that FedEDS accelerates convergence by 37.2% on average and boosts final test accuracy by 2.1–5.8 percentage points over state-of-the-art federated learning baselines. The framework is particularly suited for latency-sensitive edge intelligence applications.
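The summary's "federated collaborative optimization" layer presumably sits on top of standard server-side parameter aggregation. As a point of reference, here is a minimal FedAvg-style sketch in PyTorch; this is an assumption for illustration, not the paper's aggregation rule, and the encrypted-sharing step it builds on is sketched after the abstract below.

```python
import copy
import torch
import torch.nn as nn

def fedavg(states):
    """Plain FedAvg parameter averaging (an assumption: the paper's
    server-side aggregation may differ in detail)."""
    avg = copy.deepcopy(states[0])
    for k in avg:
        avg[k] = torch.stack([s[k].float() for s in states]).mean(dim=0)
    return avg

# One illustrative round: each client trains locally (omitted here),
# then the server averages their weights and broadcasts the result.
clients = [nn.Linear(16, 10) for _ in range(3)]
global_state = fedavg([c.state_dict() for c in clients])
for c in clients:
    c.load_state_dict(global_state)
```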

📝 Abstract
As privacy protection gains increasing importance, more models are trained on edge devices and subsequently merged into a central server through Federated Learning (FL). However, current research overlooks the impact of network topology, physical distance, and data heterogeneity among edge devices, leading to issues such as increased latency and degraded model performance. To address these issues, we propose a new federated learning scheme for edge devices called Federated Learning with Encrypted Data Sharing (FedEDS). FedEDS uses the client model and the model's stochastic layer to train a data encryptor. The data encryptor generates encrypted data and shares it with other clients. Each receiving client then uses the sending client's stochastic layer and the encrypted data to train and adjust its local model. In this way, FedEDS trains on both a client's local private data and the encrypted data shared by other clients. This approach accelerates the convergence of federated learning and mitigates the negative impact of data heterogeneity, making it suitable for application services on edge devices that require rapid convergence. Experimental results show the efficacy of FedEDS in improving model performance.
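The abstract describes a three-step loop: train a data encryptor against the local model's stochastic layer, share the resulting encrypted batches, and let the receiving client train using the matching stochastic layer. Below is a minimal PyTorch sketch of how the sharing side of that loop could look; the class and function names (`StochasticLayer`, `Encryptor`, `make_shared_batch`), the noise-based stochasticity, and all dimensions are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class StochasticLayer(nn.Module):
    """Noise-injecting layer (assumed form): identical inputs map to
    randomized outputs, which is what makes shared features hard to invert."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        return torch.relu(self.proj(x) + 0.1 * torch.randn_like(x))

class Encryptor(nn.Module):
    """Lightweight client-side encoder, trained jointly with the local model."""
    def __init__(self, in_dim, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, dim), nn.ReLU())

    def forward(self, x):
        return self.net(x)

def make_shared_batch(encryptor, stochastic_layer, x, y):
    """Encrypt a private batch for sharing; raw features never leave the client."""
    with torch.no_grad():
        return stochastic_layer(encryptor(x)), y

# Usage: client A encrypts a batch and sends (z_shared, y_shared) to a peer,
# which trains on the encrypted features instead of the raw data.
enc, sto = Encryptor(in_dim=32, dim=16), StochasticLayer(16)
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
z_shared, y_shared = make_shared_batch(enc, sto, x, y)
```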
Problem

Research questions and friction points this paper is trying to address.

Addresses latency and performance issues in federated learning
Mitigates data heterogeneity impact on edge devices
Enhances convergence speed with encrypted data sharing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Encrypted data sharing among edge devices
Client model trains data encryptor
Combines private and encrypted shared data during local training (see the sketch below)
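The last point above reduces to a joint local objective over both data sources. Here is a hedged sketch of one such local update, assuming shared batches arrive as (encrypted feature, label) pairs that the classifier head can consume directly; the `alpha` weighting and all names are assumptions rather than the authors' exact method.

```python
import torch
import torch.nn as nn

def local_step(backbone, head, opt, private_batch, shared_batch, alpha=0.5):
    """One local update over private data plus encrypted data from a peer.
    The alpha weighting is an assumption; the paper may balance terms differently."""
    loss_fn = nn.CrossEntropyLoss()
    x, y = private_batch    # raw local samples, never shared
    z, yz = shared_batch    # encrypted features received from a peer
    loss = loss_fn(head(backbone(x)), y) + alpha * loss_fn(head(z), yz)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Usage with dummy shapes (hypothetical stand-ins for real batches):
backbone = nn.Sequential(nn.Linear(32, 16), nn.ReLU())
head = nn.Linear(16, 10)
opt = torch.optim.SGD(list(backbone.parameters()) + list(head.parameters()), lr=0.01)
private = (torch.randn(8, 32), torch.randint(0, 10, (8,)))
shared = (torch.randn(8, 16), torch.randint(0, 10, (8,)))  # a peer's encrypted batch
local_step(backbone, head, opt, private, shared)
```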
🔎 Similar Papers
No similar papers found.
Hangyu Li
College of Intelligence and Computing, Tianjin University, Tianjin, China
Hongyue Wu
College of Intelligence and Computing, Tianjin University, Tianjin, China; State Key Lab. for Novel Software Technology, Nanjing University, Nanjing, China; Yunnan Key Lab. of Service Computing, Yunnan University of Finance and Economics, Kunming, China
Guodong Fan
Tianjin University
Service Computing · Software Engineering · Large Language Models · Combinatorial Optimization
Zhen Zhang
College of Intelligence and Computing, Tianjin University, Tianjin, China
Shizhan Chen
Associate Professor of Computer Science, Tianjin University
Service-oriented computing · Social networking · Service-oriented architecture
Zhiyong Feng
College of Intelligence and Computing, Tianjin University, Tianjin, China