Tackling Resource Constraints and Data Heterogeneity in Federated Learning with Double-Weight Sparse Packing

๐Ÿ“… 2026-01-05
๐Ÿ›๏ธ arXiv.org
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This work addresses the challenge in federated learning where data heterogeneity across edge clients conflicts with stringent communication and computational constraints. To this end, the authors propose FedCSPACK, a novel method that integrates cosine similarityโ€“driven sparse parameter packing, a mask alignment mechanism, and a dual-weight aggregation strategy based on directional and distributional distances. This approach simultaneously reduces communication and computation overhead while enhancing model adaptability to heterogeneous data. Experimental results demonstrate that FedCSPACK outperforms ten state-of-the-art methods across four benchmark datasets, achieving high accuracy without compromising resource efficiency or model robustness.
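To make the client-side step concrete, here is a minimal sketch of cosine similarity–driven sparse parameter packing as the summary describes it: the update vector is split into fixed-size packages, each package is scored by cosine similarity against the matching slice of the global update, and only the top-scoring fraction is kept, together with the binary mask the server needs to align the sparse update. All names (`pack_and_select`, `pack_size`, `share_ratio`) and the exact scoring target are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two flat vectors; 0 if either is zero."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def pack_and_select(local_update, global_update, pack_size, share_ratio):
    """Split a flat parameter update into fixed-size packages, score each
    package by cosine similarity against the matching slice of the global
    update, and keep only the top-scoring fraction for transmission."""
    n = len(local_update)
    n_packs = -(-n // pack_size)  # ceiling division
    scored = []
    for i in range(n_packs):
        s = slice(i * pack_size, min((i + 1) * pack_size, n))
        scored.append((cosine_sim(local_update[s], global_update[s]), i))
    k = max(1, int(share_ratio * n_packs))
    selected = sorted(i for _, i in sorted(scored, reverse=True)[:k])
    # Binary mask marking which coordinates are actually shared; the server
    # uses it to align the sparse update with the full parameter vector.
    mask = np.zeros(n, dtype=bool)
    for i in selected:
        mask[i * pack_size : (i + 1) * pack_size] = True
    return selected, mask
```

Only the selected packages and the (cheaply encodable) mask cross the network, which is how the method trades a tunable `share_ratio` of update fidelity for bandwidth.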

๐Ÿ“ Abstract
Federated learning has drawn widespread interest from researchers, yet data heterogeneity across edge clients remains a key challenge, often degrading model performance. Existing methods improve model compatibility with heterogeneous data through model splitting and knowledge distillation. However, they neglect the limited communication bandwidth and computing power on clients, failing to strike an effective balance between addressing data heterogeneity and accommodating constrained client resources. To tackle this limitation, we propose a personalized federated learning method based on cosine-sparsification parameter packing and dual-weighted aggregation (FedCSPACK), which makes effective use of limited client resources and reduces the impact of data heterogeneity on model performance. In FedCSPACK, each client packages its model parameters and, based on cosine similarity, selects the most contributory parameter packages for sharing, effectively reducing bandwidth requirements. The client then generates a mask matrix anchored to the shared parameter packages to improve the alignment and aggregation efficiency of sparse updates on the server. Furthermore, directional and distributional distance weights are embedded in the mask to implement a weight-guided aggregation mechanism, enhancing the robustness and generalization of the global model. Extensive experiments on four datasets against ten state-of-the-art methods demonstrate that FedCSPACK effectively improves communication and computational efficiency while maintaining high model accuracy.
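The server-side counterpart, dual-weighted aggregation of masked sparse updates, can be sketched as follows. The abstract only names "directional and distribution distance weights", so the concrete choices here are placeholders: a directional term from the cosine of each client's update to the mean update, and a distributional term from the inverse L1 gap between the client's label histogram and the uniform distribution. The function name and both distance definitions are assumptions, not the paper's formulation.

```python
import numpy as np

def dual_weight_aggregate(updates, masks, label_dists, eps=1e-8):
    """Aggregate sparse client updates with per-client weights that combine
    a directional term (cosine of the update to the mean update) and a
    distributional term (inverse L1 gap between the client's label
    histogram and uniform). Both distances are illustrative placeholders."""
    updates = np.asarray(updates, dtype=float)
    masks = np.asarray(masks, dtype=float)
    label_dists = np.asarray(label_dists, dtype=float)
    mean_up = updates.mean(axis=0)
    uniform = np.full(label_dists.shape[1], 1.0 / label_dists.shape[1])
    w = []
    for u, p in zip(updates, label_dists):
        direction = (u @ mean_up) / (np.linalg.norm(u) * np.linalg.norm(mean_up) + eps)
        distribution = 1.0 / (np.abs(p - uniform).sum() + eps)
        w.append(max(direction, 0.0) * distribution)
    w = np.asarray(w)
    w /= w.sum() + eps
    # Per-coordinate weighted average over only the clients that actually
    # shared that coordinate, renormalized by their total weight there.
    num = (w[:, None] * masks * updates).sum(axis=0)
    den = (w[:, None] * masks).sum(axis=0) + eps
    return num / den
```

Renormalizing each coordinate by the total weight of contributing clients is what keeps sparsity from shrinking the aggregate toward zero where few clients shared a package.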
Problem

Research questions and friction points this paper is trying to address.

Federated Learning
Data Heterogeneity
Resource Constraints
Communication Efficiency
Client Resources
Innovation

Methods, ideas, or system contributions that make the work stand out.

federated learning
data heterogeneity
cosine sparsification
parameter packing
dual-weighted aggregation
๐Ÿ”Ž Similar Papers
Q
Qiantao Yang
School of Cyber Science and Engineering, Southeast University, China
Liquan Chen
School of Cyber Science and Engineering, Southeast University, China; Purple Mountain Laboratories, China
Mingfu Xue
Full Professor, East China Normal University
AI Security; Intellectual Property Protection for Deep Learning Models; Hardware Security
Songze Li
Professor, Southeast University
AI security and privacy; Blockchain security; Information theory