🤖 AI Summary
This work addresses the challenge in federated learning where data heterogeneity across edge clients conflicts with stringent communication and computational constraints. To this end, the authors propose FedCSPACK, a novel method that integrates cosine similarity-driven sparse parameter packing, a mask alignment mechanism, and a dual-weight aggregation strategy based on directional and distributional distances. This approach simultaneously reduces communication and computation overhead while enhancing model adaptability to heterogeneous data. Experimental results demonstrate that FedCSPACK outperforms ten state-of-the-art methods across four benchmark datasets, achieving high accuracy without compromising resource efficiency or model robustness.
📄 Abstract
Federated learning has drawn widespread interest from researchers, yet data heterogeneity across edge clients remains a key challenge that often degrades model performance. Existing methods improve model compatibility with heterogeneous data through model splitting and knowledge distillation. However, they neglect clients' limited communication bandwidth and computing power, failing to strike an effective balance between addressing data heterogeneity and accommodating constrained client resources. To tackle this limitation, we propose a personalized federated learning method based on cosine sparsification parameter packing and dual-weighted aggregation (FedCSPACK), which makes effective use of limited client resources and reduces the impact of data heterogeneity on model performance. In FedCSPACK, each client packages its model parameters and, based on cosine similarity, selects the most contributing parameter packages for sharing, effectively reducing bandwidth requirements. The client then generates a mask matrix anchored to the shared parameter packages to improve the alignment and aggregation efficiency of sparse updates on the server. Furthermore, directional and distributional distance weights are embedded in the mask to implement a weight-guided aggregation mechanism, enhancing the robustness and generalization performance of the global model. Extensive experiments on four datasets against ten state-of-the-art methods demonstrate that FedCSPACK effectively improves communication and computational efficiency while maintaining high model accuracy.
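The pipeline described above — package-wise sparsification by cosine similarity, a mask for server-side alignment, and dual-weighted aggregation — can be illustrated with a minimal sketch. This is not the authors' implementation: the fixed-size packaging, the choice of the previous global update as the cosine reference, and the specific distributional weight (an inverse L2 distance) are all simplifying assumptions.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two flat vectors; 0 if either is zero."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom > 0 else 0.0

def select_packages(local_update, reference, pack_size, k):
    """Client side: split a flat update into fixed-size packages and keep
    the k packages whose direction best agrees with the reference update.
    Returns the sparsified update and the boolean mask sent to the server."""
    n = len(local_update)
    starts = range(0, n, pack_size)
    scored = sorted(
        starts,
        key=lambda s: cosine_sim(local_update[s:s + pack_size],
                                 reference[s:s + pack_size]),
        reverse=True,
    )
    mask = np.zeros(n, dtype=bool)
    for s in scored[:k]:
        mask[s:s + pack_size] = True
    return local_update * mask, mask

def dual_weight_aggregate(sparse_updates, masks, reference):
    """Server side: aggregate sparse client updates, weighting each client
    by (a) directional agreement with the reference (cosine) and
    (b) distributional closeness (here an inverse L2 distance, as a stand-in
    for the paper's distributional distance)."""
    n = len(reference)
    num, den = np.zeros(n), np.zeros(n)
    for upd, mask in zip(sparse_updates, masks):
        w_dir = max(cosine_sim(upd[mask], reference[mask]), 0.0)
        w_dist = 1.0 / (1.0 + np.linalg.norm(upd[mask] - reference[mask]))
        w = w_dir * w_dist
        num[mask] += w * upd[mask]
        den[mask] += w
    return np.divide(num, den, out=np.zeros(n), where=den > 0)
```

Only the selected packages (plus the compact mask) travel over the network, which is where the bandwidth saving comes from; the mask lets the server scatter each client's sparse contribution back into the full parameter vector before weighting.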