Decentralized Federated Learning by Partial Message Exchange

📅 2026-03-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes PaME, a novel algorithm for decentralized federated learning that addresses the challenges of data heterogeneity, high communication overhead, and the trade-off between privacy and convergence. Under mild assumptions—namely, local Lipschitz continuity of gradients and a doubly stochastic communication matrix—PaME enables partial message updates by exchanging randomly selected sparse coordinates between neighboring nodes. Theoretically, PaME is proven to achieve linear convergence, effectively mitigating the adverse effects of data heterogeneity. Empirical evaluations demonstrate that PaME consistently outperforms state-of-the-art decentralized learning algorithms in terms of communication efficiency, privacy preservation, and model accuracy.
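The core mechanism described above — two neighboring nodes exchanging only a randomly selected sparse subset of coordinates and mixing just those entries — can be sketched as follows. This is an illustrative reconstruction from the summary, not the authors' implementation; the function `partial_exchange`, the mixing weight `w`, and the sparsity level `k` are all hypothetical names and choices.

```python
import numpy as np

def partial_exchange(x_i, x_j, k, rng, w=0.5):
    """Hedged sketch of partial message exchange: two neighbor nodes
    agree on k randomly chosen coordinates, average only those entries
    with mixing weight w, and keep all other coordinates local.
    (Hypothetical helper; not the paper's actual algorithm.)"""
    d = x_i.shape[0]
    idx = rng.choice(d, size=k, replace=False)  # shared sparse coordinate set
    mixed = w * x_i[idx] + (1.0 - w) * x_j[idx]  # mix only the exchanged entries
    y_i, y_j = x_i.copy(), x_j.copy()
    y_i[idx] = mixed
    y_j[idx] = mixed
    return y_i, y_j

# Toy usage: models of two neighbors before and after one exchange round.
rng = np.random.default_rng(0)
ya, yb = partial_exchange(np.ones(10), np.zeros(10), k=3, rng=rng)
```

Only `k` of the `d` coordinates cross the link per round, which is where the claimed communication savings come from; the unexchanged coordinates also remain hidden from the neighbor, which is the intuition behind the privacy argument.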

📝 Abstract
Decentralized federated learning (DFL) has emerged as a transformative server-free paradigm that enables collaborative learning over large-scale heterogeneous networks. However, it continues to face fundamental challenges, including data heterogeneity, restrictive assumptions for theoretical analysis, and degraded convergence when standard communication- or privacy-enhancing techniques are applied. To overcome these drawbacks, this paper develops a novel algorithm, PaME (DFL by Partial Message Exchange). The central principle is to allow only randomly selected sparse coordinates to be exchanged between neighboring nodes. Consequently, PaME achieves substantial reductions in communication costs while still preserving a high level of privacy, without sacrificing accuracy. Moreover, grounded in rigorous analysis, the algorithm is shown to converge at a linear rate under the assumptions that the gradient is locally Lipschitz continuous and the communication matrix is doubly stochastic. These two mild assumptions not only dispense with many restrictive conditions commonly imposed by existing DFL methods but also enable PaME to effectively address data heterogeneity. Furthermore, comprehensive numerical experiments demonstrate its superior performance compared with several representative decentralized learning algorithms.
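The second assumption in the abstract, a doubly stochastic communication matrix, means the mixing weights are nonnegative and every row and column sums to one. A minimal check, with an illustrative mixing matrix built from Metropolis-Hastings weights on a 3-node path graph (the matrix `W` below is an example, not taken from the paper):

```python
import numpy as np

def is_doubly_stochastic(W, tol=1e-9):
    """Check nonnegativity and unit row/column sums (up to tolerance)."""
    return bool((W >= -tol).all()
                and np.allclose(W.sum(axis=0), 1.0, atol=tol)
                and np.allclose(W.sum(axis=1), 1.0, atol=tol))

# Metropolis-Hastings weights on the path graph 0-1-2:
# off-diagonal W[i,j] = 1/(1 + max(deg_i, deg_j)) on edges,
# and the diagonal absorbs the remainder so each row sums to 1.
W = np.array([[2/3, 1/3, 0.0],
              [1/3, 1/3, 1/3],
              [0.0, 1/3, 2/3]])
```

Because Metropolis-Hastings weights are symmetric with unit row sums, the resulting matrix is automatically doubly stochastic, which is one standard way such an assumption is satisfied on undirected networks.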
Problem

Research questions and friction points this paper is trying to address.

Decentralized Federated Learning
Data Heterogeneity
Convergence Degradation
Communication Cost
Privacy Preservation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decentralized Federated Learning
Partial Message Exchange
Communication Efficiency
Data Heterogeneity
Linear Convergence
Shan Sha
School of Mathematics and Statistics, Beijing Jiaotong University, Beijing, China
Shenglong Zhou
University of Science and Technology of China
Computer Vision, Image Registration, Transfer Learning
Xin Wang
China Agricultural University
Mechatronics, Automation, Sensors, Robotics
Lingchen Kong
School of Mathematics and Statistics, Beijing Jiaotong University, Beijing, China
Geoffrey Ye Li
Department of Electrical and Electronic Engineering, Faculty of Engineering, Imperial College London, London, U.K.