AI Summary
Federated learning (FL) faces the dual challenges of high communication overhead and degraded model performance in large-scale, heterogeneous client settings with non-IID data and models. To address these, we propose FedLoRU, a novel FL framework built on the observation that client gradients span a higher-rank subspace of the Hessian than server gradients. Leveraging this insight, FedLoRU introduces a low-rank update mechanism: clients optimize within constrained low-rank subspaces, inducing implicit regularization, while the server reconstructs the global model via rank-accumulating aggregation. The framework supports multi-level and multi-branch low-rank updates to accommodate system and statistical heterogeneity. We provide theoretical analysis showing that constraining client updates to low-rank subspaces induces an implicit regularization effect. Extensive experiments demonstrate that FedLoRU reduces communication costs by up to 90% while matching the accuracy of full-rank baselines and outperforming state-of-the-art FL methods in robustness.
Abstract
Federated Learning (FL) faces significant challenges related to communication efficiency and heterogeneity. To address these issues, we explore the potential of low-rank updates. Our theoretical analysis reveals that the clients' loss exhibits a higher-rank structure (gradients span a higher-rank subspace of the Hessian) compared to the server's loss. Based on this insight, we hypothesize that constraining client-side optimization to a low-rank subspace could provide an implicit regularization effect. Consequently, we propose FedLoRU, a general low-rank update framework for federated learning. Our framework enforces low-rank client-side updates and accumulates these updates to form a higher-rank model. Additionally, variants of FedLoRU can adapt to environments with statistical and model heterogeneity by employing multiple or hierarchical low-rank updates. Experimental results demonstrate that FedLoRU performs comparably to full-rank algorithms and exhibits robustness to heterogeneous and large numbers of clients.