Learn More by Using Less: Distributed Learning with Energy-Constrained Devices

📅 2024-12-03
🏛️ 2025 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops)
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address high client dropout rates and unstable convergence in federated learning (FL) on battery-constrained devices caused by system heterogeneity, this paper proposes LeanFed, an energy-aware adaptive data scheduling framework. Methodologically, it extends FedAvg with three integrated components: energy-aware client selection, elastic scaling of the local training load, and heterogeneity-robust aggregation. Its key contribution is to model client battery lifetime explicitly as a convergence constraint in FL and to design a dynamic local data sampling ratio control mechanism that jointly optimizes participation rate and energy consumption. Experiments on CIFAR-10 and CIFAR-100 demonstrate that the framework improves test accuracy by up to 8.2% over FedAvg, reduces client dropout by 67%, and maintains stable convergence under severe data heterogeneity and low-battery conditions.

📝 Abstract
Federated Learning (FL) has emerged as a solution for distributed model training across decentralized, privacy-preserving devices, but the differing energy capacities of participating devices (system heterogeneity) constrain real-world implementations. These energy limitations not only reduce model accuracy but also increase dropout rates, impacting convergence in practical FL deployments. In this work, we propose LeanFed, an energy-aware FL framework designed to optimize client selection and training workloads on battery-constrained devices. LeanFed leverages adaptive data usage by dynamically adjusting the fraction of local data each device utilizes during training, thereby maximizing device participation across communication rounds while ensuring devices do not run out of battery during the process. We rigorously evaluate LeanFed against traditional FedAvg on the CIFAR-10 and CIFAR-100 datasets, simulating various levels of data heterogeneity and device participation rates. Results show that LeanFed consistently enhances model accuracy and stability, particularly in settings with high data heterogeneity and limited battery life, by mitigating client dropout and extending device availability. This approach demonstrates the potential of energy-efficient, privacy-preserving FL in real-world, large-scale applications, setting a foundation for robust and sustainable pervasive AI on resource-constrained networks.
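The adaptive data-usage idea described in the abstract can be sketched in a few lines. This is an illustrative reading, not the paper's actual algorithm: the function name, the linear per-sample energy model, and the per-round budgeting rule are all assumptions made for the sketch.

```python
def select_and_scale(clients, rounds_left, cost_per_sample, min_frac=0.05):
    """Pick clients that can survive the round and scale each one's
    local data fraction to its remaining energy budget (hypothetical
    helper; the energy model here is a simple linear assumption)."""
    plan = []
    for c in clients:
        # Spread the remaining battery over the remaining rounds so the
        # device stays available until the end instead of draining early.
        budget = c["battery"] / max(rounds_left, 1)
        # Largest data fraction this round's budget can pay for.
        frac = min(1.0, budget / (cost_per_sample * c["n_samples"]))
        if frac >= min_frac:  # skip clients with too little energy left
            plan.append((c["id"], round(frac, 3)))
    return plan

clients = [
    {"id": 0, "battery": 100.0, "n_samples": 500},  # healthy device
    {"id": 1, "battery": 10.0,  "n_samples": 500},  # low-battery device
]
print(select_and_scale(clients, rounds_left=10, cost_per_sample=0.01))
# → [(0, 1.0), (1, 0.2)]
```

Under this reading, the low-battery client trains on only 20% of its data each round instead of dropping out entirely, which matches the abstract's goal of trading local workload for sustained participation.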
Problem

Research questions and friction points this paper is trying to address.

Optimizing client selection for energy-constrained federated learning devices
Reducing client dropout rates to improve model convergence stability
Adapting local data usage to extend device battery life
Innovation

Methods, ideas, or system contributions that make the work stand out.

Energy-aware client selection for battery-constrained devices
Dynamic local data usage adjustment during training
Mitigating client dropout to enhance model stability
Roberto Pereira
Centre Tecnològic de Telecomunicacions de Catalunya (CTTC / CERCA)
Cristian J. Vaca-Rubio
Centre Tecnològic de Telecomunicacions de Catalunya (CTTC / CERCA)
Luis Blanco
Researcher, CTTC
Wireless communications · B5G/6G · AI · IoT/Satellite communications · Statistical signal processing