FedCCL: Federated Clustered Continual Learning Framework for Privacy-focused Energy Forecasting

📅 2025-04-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address accuracy degradation, adaptation latency, and high coordination overhead in federated learning—caused by heterogeneous data distributions, disparate client computational capabilities, and dynamic client availability—this paper proposes a privacy-preserving training framework tailored for distributed energy forecasting. The method innovatively integrates static clustering-based pretraining with asynchronous FedAvg, establishing a three-tier model topology (global–cluster–local) that enables zero-shot rapid adaptation for new clients and robustness to client disconnections. Crucially, it eliminates runtime dynamic clustering, substantially reducing communication and computational overhead. Evaluated on a Central-European photovoltaic dataset, the framework achieves a mean prediction error of 3.93% ± 0.21%; deploying to new sites incurs only a marginal 0.14-percentage-point performance drop. The approach thus delivers high prediction accuracy, low coordination cost, and strong privacy guarantees.

📝 Abstract
Privacy-preserving distributed model training is crucial for modern machine learning applications, yet existing Federated Learning approaches struggle with heterogeneous data distributions and varying computational capabilities. Traditional solutions either treat all participants uniformly or require costly dynamic clustering during training, leading to reduced efficiency and delayed model specialization. We present FedCCL (Federated Clustered Continual Learning), a framework specifically designed for environments with static organizational characteristics but dynamic client availability. By combining static pre-training clustering with an adapted asynchronous FedAvg algorithm, FedCCL enables new clients to immediately benefit from specialized models without prior exposure to their data distribution, while maintaining reduced coordination overhead and resilience to client disconnections. Our approach implements an asynchronous Federated Learning protocol with a three-tier model topology (global, cluster-specific, and local models) that efficiently manages knowledge sharing across heterogeneous participants. Evaluation using photovoltaic installations across central Europe demonstrates that FedCCL's location-based clustering achieves an energy prediction error of 3.93% (±0.21%), while maintaining data privacy and showing that the framework remains stable for population-independent deployments, with only a 0.14 percentage point degradation in performance for new installations. The results demonstrate that FedCCL offers an effective framework for privacy-preserving distributed learning, maintaining high accuracy and adaptability even with dynamic participant populations.
Problem

Research questions and friction points this paper is trying to address.

Addresses heterogeneous data in federated learning for energy forecasting
Reduces inefficiency in dynamic clustering during model training
Ensures privacy and stability with static organizational characteristics
Innovation

Methods, ideas, or system contributions that make the work stand out.

Static pre-training clustering for immediate model specialization
Asynchronous FedAvg algorithm for reduced coordination overhead
Three-tier model topology for efficient knowledge sharing
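The three-tier idea in the list above can be illustrated with a minimal sketch. This is not the paper's implementation: the cluster names, sample counts, and flat weight vectors below are hypothetical, and the aggregation shown is plain sample-weighted FedAvg applied at two levels (clients into cluster models, cluster models into a global model), with a new client initialized from its cluster's model rather than from scratch.

```python
import numpy as np

def fedavg(weight_list, sample_counts):
    """Sample-weighted average of model weight vectors (standard FedAvg)."""
    total = sum(sample_counts)
    return sum(w * (n / total) for w, n in zip(weight_list, sample_counts))

# Hypothetical setup: 4 clients, each with a flat weight vector and a sample
# count. Clients are assigned to static clusters BEFORE training (e.g. by
# installation location), so no dynamic re-clustering happens at runtime.
rng = np.random.default_rng(0)
client_weights = {c: rng.random(8) for c in ["a", "b", "c", "d"]}
client_samples = {"a": 100, "b": 50, "c": 80, "d": 120}
clusters = {"north": ["a", "b"], "south": ["c", "d"]}

# Tier 2: aggregate each cluster's clients into a cluster-specific model.
cluster_models = {
    name: fedavg([client_weights[c] for c in members],
                 [client_samples[c] for c in members])
    for name, members in clusters.items()
}

# Tier 1: aggregate cluster models into a global model, weighted by the
# total number of samples each cluster represents.
global_model = fedavg(
    list(cluster_models.values()),
    [sum(client_samples[c] for c in m) for m in clusters.values()],
)

# A new client joining the "north" cluster starts from that cluster's model
# without any training on its own data (the zero-shot adaptation the card
# describes); its locally fine-tuned copy would be the third (local) tier.
new_client_init = cluster_models["north"]
```

Because every tier uses the same sample weighting, the global model here equals a direct FedAvg over all clients; the benefit of the intermediate tier is that new or reconnecting clients can adopt a specialized cluster model immediately.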