🤖 AI Summary
In distillation-based federated learning, redundant soft-label transmission incurs high communication overhead and degrades aggregation quality. To address this, we propose SCARLET, an efficient collaborative framework with two components: (1) a synchronized soft-label caching mechanism that reuses historical predictions to minimize repeated transmissions; and (2) enhanced Entropy Reduction Aggregation (Enhanced ERA), which dynamically sharpens aggregated soft labels and can be tuned to improve model consistency under non-IID data. To our knowledge, this is the first approach to couple cross-round soft-label caching with adaptive entropy-reduction aggregation. Experiments on multiple non-IID benchmarks demonstrate up to a 50% reduction in communication cost while achieving higher accuracy than state-of-the-art distillation-based federated learning methods. The implementation is publicly available.
📝 Abstract
Federated Learning (FL) enables collaborative model training across decentralized clients, enhancing privacy by keeping data local. Yet conventional FL, which relies on frequent parameter sharing, suffers from high communication overhead and offers limited support for model heterogeneity. Distillation-based FL approaches address these issues by sharing predictions (soft-labels) instead, but they often involve redundant transmissions across communication rounds, reducing efficiency. We propose SCARLET, a novel framework integrating synchronized soft-label caching and an enhanced Entropy Reduction Aggregation (Enhanced ERA) mechanism. SCARLET minimizes redundant communication by reusing cached soft-labels, achieving up to a 50% reduction in communication cost compared to existing methods while maintaining accuracy. Enhanced ERA can be tuned to adapt to non-IID data variations, ensuring robust aggregation and performance in diverse client scenarios. Experimental evaluations demonstrate that SCARLET consistently outperforms state-of-the-art distillation-based FL methods in terms of accuracy and communication efficiency. The implementation of SCARLET is publicly available at https://github.com/kitsuyaazuma/SCARLET.
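The two ideas in the abstract, reusing cached soft-labels across rounds and sharpening the aggregate to reduce entropy, can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not SCARLET's actual implementation: the names (`SoftLabelCache`, `sharpen`, `aggregate`), the "send `None` when unchanged" convention, and the temperature-based sharpening rule are all hypothetical stand-ins for the paper's synchronization protocol and Enhanced ERA.

```python
import numpy as np

def sharpen(probs, temperature):
    """Reduce the entropy of probability rows by temperature scaling.

    temperature < 1 sharpens the distribution. The exact entropy-reduction
    rule used by Enhanced ERA may differ; this is an illustrative choice.
    """
    p = probs ** (1.0 / temperature)
    return p / p.sum(axis=-1, keepdims=True)

class SoftLabelCache:
    """Server-side cache of each client's last transmitted soft-labels.

    A client that has nothing new to report sends None, and the server
    reuses the cached copy, which is where the communication saving comes from.
    """
    def __init__(self):
        self.cache = {}  # client_id -> soft-label array (n_samples, n_classes)

    def update(self, client_id, soft_labels):
        if soft_labels is not None:  # fresh transmission: overwrite the cache
            self.cache[client_id] = soft_labels
        return self.cache[client_id]  # unchanged: reuse the cached labels

def aggregate(cache, updates, temperature=0.5):
    """Average the (possibly cached) client soft-labels, then sharpen."""
    preds = [cache.update(cid, sl) for cid, sl in updates.items()]
    return sharpen(np.mean(preds, axis=0), temperature)
```

For example, a client whose public-dataset predictions are stable between rounds would transmit `None` instead of the full soft-label tensor, and the sharpened average would still be computed over all clients.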