SyncFed: Time-Aware Federated Learning through Explicit Timestamping and Synchronization

📅 2025-06-11
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Federated learning (FL) in large-scale geographically distributed settings suffers from network latency, clock asynchrony, and inconsistent client update freshness, leading to unstable model convergence and misaligned contribution assessment. To address the lack of explicit quantification of update staleness in existing methods, this paper proposes the first NTP-based explicit temporal semantics framework for FL. It integrates high-precision global timestamps into client updates, formalizes a numerical staleness metric, and introduces a temporal-aware weighted aggregation mechanism—thereby overcoming the temporal blindness inherent in conventional round-driven paradigms. Extensive experiments on a cross-regional distributed FL testbed demonstrate that our approach significantly improves model accuracy and information freshness while ensuring temporal consistency in global model evolution. It consistently outperforms time-agnostic baselines across all evaluated metrics.


📝 Abstract
As Federated Learning (FL) expands to larger and more distributed environments, consistency in training is challenged by network-induced delays, clock asynchrony, and variability in client updates. This combination of factors may contribute to misaligned contributions that undermine model reliability and convergence. Existing methods like staleness-aware aggregation and model versioning address lagging updates heuristically, yet lack mechanisms to quantify staleness, especially in latency-sensitive and cross-regional deployments. In light of these considerations, we introduce *SyncFed*, a time-aware FL framework that employs explicit synchronization and timestamping to establish a common temporal reference across the system. Staleness is quantified numerically based on exchanged timestamps under the Network Time Protocol (NTP), enabling the server to reason about the relative freshness of client updates and apply temporally informed weighting during aggregation. Our empirical evaluation on a geographically distributed testbed shows that, under *SyncFed*, the global model evolves within a stable temporal context, resulting in improved accuracy and information freshness compared to round-based baselines devoid of temporal semantics.
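The mechanism described above can be sketched in a few lines: each client update carries an NTP-synchronized timestamp, the server computes a numerical staleness from it, and aggregation weights decay with staleness. This is a minimal illustration, not the paper's implementation; the `staleness`, `temporal_weights`, and `aggregate` helpers and the exponential `decay` parameter are hypothetical choices, since the abstract does not specify the exact weighting function.

```python
import math

def staleness(update_ts: float, server_ts: float) -> float:
    """Staleness in seconds between a client update's NTP-synchronized
    timestamp and the server's aggregation time (clamped at zero)."""
    return max(0.0, server_ts - update_ts)

def temporal_weights(timestamps, server_ts, decay=0.1):
    """Normalized weights that decay exponentially with staleness.
    The exponential form is an assumption for illustration."""
    raw = [math.exp(-decay * staleness(ts, server_ts)) for ts in timestamps]
    total = sum(raw)
    return [r / total for r in raw]

def aggregate(client_models, timestamps, server_ts, decay=0.1):
    """Staleness-weighted average of client parameter vectors:
    fresher updates contribute more to the global model."""
    w = temporal_weights(timestamps, server_ts, decay)
    dim = len(client_models[0])
    return [sum(w[i] * client_models[i][j] for i in range(len(client_models)))
            for j in range(dim)]
```

With two clients whose updates are 1 s and 50 s old, the fresher client dominates the aggregate, which is the intended contrast with round-based averaging that ignores wall-clock freshness.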
Problem

Research questions and friction points this paper is trying to address.

Addresses network delays and clock asynchrony in Federated Learning
Quantifies staleness of client updates using timestamping and NTP
Improves model accuracy and freshness through time-aware synchronization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Explicit timestamping for synchronization
Quantifies staleness using NTP timestamps
Temporally informed weighting in aggregation