🤖 AI Summary
To address the challenges of resource-constrained computing on low Earth orbit (LEO) satellites and highly dynamic, frequently interrupted satellite-to-ground links, this paper proposes the first asynchronous space–ground collaborative split federated learning (SFL) framework. The framework combines split learning's (SL) layer-wise model offloading with federated learning's (FL) distributed parameter aggregation. Leveraging the periodicity of satellite passes, it enables asynchronous local training during link outages and transmits only lightweight client-side submodels to ground stations for aggregation upon reconnection. Simulation results driven by real Starlink bandwidth traces show that the proposed method matches the accuracy of standard SL while making productive use of disconnection periods for local training, accelerating overall convergence, and reducing communication overhead, thereby accommodating LEO networks' high mobility, intermittent connectivity, and limited onboard computational capacity.
📝 Abstract
Recently, the rapid development of LEO satellite networks has spurred widespread interest in another problem: data processing on board satellites. However, achieving efficient computation at LEO satellites remains an open problem given the high dynamics of satellite networks and the satellites' constrained computation capability. We propose SFL-LEO, the first distributed learning framework to combine Federated Learning (FL) with Split Learning (SL), accommodating both the high dynamics of LEO satellite networks and the limited onboard computation by exploiting the periodicity of orbital motion. SFL-LEO introduces an asynchronous training strategy that lets satellites continue updating their local models while disconnected from the ground station, substantially enlarging the effective training time and thus improving training performance. Meanwhile, borrowing from the federated learning scheme, the ground station aggregates the client-side submodels and then redistributes them to the LEO satellites. Experimental results driven by satellite-ground bandwidth traces measured on Starlink demonstrate that SFL-LEO achieves accuracy comparable to the conventional SL scheme, because it can keep training locally even during disconnection periods.
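The core mechanism described above, asynchronous local training during link outages followed by ground-station aggregation of client-side submodels, can be sketched in a few lines. The snippet below is a hypothetical illustration, not the authors' implementation: satellites are modeled as flat weight vectors, `local_update` stands in for real SGD steps taken during a disconnection window, and `aggregate` performs a FedAvg-style weighted average at the ground station, weighted by how many local steps each satellite completed before reconnecting.

```python
# Hypothetical sketch of SFL-LEO-style asynchronous aggregation (illustrative
# names and logic; the paper's actual algorithm may differ).
import numpy as np

def local_update(weights, steps, lr=0.01, rng=None):
    """Simulate `steps` asynchronous local SGD updates during a link outage.

    A random vector stands in for the true gradient of the client-side submodel.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    for _ in range(steps):
        grad = rng.normal(size=weights.shape)  # placeholder gradient
        weights = weights - lr * grad
    return weights

def aggregate(submodels, step_counts):
    """Ground-station aggregation: FedAvg over client-side submodels,
    weighted by the number of local steps each satellite performed."""
    total = sum(step_counts)
    return sum(w * (n / total) for w, n in zip(submodels, step_counts))

# Three satellites see different pass gaps, hence different local step budgets.
global_w = np.zeros(4)
steps = [10, 25, 40]
rng = np.random.default_rng(42)
local_models = [local_update(global_w.copy(), s, rng=rng) for s in steps]
global_w = aggregate(local_models, steps)  # new client-side submodel to redistribute
```

Weighting by local step count is one simple way to account for unequal training done during outages; other staleness-aware weightings would slot into `aggregate` the same way.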