🤖 AI Summary
To address the communication latency caused by uploading high-dimensional model parameters in over-the-air federated learning (AirComp-FL), this paper proposes the Segmented Over-The-Air (SegOTA) method. SegOTA partitions the model parameters into segments and combines this with device grouping, so that each device transmits only one parameter segment per communication round. The method jointly optimizes the grouping strategy and the multi-antenna transmit-receive beamforming, deriving closed-form solutions amenable to real-time implementation. The analysis establishes an upper bound on the expected model learning optimality gap, which guides the per-round optimization. Simulation results show that SegOTA accelerates convergence by over 2.1× and reduces end-to-end communication latency by 63% compared with full-model AirComp-FL. Key contributions include: (i) the first incorporation of parameter segmentation into AirComp-FL; (ii) a joint grouping-beamforming optimization framework yielding tractable closed-form solutions; and (iii) a rigorous convergence analysis with quantifiable performance gains.
📝 Abstract
Federated learning (FL) with over-the-air computation efficiently utilizes communication resources, but it can still incur significant latency when each device transmits a large number of model parameters to the server. This paper proposes the Segmented Over-The-Air (SegOTA) method for FL, which reduces latency by partitioning devices into groups and letting each group transmit only one segment of the model parameters in each communication round. Considering a multi-antenna server, we model the SegOTA transmission and reception process to establish an upper bound on the expected model learning optimality gap. We minimize this upper bound by formulating a per-round online optimization of device grouping and joint transmit-receive beamforming, for which we derive efficient closed-form solutions. Simulation results show that the proposed SegOTA substantially outperforms the conventional full-model OTA approach and other common alternatives.
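To make the segment-per-group idea concrete, the following is a minimal sketch of one SegOTA-style communication round. All function names are hypothetical, the grouping here is a simple round-robin placeholder rather than the paper's optimized grouping, and the analog over-the-air aggregation and beamforming are idealized as an exact per-group average.

```python
# Illustrative sketch only: the paper's actual method jointly optimizes
# device grouping and beamforming; here grouping is fixed round-robin and
# over-the-air aggregation is idealized as a noiseless group average.
import numpy as np

def partition_segments(num_params, num_segments):
    """Split parameter indices [0, num_params) into contiguous segments."""
    bounds = np.linspace(0, num_params, num_segments + 1, dtype=int)
    return [list(range(bounds[s], bounds[s + 1])) for s in range(num_segments)]

def group_devices(num_devices, num_groups):
    """Round-robin device grouping (a placeholder for optimized grouping)."""
    return [list(range(g, num_devices, num_groups)) for g in range(num_groups)]

def segota_round(global_model, local_models, segments, groups, rnd):
    """One round: group g uploads only segment (g + rnd) mod S, and the
    server replaces that segment with the transmitting group's average."""
    new_model = global_model.copy()
    num_segments = len(segments)
    for g, devices in enumerate(groups):
        idx = segments[(g + rnd) % num_segments]
        # Ideal AirComp: superposed signals yield the group average exactly.
        new_model[idx] = np.mean([local_models[d][idx] for d in devices], axis=0)
    return new_model
```

Because each group sends a different segment per round and the segment assignment rotates with the round index, every part of the model is eventually updated while each device uploads only a fraction of the parameters per round.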