Momentum-based Accelerated Algorithm for Distributed Optimization under Sector-Bound Nonlinearity

📅 2025-06-28
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses distributed optimization over dynamic directed networks with sector-bounded nonlinear communication and locally nonconvex objective functions. The authors propose a momentum-accelerated consensus algorithm that combines gradient tracking with the heavy-ball method, coupled with a robust communication protocol designed for weight-balanced directed graphs. Theoretically, they establish convergence to first-order stationary points under nonlinear information exchange and time-varying topologies, with a convergence rate faster than standard distributed gradient methods. Key contributions include: (1) the first incorporation of the heavy-ball momentum mechanism into the gradient-tracking framework under nonlinear communication constraints; (2) support for general sector-bounded nonlinear mappings without requiring exact linearization; and (3) strong robustness against link failures and packet losses, guaranteed via perturbation analysis and spectral characterization of the consensus dynamics. Experiments demonstrate the algorithm's efficiency and stability in communication-constrained scenarios.
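The paper's exact update rules are not reproduced here, but the combination the summary describes — nonlinear consensus, gradient tracking, and heavy-ball momentum over a weight-balanced digraph — can be sketched roughly as follows. The saturation map, step sizes, toy quadratic costs, and ring topology are all illustrative assumptions, not the authors' choices:

```python
import numpy as np

def sat(z, c=1.0):
    # Illustrative sign-preserving odd sector-bound link nonlinearity
    # (saturation/clipping); log-scale quantization is another admissible map.
    return np.clip(z, -c, c)

def momentum_gradient_tracking(grads, W, x0=0.0, eta=0.3, alpha=0.05,
                               beta=0.2, nonlin=sat, iters=2000):
    """Hypothetical sketch: heavy-ball momentum + gradient tracking with
    nonlinearly distorted links over a weight-balanced mixing matrix W."""
    n = len(grads)
    x = np.full(n, float(x0))                # agent states
    y = np.array([g(x0) for g in grads])     # gradient trackers
    x_prev = x.copy()
    for _ in range(iters):
        # Nonlinear consensus terms: sum_j W[i,j] * h(z_j - z_i); with W
        # balanced and h odd, these sum to zero across agents.
        dx = (W * nonlin(x[None, :] - x[:, None])).sum(axis=1)
        dy = (W * nonlin(y[None, :] - y[:, None])).sum(axis=1)
        # Heavy-ball momentum term beta*(x - x_prev) accelerates convergence.
        x_new = x + eta * dx - alpha * y + beta * (x - x_prev)
        # Gradient tracking: y preserves the sum of local gradients.
        g_new = np.array([g(xi) for g, xi in zip(grads, x_new)])
        g_old = np.array([g(xi) for g, xi in zip(grads, x)])
        y = y + eta * dy + g_new - g_old
        x_prev, x = x, x_new
    return x

# Toy quadratic costs f_i(x) = (x - b_i)^2 / 2, so the network-wide
# minimizer of sum_i f_i is mean(b).
b = np.array([1.0, 2.0, 3.0, 6.0])
grads = [lambda x, bi=bi: x - bi for bi in b]
# Weight-balanced (here doubly stochastic) ring mixing matrix.
W = np.array([[0.0, 0.25, 0.0, 0.25],
              [0.25, 0.0, 0.25, 0.0],
              [0.0, 0.25, 0.0, 0.25],
              [0.25, 0.0, 0.25, 0.0]])
x_final = momentum_gradient_tracking(grads, W)
```

With these toy costs, all agents reach consensus near the global minimizer `mean(b) = 3.0` despite the saturated links.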

πŸ“ Abstract
Distributed optimization advances centralized machine learning methods by enabling parallel and decentralized learning processes over a network of computing nodes. This work provides an accelerated consensus-based distributed algorithm for locally non-convex optimization using the gradient-tracking technique. The proposed algorithm (i) improves the convergence rate by adding momentum towards the optimal state using the heavy-ball method, and (ii) addresses general sector-bound nonlinearities over the information-sharing network. The link nonlinearity includes any sign-preserving odd sector-bound mapping, for example, the log-scale data quantization or clipping encountered in practical applications. For admissible momentum and gradient-tracking parameters, using perturbation theory and eigen-spectrum analysis, we prove convergence even in the presence of sector-bound nonlinearity and for locally non-convex cost functions. Further, in contrast to most existing weight-stochastic algorithms, we adopt a weight-balanced (WB) network design. This WB design and perturbation-based analysis allow us to handle a dynamic directed network of agents, addressing possible time-varying setups due to link failures or packet drops.
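As a concrete instance of the sector-bound condition in the abstract, a log-scale quantizer rounds |z| to the nearest power of a base ρ ∈ (0, 1) while preserving sign. Such a map h is odd, and the ratio h(z)/z stays inside the sector [√ρ, 1/√ρ], i.e. √ρ·z² ≤ z·h(z) ≤ z²/√ρ. A quick numerical check (the quantizer form and base are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def log_quantize(z, rho=0.5):
    # Sign-preserving odd log-scale quantizer: |z| is rounded to the
    # nearest power of rho (a common model of logarithmic quantization).
    z = np.asarray(z, dtype=float)
    out = np.zeros_like(z)
    nz = z != 0
    e = np.round(np.log(np.abs(z[nz])) / np.log(rho))
    out[nz] = np.sign(z[nz]) * rho ** e
    return out

rho = 0.5
z = np.linspace(-10.0, 10.0, 20001)
z = z[z != 0.0]                       # h(0) = 0; exclude the origin
ratio = log_quantize(z, rho) / z      # equals z*h(z) / z^2
lo, hi = np.sqrt(rho), 1.0 / np.sqrt(rho)
# Sector-bound check: lo*z^2 <= z*h(z) <= hi*z^2 for all tested z.
in_sector = (ratio >= lo - 1e-9) & (ratio <= hi + 1e-9)
```

The same check applies to clipping, whose sector on any bounded interval [−M, M] is [c/M, 1] for saturation level c.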
Problem

Research questions and friction points this paper is trying to address.

Accelerates distributed optimization for non-convex problems
Handles sector-bound nonlinearities in information-sharing networks
Supports dynamic directed networks with weight-balanced design
Innovation

Methods, ideas, or system contributions that make the work stand out.

Momentum-based acceleration for faster convergence
Gradient-tracking handles sector-bound nonlinearities
Weight-balanced design for dynamic directed networks