Optimal Asynchronous Stochastic Nonconvex Optimization under Heavy-Tailed Noise

📅 2026-01-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenges posed by heavy-tailed gradient noise and highly heterogeneous computation times across workers in asynchronous stochastic nonconvex optimization. The authors propose a momentum-based asynchronous normalized stochastic gradient descent algorithm that, under the mild assumption of bounded $p$-th central moments of the gradient noise for $p \in (1,2]$, achieves optimal time complexity in arbitrarily heterogeneous computing environments. Theoretical analysis establishes both convergence and optimality of the proposed method, while numerical experiments further demonstrate its robustness and effectiveness in settings with heavy-tailed noise distributions.

📝 Abstract
This paper considers the problem of asynchronous stochastic nonconvex optimization with heavy-tailed gradient noise and arbitrarily heterogeneous computation times across workers. We propose an asynchronous normalized stochastic gradient descent algorithm with momentum. Our analysis shows that the method achieves the optimal time complexity under the assumption of a bounded $p$-th order central moment of the gradient noise with $p\in(1,2]$. We also provide numerical experiments demonstrating the effectiveness of the proposed method.
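To give a feel for the core update, the sketch below shows a generic (synchronous, single-worker) normalized SGD step with momentum. This is a simplification for illustration only, not the paper's asynchronous algorithm: the function name, hyperparameters, and the toy Student-t noise model are all assumptions. The key idea it illustrates is that normalizing the momentum buffer caps every step at the learning rate, so a single heavy-tailed gradient sample cannot blow up the iterate.

```python
import numpy as np

def normalized_sgd_momentum(grad_fn, x0, lr=0.01, beta=0.9, steps=1000):
    """Illustrative normalized SGD with momentum (not the paper's method).

    grad_fn: returns a stochastic gradient at x (noise may be heavy-tailed).
    The momentum buffer averages noisy gradients; the update uses only the
    *direction* of the momentum, so each step has magnitude at most lr.
    """
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)
    for _ in range(steps):
        g = grad_fn(x)                  # stochastic gradient sample
        m = beta * m + (1 - beta) * g   # momentum smooths the noise
        norm = np.linalg.norm(m)
        if norm > 0:
            x = x - lr * m / norm       # normalized step of fixed size lr
    return x

# Toy usage: minimize f(x) = ||x||^2 / 2 with heavy-tailed gradient noise.
# Student-t noise with df=1.5 has a finite p-th moment only for p < 1.5,
# mimicking the bounded p-th central moment regime with p in (1, 2].
rng = np.random.default_rng(0)
noisy_grad = lambda x: x + rng.standard_t(df=1.5, size=x.shape)
x_star = normalized_sgd_momentum(noisy_grad, x0=np.full(5, 1.0), steps=2000)
```

Note the contrast with plain SGD, where a single Cauchy-like noise spike can move the iterate arbitrarily far; here the normalization bounds per-step movement regardless of the noise magnitude.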
Problem

Research questions and friction points this paper is trying to address.

asynchronous optimization
stochastic nonconvex optimization
heavy-tailed noise
heterogeneous computation times
Innovation

Methods, ideas, or system contributions that make the work stand out.

asynchronous optimization
heavy-tailed noise
normalized SGD
nonconvex optimization
optimal time complexity