VFOG: Variance-Reduced Fast Optimistic Gradient Methods for a Class of Nonmonotone Generalized Equations

📅 2025-08-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the problem of solving generalized equations involving possibly nonmonotone operators in data-driven settings. We propose the first optimistic gradient-based algorithmic framework that integrates Nesterov acceleration with variance reduction. The framework unifies major stochastic estimators, including loopless SVRG, SAGA, and loopless SARAH, and supports both unbiased and biased estimates, as well as mini-batching and control variates. Theoretically, we establish, for the first time under nonmonotonicity, an $\mathcal{O}(1/k^2)$ expected convergence rate on the squared residual norm and prove almost-sure convergence of the iterate sequence to a solution. Moreover, the resulting oracle complexity substantially improves upon existing non-accelerated methods. Numerical experiments demonstrate significant gains in both convergence speed and computational efficiency achieved by the proposed acceleration strategy.

📝 Abstract
We develop a novel optimistic gradient-type algorithmic framework, combining both Nesterov's acceleration and variance-reduction techniques, to solve a class of generalized equations involving possibly nonmonotone operators in data-driven applications. Our framework covers a wide class of stochastic variance-reduced schemes, including mini-batching and both unbiased and biased control-variate estimators. We establish that our method achieves $\mathcal{O}(1/k^2)$ convergence rates in expectation on the squared norm of the residual under Lipschitz continuity and a "co-hypomonotonicity-type" assumption, improving upon non-accelerated counterparts by a factor of $1/k$. We also prove faster $o(1/k^2)$ convergence rates, both in expectation and almost surely. In addition, we show that the sequence of iterates of our method almost surely converges to a solution of the underlying problem. We demonstrate the applicability of our method using general error bound criteria, covering mini-batch stochastic estimators as well as three well-known control-variate estimators (loopless SVRG, SAGA, and loopless SARAH); the latter three variants attain significantly better oracle complexity than existing methods. We validate our framework and theoretical results through two numerical examples. The preliminary results illustrate the promising performance of our accelerated method over its non-accelerated counterparts.
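To make the ingredients named in the abstract concrete, the following is a minimal sketch, not the paper's VFOG method, of a plain optimistic (Popov-style) gradient step driven by a loopless-SVRG control-variate estimator on a toy finite-sum equation $F(x) = \frac{1}{n}\sum_i (A_i x - b_i) = 0$. The problem data, the helper names `F_i` and `F_full`, the step size `eta`, and the snapshot probability `p` are illustrative assumptions; the paper's Nesterov-type acceleration and nonmonotone setting are omitted here.

```python
# Illustrative sketch only (not the paper's VFOG method): a plain optimistic
# gradient step combined with a loopless-SVRG variance-reduced estimator, on a
# toy finite-sum equation F(x) = (1/n) * sum_i (A_i x - b_i) = 0.  Step size
# `eta`, snapshot probability `p`, and the data below are assumptions made for
# illustration; Nesterov-type acceleration is omitted.
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 10                                          # number of summands, dimension
R = rng.standard_normal((n, d, d))
A = np.einsum("nij,nkj->nik", R, R) / d + np.eye(d)    # A_i = R_i R_i^T / d + I (positive definite)
x_star = rng.standard_normal(d)
b = np.einsum("nij,j->ni", A, x_star)                  # chosen so that F(x_star) = 0

def F_i(i, x):                                         # one component of the operator
    return A[i] @ x - b[i]

def F_full(x):                                         # full (expensive) operator
    return np.einsum("nij,j->ni", A, x).mean(axis=0) - b.mean(axis=0)

eta, p, iters = 0.05, 0.1, 3000                        # step size, snapshot probability, iterations
x = np.zeros(d)
w, Fw = x.copy(), F_full(x)                            # snapshot point and full operator there
g_prev = Fw.copy()                                     # previous estimate, used by the optimistic term

for k in range(iters):
    i = rng.integers(n)
    g = F_i(i, x) - F_i(i, w) + Fw                     # loopless-SVRG estimate of F(x)
    x = x - eta * (2.0 * g - g_prev)                   # optimistic (extrapolated) step
    g_prev = g
    if rng.random() < p:                               # occasionally refresh the snapshot
        w, Fw = x.copy(), F_full(x)

print("final residual ||F(x_k)|| =", np.linalg.norm(F_full(x)))
```

With probability `p` the snapshot is refreshed by one full operator evaluation, which is how the loopless-SVRG estimator trades occasional full-batch work for lower variance; the paper's accelerated scheme layers Nesterov-type momentum on top of steps of this kind.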
Problem

Research questions and friction points this paper is trying to address.

Solves nonmonotone generalized equations using optimistic gradient methods
Improves convergence rates of stochastic variance-reduced schemes
Addresses data-driven applications with an accelerated algorithmic framework
Innovation

Methods, ideas, or system contributions that make the work stand out.

Novel optimistic gradient framework with acceleration
Variance-reduction techniques for nonmonotone operators
Integration of mini-batching and control-variate estimators (see the estimator sketch after this list)
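As a complement to the loopless-SVRG sketch above, here is a hedged sketch of a mini-batch SAGA-style control-variate estimator for a finite-sum operator $F(x) = \frac{1}{n}\sum_i F_i(x)$. The function name `saga_estimate`, the stored table `table`, its running average `table_avg`, and the batch handling are illustrative assumptions, not the paper's exact construction.

```python
# Hedged sketch (not the paper's exact construction): a mini-batch SAGA-style
# control-variate estimator for F(x) = (1/n) * sum_i F_i(x).  `table` stores the
# most recent evaluation of each component and `table_avg` its running average.
import numpy as np

def saga_estimate(x, batch, F_i, table, table_avg, n):
    """Return an unbiased estimate of F(x) and the updated running average.

    `batch` is assumed to hold distinct indices drawn uniformly from {0, ..., n-1};
    `table` (shape (n, d)) is updated in place.
    """
    fresh = np.stack([F_i(i, x) for i in batch])               # new evaluations F_i(x), i in batch
    stale = table[batch]                                       # stored evaluations from past iterates
    g = fresh.mean(axis=0) - stale.mean(axis=0) + table_avg    # control-variate correction
    table_avg = table_avg + (fresh - stale).sum(axis=0) / n    # keep the stored average consistent
    table[batch] = fresh                                       # overwrite the stored rows
    return g, table_avg
```

Because the stored evaluations average to `table_avg` under uniform sampling, the estimate `g` has expectation $F(x)$; swapping this routine for the loopless-SVRG estimate in the earlier sketch changes the memory/recomputation trade-off but not the structure of the optimistic step.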
Quoc Tran-Dinh
Department of Statistics and Operations Research, UNC
convex optimization, nonlinear programming, optimization for machine learning
Nghia Nguyen-Trung
Department of Statistics and Operations Research, The University of North Carolina at Chapel Hill