🤖 AI Summary
This work addresses the Minty variational inequality problem with monotone Lipschitz operators and proposes ADUCA, a cyclic block-coordinate method that requires no prior knowledge of global or block-wise Lipschitz constants. The algorithm performs a single line search at initialization and updates each block using operator evaluations delayed by one full cycle, making it naturally amenable to parallel and distributed implementations. ADUCA is the first cyclic algorithm to combine parameter-free operation, weak synchronization requirements, and a delayed-update mechanism. It attains near-optimal oracle complexity without any hyperparameter tuning: O(1/ε) in the monotone setting and O(log²(1/ε)) under strong monotonicity.
📝 Abstract
Cyclic block coordinate methods are a fundamental class of first-order algorithms, widely used in practice for their simplicity and strong empirical performance. Yet their theoretical behavior remains challenging to explain, and setting their step sizes -- beyond classical coordinate descent for minimization -- typically requires careful tuning or line-search machinery. In this work, we develop $\texttt{ADUCA}$ (Adaptive Delayed-Update Cyclic Algorithm), a cyclic algorithm addressing a broad class of Minty variational inequalities with monotone Lipschitz operators. $\texttt{ADUCA}$ is parameter-free: it requires no global or block-wise Lipschitz constants and performs no line search beyond a single one at initialization. A key feature of the algorithm is its use of operator information delayed by a full cycle, which makes it compatible with parallel and distributed implementations and attractive due to its weakened synchronization requirements across blocks. We prove that $\texttt{ADUCA}$ attains (near) optimal global oracle complexity as a function of the target error $\varepsilon > 0$, scaling with $1/\varepsilon$ for monotone operators, or with $\log^2(1/\varepsilon)$ for strongly monotone operators.
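To illustrate the delayed-update idea in the abstract, the toy Python sketch below runs cyclic block-coordinate updates in which every block in a cycle reuses a single operator evaluation taken at the iterate from the previous full cycle. This is an assumption-laden simplification, not the paper's $\texttt{ADUCA}$: the step size here is a fixed constant, whereas $\texttt{ADUCA}$ sets its step sizes adaptively without Lipschitz constants, and the operator, block partition, and function names are invented for the example.

```python
# Toy sketch (NOT the paper's ADUCA): cyclic block updates for the
# monotone operator F(z) = A z, where each full cycle reuses operator
# values evaluated at the iterate from the PREVIOUS cycle.

def matvec(A, z):
    """Dense matrix-vector product A z."""
    return [sum(a_ij * z_j for a_ij, z_j in zip(row, z)) for row in A]

def cyclic_delayed_updates(A, z0, blocks, step=0.1, epochs=300):
    z = list(z0)
    z_delayed = list(z0)              # iterate from the previous full cycle
    for _ in range(epochs):
        Fz = matvec(A, z_delayed)     # one stale evaluation serves all blocks
        z_delayed = list(z)           # current iterate becomes next cycle's
                                      # delayed evaluation point
        for block in blocks:          # cyclic pass over coordinate blocks
            for i in block:
                z[i] -= step * Fz[i]
    return z

# Toy strongly monotone operator: identity plus a skew-symmetric part.
A = [[ 1.0, 0.5, 0.0, 0.0],
     [-0.5, 1.0, 0.0, 0.0],
     [ 0.0, 0.0, 1.0, 0.3],
     [ 0.0, 0.0, -0.3, 1.0]]
z = cyclic_delayed_updates(A, [1.0, -2.0, 3.0, 0.5], blocks=[[0, 1], [2, 3]])
print(max(abs(v) for v in z))  # tiny residual: iterates approach z = 0
```

Because a single delayed evaluation serves every block within a cycle, the block updates are mutually independent and could be carried out in parallel, which is the weakened synchronization requirement the abstract highlights.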