Order-Optimal Projection-Free Algorithm for Adversarially Constrained Online Convex Optimization

📅 2025-02-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Projection-based methods for constrained online convex optimization (COCO) in high-dimensional adversarial settings suffer from poor scalability due to expensive projection computations. Method: We propose the first projection-free algorithm for COCO, integrating a separation oracle, adaptive online gradient descent, and a Lyapunov-type surrogate function; it jointly optimizes regret and cumulative constraint violation via dynamic step sizes and block-wise updates. Contributions/Results: (i) First projection-free guarantees of $O(\sqrt{T})$ regret and $O(\sqrt{T}\log T)$ cumulative constraint violation for general convex losses, or $O(\log T)$ regret and $O(\sqrt{T\log T})$ violation for strongly convex losses; (ii) explicit trade-off between oracle query complexity ($\tilde{O}(T)$) and performance; (iii) closure of the theoretical gap between projection-free and projection-based approaches.

📝 Abstract
Projection-based algorithms for constrained Online Convex Optimization (COCO) face scalability challenges in high-dimensional settings due to the computational complexity of projecting iterates onto constraint sets. This paper introduces a projection-free algorithm for COCO that achieves state-of-the-art performance guarantees while eliminating the need for projections. By integrating a separation oracle with adaptive Online Gradient Descent (OGD) and employing a Lyapunov-driven surrogate function, while dynamically adjusting step sizes using gradient norms, our method jointly optimizes the regret and cumulative constraint violation (CCV). We also use a blocked version of OGD that trades off the regret and CCV against the number of calls to the separation oracle. For convex cost functions, our algorithm attains an optimal regret of $\mathcal{O}(\sqrt{T})$ and a CCV of $\mathcal{O}(\sqrt{T}\log T)$, matching the best-known projection-based results, while using only $\tilde{\mathcal{O}}(T)$ calls to the separation oracle. The results also demonstrate a tradeoff in which fewer calls to the separation oracle increase both the regret and the CCV. In the strongly convex setting, we further achieve a regret of $\mathcal{O}(\log T)$ and a CCV of $\mathcal{O}(\sqrt{T\log T})$, while requiring $\mathcal{O}(T^2)$ calls to the separation oracle; we also study the corresponding tradeoff as the number of oracle calls decreases. These results close the gap between projection-free and projection-based approaches, demonstrating that projection-free methods can achieve performance comparable to their projection-based counterparts.
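To make the core idea concrete, here is a minimal sketch of the projection-free OGD loop described in the abstract: instead of projecting an infeasible iterate back onto the constraint set, the algorithm queries a separation oracle and takes a correction step along the returned hyperplane normal. The ball-shaped feasible set, the quadratic loss in the usage example, and the simple $1/\sqrt{t}$ step sizes are illustrative assumptions; the paper's actual method uses adaptive gradient-norm step sizes, a Lyapunov-driven surrogate, and block-wise updates, none of which are reproduced here.

```python
import numpy as np


def separation_oracle(x, radius=1.0):
    """Separation oracle for a Euclidean ball of the given radius
    (an illustrative stand-in for an arbitrary convex feasible set).
    Returns None if x is feasible; otherwise the unit outward normal
    of a hyperplane separating x from the set."""
    norm = np.linalg.norm(x)
    if norm <= radius:
        return None
    return x / norm


def projection_free_ogd(grad, dim, T, radius=1.0):
    """Sketch of projection-free online gradient descent.

    grad(x, t) returns a (sub)gradient of the round-t loss at x.
    Each round takes an ordinary OGD step; if the iterate leaves the
    feasible set, a correction step along the separating hyperplane's
    normal (rather than an exact projection) pulls it back toward
    feasibility. Uses a standard 1/sqrt(t) step-size schedule, not the
    paper's adaptive rule."""
    x = np.zeros(dim)
    iterates = []
    for t in range(1, T + 1):
        iterates.append(x.copy())
        eta = 1.0 / np.sqrt(t)
        x = x - eta * grad(x, t)          # gradient step, no projection
        s = separation_oracle(x, radius)  # one oracle call per round
        if s is not None:                 # infeasible: step back across
            x = x - eta * s               # the separating hyperplane
    return iterates
```

For a fixed quadratic loss whose minimizer lies outside the ball, the iterates settle near the boundary with a per-round constraint violation that shrinks with the step size, illustrating how the oracle's hyperplane normal substitutes for a projection:

```python
target = np.array([2.0, 0.0])  # hypothetical loss minimizer, infeasible
iters = projection_free_ogd(lambda x, t: 2.0 * (x - target), dim=2, T=200)
```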
Problem

Research questions and friction points this paper is trying to address.

Projection steps make COCO impractical in high-dimensional adversarial settings
Jointly bounding regret and cumulative constraint violation without projections
Keeping the number of separation-oracle calls low
Innovation

Methods, ideas, or system contributions that make the work stand out.

Projection-free algorithm for COCO
Integration of separation oracle with OGD
Lyapunov-driven surrogate function optimization