Accelerated Frank-Wolfe Algorithms: Complementarity Conditions and Sparsity

📅 2025-11-04
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work studies accelerated Frank–Wolfe optimization for smooth convex functions over compact convex sets, focusing on polyhedral and spectral domains (e.g., nuclear-norm balls). It proposes two novel algorithms: (1) a polyhedral algorithm requiring only a linear optimization oracle (LOO), with LOO complexity $O(1/\sqrt{\epsilon})$ that is independent of the ambient dimension and scales instead with the sparsity of the solution; and (2) a hybrid algorithm for matrix domains that integrates sparse projections (e.g., low-rank SVDs) with a complementary slackness analysis, attaining $O(1/\sqrt{\epsilon})$ convergence without full SVDs and with complexity independent of both dimension and rank. Both methods circumvent the dimension-dependent bottlenecks of conventional acceleration schemes while achieving optimal first-order oracle complexity.

📝 Abstract
We develop new accelerated first-order algorithms in the Frank-Wolfe (FW) family for minimizing smooth convex functions over compact convex sets, with a focus on two prominent constraint classes: (1) polytopes and (2) matrix domains given by the spectrahedron and the unit nuclear-norm ball. A key technical ingredient is a complementarity condition that captures solution sparsity -- face dimension for polytopes and rank for matrices. We present two algorithms: (1) a purely linear optimization oracle (LOO) method for polytopes that has optimal worst-case first-order (FO) oracle complexity and, aside from a finite burn-in phase and up to a logarithmic factor, has LOO complexity that scales with $r/\sqrt{\epsilon}$, where $\epsilon$ is the target accuracy and $r$ is the solution sparsity (independently of the ambient dimension), and (2) a hybrid scheme that combines FW with a sparse projection oracle (e.g., low-rank SVDs for matrix domains with low-rank solutions), which also has optimal FO oracle complexity and, after a finite burn-in phase, requires only $O(1/\sqrt{\epsilon})$ sparse projections and LOO calls (independently of both the ambient dimension and the rank of optimal solutions). Our results close a gap concerning how to accelerate recent advances in linearly convergent FW algorithms for strongly convex optimization without paying the price of the dimension.
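For context, the classical (unaccelerated) Frank-Wolfe template that the paper builds on replaces projections with a single LOO call per iteration. Below is a minimal sketch over the probability simplex, a polytope whose LOO is just an argmin over gradient coordinates; this illustrates the standard FW iteration and its sparsity behavior (iterates are convex combinations of few vertices), not the paper's accelerated algorithms. The problem instance and all names are illustrative.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=200):
    """Classical Frank-Wolfe over the probability simplex.

    Each step calls the linear optimization oracle (LOO): minimize
    <grad(x), s> over the feasible set. For the simplex the minimizer
    is a standard basis vector, so after t steps the iterate has at
    most t + 1 nonzero entries -- the sparsity the paper exploits.
    """
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        # LOO over the simplex: pick the vertex e_i with smallest gradient entry
        i = int(np.argmin(g))
        s = np.zeros_like(x)
        s[i] = 1.0
        gamma = 2.0 / (t + 2)  # standard open-loop step size
        x = (1 - gamma) * x + gamma * s
    return x

# Toy instance: least squares f(x) = 0.5 * ||A x - b||^2 over the simplex,
# with a sparse optimum lying on a low-dimensional face.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[[1, 4]] = [0.6, 0.4]
b = A @ x_true
grad = lambda x: A.T @ (A @ x - b)
x = frank_wolfe_simplex(grad, np.ones(10) / 10, n_iters=2000)
```

This template converges at rate $O(1/\epsilon)$ in LOO calls; the paper's contribution is achieving the accelerated $O(1/\sqrt{\epsilon})$ rate while keeping the per-iteration cost dimension-independent.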
Problem

Research questions and friction points this paper is trying to address.

Minimizing smooth convex functions over compact convex sets efficiently
Developing accelerated Frank-Wolfe algorithms for polytopes and matrix domains
Achieving optimal complexity independent of ambient dimension and solution rank
Innovation

Methods, ideas, or system contributions that make the work stand out.

Accelerated Frank-Wolfe algorithms with complementarity conditions
Linear optimization oracle method scaling with solution sparsity
Hybrid scheme combining Frank-Wolfe with sparse projections