🤖 AI Summary
This paper addresses atomic norm minimization problems, such as ℓ₁-norm minimization, subject to quadratic constraints: a class of nonsmooth convex optimization problems that is incompatible with standard accelerated first-order methods. To overcome this limitation, the authors propose two equivalent reformulations with improved convex regularity: (1) a smooth convex optimization model obtained via Nesterov's smoothing technique, and (2) a strongly convex–strongly concave minimax formulation. Both formulations admit the accelerated O(1/k²) convergence rate, improving on the best previously available O(1/k) rate for this class of constrained linear inverse problems (LIPs). Leveraging these models, the authors design FLIPS, a dedicated solver that exploits the structure of the reformulations. FLIPS is validated on binary selection, compressed sensing, and image denoising, demonstrating consistent empirical gains. An extensible MATLAB toolbox is publicly released.
📝 Abstract
We consider the constrained Linear Inverse Problem (LIP), where a certain atomic norm (like the $\ell_1$ norm) is minimized subject to a quadratic constraint. Typically, such cost functions are non-differentiable, which makes them not amenable to the fast optimization methods that exist in practice. We propose two equivalent reformulations of the constrained LIP with improved convex regularity: (i) a smooth convex minimization problem, and (ii) a strongly convex min-max problem. These reformulations can be solved by applying existing acceleration-based convex optimization methods, which provide a better $O\left(\frac{1}{k^2}\right)$ theoretical convergence guarantee, improving upon the current best rate of $O\left(\frac{1}{k}\right)$. We also provide a novel algorithm named the Fast Linear Inverse Problem Solver (FLIPS), which is tailored to maximally exploit the structure of the reformulations. We demonstrate the performance of FLIPS on the classical problems of Binary Selection, Compressed Sensing, and Image Denoising. We also provide an open-source \texttt{MATLAB} package for these three examples, which can be easily adapted to other LIPs.
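To make the smoothing idea concrete, here is a minimal, self-contained sketch (not the paper's FLIPS algorithm or its MATLAB toolbox) of the smooth reformulation in the simplest denoising case $A = I$: the $\ell_1$ norm is replaced by its Huber (Nesterov-smoothed) surrogate, and an accelerated projected-gradient (FISTA-style) iteration handles the quadratic constraint $\|x - b\|_2 \le \varepsilon$, whose projection is a simple shrink toward $b$. All function names and parameters below are illustrative assumptions.

```python
import numpy as np

def huber_grad(x, mu):
    # Gradient of the mu-smoothed l1 norm (Huber function):
    # quadratic near zero, linear outside, so the gradient is a clipped scaling.
    return np.clip(x / mu, -1.0, 1.0)

def project_ball(x, center, eps):
    # Euclidean projection onto the constraint set {x : ||x - center||_2 <= eps}.
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= eps else center + eps * d / n

def smoothed_lip(b, eps, mu=1e-2, iters=500):
    """Accelerated projected gradient on the mu-smoothed l1 objective,
    subject to ||x - b||_2 <= eps (the A = I denoising instance of the LIP)."""
    x = b.copy()
    y = b.copy()
    t = 1.0
    L = 1.0 / mu  # Lipschitz constant of the smoothed gradient
    for _ in range(iters):
        x_new = project_ball(y - huber_grad(y, mu) / L, b, eps)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

b = np.array([3.0, -0.05, 0.02, 2.0, 0.0])
x = smoothed_lip(b, eps=0.5)
```

The smoothing parameter `mu` trades approximation accuracy against the gradient's Lipschitz constant (and hence the step size); this is the standard tension in Nesterov-smoothing schemes. The solver shrinks small entries of `b` toward zero while staying inside the constraint ball.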