Escaping Saddle Points for Nonsmooth Weakly Convex Functions via Perturbed Proximal Algorithms

📅 2021-02-04
🏛️ arXiv.org
📈 Citations: 6
Influential: 1
🤖 AI Summary
For nonsmooth weakly convex optimization, existing methods struggle to escape strict saddle points—hindering convergence to local minima. Method: This paper proposes a family of perturbed proximal algorithms—including perturbed proximal point, proximal gradient, and proximal linear variants—to address this challenge. Contributions/Results: We establish, for the first time, a verifiable characterization of ε-approximate local minima for nonsmooth weakly convex functions and provide the first theoretical guarantee for escaping strict saddle points. By integrating perturbed optimization, nonsmooth analysis, and saddle-point escape theory, all three algorithms achieve an iteration complexity of O(ε⁻² log d) under standard assumptions to compute an ε-approximate local minimum. This yields the first polynomial-time convergence guarantee for escaping saddle points in nonsmooth optimization.
📝 Abstract
We propose perturbed proximal algorithms that can provably escape strict saddles for nonsmooth weakly convex functions. The main results are based on a novel characterization of ε-approximate local minimum for nonsmooth functions, and recent developments on perturbed gradient methods for escaping saddle points for smooth problems. Specifically, we show that under standard assumptions, the perturbed proximal point, perturbed proximal gradient and perturbed proximal linear algorithms find an ε-approximate local minimum for nonsmooth weakly convex functions in O(ε⁻² log(d)) iterations, where d is the dimension of the problem.
Keywords: Nonsmooth Optimization, Saddle Point, Perturbed Proximal Algorithms
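The abstract describes proximal algorithms augmented with random perturbations to escape strict saddles. Below is a minimal sketch of the perturbed proximal gradient idea for a composite objective g(x) + h(x), where g is smooth and h is nonsmooth with an easy proximal map. The stalling test, parameter names, and perturbation rule here are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def perturbed_proximal_gradient(grad_g, prox_h, x0, step=0.1, tol=1e-5,
                                radius=0.1, max_iter=5000, rng=None):
    """Sketch: proximal gradient steps on g + h, with a random ball
    perturbation whenever the proximal gradient mapping (a stationarity
    measure) is small, so the iterates can leave strict saddle points."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_next = prox_h(x - step * grad_g(x), step)
        if np.linalg.norm(x_next - x) / step < tol:
            # Near-stationary: add a perturbation drawn from a small ball.
            u = rng.standard_normal(x.shape)
            u *= radius * rng.random() ** (1.0 / x.size) / np.linalg.norm(u)
            x_next = x_next + u
        x = x_next
    return x

# Demo: g(x) = 0.25*(x0^2 - 1)^2 + 0.5*x1^2 has a strict saddle at the
# origin (Hessian diag(-1, 1)) and minima near (+-1, 0); h = 1e-3*||x||_1.
# Plain proximal gradient started at the origin stays stuck there.
grad_g = lambda x: np.array([x[0] * (x[0] ** 2 - 1.0), x[1]])
prox_h = lambda v, t: soft_threshold(v, 1e-3 * t)
x_star = perturbed_proximal_gradient(grad_g, prox_h, np.zeros(2),
                                     rng=np.random.default_rng(0))
```

Started exactly at the saddle, the perturbation lets the iterates drift into the negative-curvature direction and converge near one of the two local minima.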
Problem

Research questions and friction points this paper is trying to address.

Escaping saddle points for nonsmooth weakly convex functions
Developing perturbed proximal algorithms for nonsmooth optimization
Achieving ε-approximate local minimum efficiently in high dimensions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Perturbed proximal algorithms for nonsmooth functions
Novel characterization of approximate local minimum
Efficient iteration complexity for escaping saddles