A block-coordinate descent framework for non-convex composite optimization. Application to sparse precision matrix estimation

📅 2026-01-29
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the lack of block-coordinate descent (BCD) methods with convergence guarantees for non-convex composite optimization by proposing a unified BCD framework that provides general convergence assurances for such problems. The framework accommodates a variety of mainstream update strategies, including variable-metric proximal gradient, proximal Newton, and alternating minimization, and naturally encompasses Graphical Lasso-type algorithms such as graphical ISTA, Primal GLasso, and QUIC. Applied to sparse precision matrix estimation, the proposed method matches state-of-the-art estimation accuracy while requiring up to 100-fold fewer iterations, substantially improving computational efficiency.

📝 Abstract
Block-coordinate descent (BCD) is the method of choice for numerous large-scale optimization problems; however, its theoretical study for non-convex optimization has received less attention. In this paper, we present a new block-coordinate descent (BCD) framework to tackle non-convex composite optimization problems, ensuring a decrease of the objective function and convergence to a solution. This framework is general enough to include variable-metric proximal gradient updates, proximal Newton updates, and alternating minimization updates. This generality allows it to encompass three of the most widely used solvers for the sparse precision matrix estimation problem, known as the Graphical Lasso: graphical ISTA, Primal GLasso, and QUIC. We demonstrate the value of this new framework on non-convex sparse precision matrix estimation problems, providing convergence guarantees and up to a $100$-fold reduction in the number of iterations required to reach state-of-the-art estimation quality.
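To make the abstract's building blocks concrete, here is a minimal sketch of graphical ISTA, the proximal-gradient solver named above, for the (convex) Graphical Lasso objective $-\log\det\Theta + \mathrm{tr}(S\Theta) + \lambda\|\Theta\|_{1,\text{off}}$. This is an illustrative baseline only, not the paper's BCD framework: the function names, fixed step size, and crude positive-definiteness safeguard are assumptions for the sake of a runnable example.

```python
import numpy as np

def soft_threshold(X, tau):
    """Elementwise soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def graphical_ista(S, lam, step=0.1, n_iter=200):
    """Illustrative proximal-gradient (ISTA-style) loop for the Graphical Lasso:
        minimize  -log det(Theta) + tr(S @ Theta) + lam * ||offdiag(Theta)||_1
    over symmetric positive-definite Theta. Hypothetical sketch, not the
    paper's algorithm or step-size rule."""
    p = S.shape[0]
    Theta = np.eye(p)  # identity is a safe positive-definite start
    for _ in range(n_iter):
        grad = S - np.linalg.inv(Theta)            # gradient of the smooth part
        Z = Theta - step * grad                    # forward (gradient) step
        Theta_new = soft_threshold(Z, step * lam)  # backward (prox) step
        np.fill_diagonal(Theta_new, np.diag(Z))    # diagonal is left unpenalized
        # Crude safeguard: if the iterate leaves the PD cone, halve the step
        # and retry instead of accepting the update.
        if np.min(np.linalg.eigvalsh(Theta_new)) <= 0:
            step *= 0.5
            continue
        Theta = Theta_new
    return Theta
```

In the paper's framework, such a full-matrix proximal-gradient update is one admissible choice of block update, alongside proximal Newton steps (as in QUIC) and alternating minimization over rows/columns (as in Primal GLasso).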
Problem

Research questions and friction points this paper is trying to address.

non-convex optimization
composite optimization
sparse precision matrix estimation
block-coordinate descent
Innovation

Methods, ideas, or system contributions that make the work stand out.

block-coordinate descent
non-convex composite optimization
sparse precision matrix estimation
convergence guarantee
Graphical Lasso