🤖 AI Summary
This work addresses β-divergence nonnegative matrix factorization (β-NMF) for β ∈ [1, 2], a canonical multi-convex optimization problem. We propose the Block Majorize-Minimize Extrapolation method (BMMe), which integrates block-coordinate updates, an adaptive extrapolation parameter scheme, and a dynamic Bregman divergence design. BMMe establishes, for the first time, the equivalence between block majorize-minimize algorithms and block mirror descent. Moreover, it yields the first multiplicative extrapolation algorithm for β-NMF with theoretical convergence guarantees. We prove subsequential convergence of the generated iterates. Empirical evaluations demonstrate that BMMe achieves 30–50% higher iteration efficiency than state-of-the-art methods on medium-to-large-scale tasks, significantly accelerating convergence.
📝 Abstract
We propose a Block Majorization Minimization method with Extrapolation (BMMe) for solving a class of multi-convex optimization problems. The extrapolation parameters of BMMe are updated using a novel adaptive update rule. By showing that block majorization minimization can be reformulated as a block mirror descent method, with the Bregman divergence adaptively updated at each iteration, we establish subsequential convergence for BMMe. We use this method to design efficient algorithms to tackle nonnegative matrix factorization problems with the $eta$-divergences ($eta$-NMF) for $etain [1,2]$. These algorithms, which are multiplicative updates with extrapolation, benefit from our novel results that offer convergence guarantees. We also empirically illustrate the significant acceleration of BMMe for $eta$-NMF through extensive experiments.