🤖 AI Summary
This work addresses the NP-hard problem of optimizing indicator matrices in machine learning by proposing a differentiable relaxation, the Relaxed Indicator Matrix (RIM) manifold, which is rigorously proven for the first time to be a smooth Riemannian manifold. Methodologically, it generalizes the classical doubly stochastic manifold to the more flexible and computationally efficient RIM manifold, and develops tailored Riemannian gradient updates, retractions, and a fast geodesic-approximation algorithm, reducing the per-step optimization cost from $O(n^3)$ to $O(n)$. The approach is theoretically sound and scales to large problems: it significantly outperforms prior methods on million-variable image denoising and on graph-cut clustering, where RIM-based Ratio Cut achieves the best clustering accuracy across multiple benchmark datasets.
📝 Abstract
The indicator matrix plays an important role in machine learning, but optimizing it is an NP-hard problem. We propose a new relaxation of the indicator matrix and prove that this relaxation forms a manifold, which we call the Relaxed Indicator Matrix manifold (RIM manifold). Based on Riemannian geometry, we develop a Riemannian toolbox for optimization on the RIM manifold. In particular, we provide several retraction methods, including a fast retraction that yields geodesics. We point out that the RIM manifold generalizes the doubly stochastic manifold: existing methods on the doubly stochastic manifold have complexity $\mathcal{O}(n^3)$, while optimization on the RIM manifold is $\mathcal{O}(n)$ and often yields better results. We conducted extensive experiments with millions of variables, including image denoising, to support our conclusions, and applied the RIM manifold to Ratio Cut, achieving clustering results that outperform state-of-the-art methods. Our code is available at [https://github.com/Yuan-Jinghui/Riemannian-Optimization-on-Relaxed-Indicator-Matrix-Manifold](https://github.com/Yuan-Jinghui/Riemannian-Optimization-on-Relaxed-Indicator-Matrix-Manifold).
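To make the idea of relaxing an indicator matrix concrete, here is a minimal generic sketch. It is not the paper's RIM construction (the paper defines its own constraint set and Riemannian tools); it only illustrates the starting point: a one-hot indicator matrix assigning each sample to one cluster, and a simple smoothing that moves the rows into the interior of the probability simplex, which is what makes continuous optimization possible. The names `indicator_matrix`, `relax`, and the mixing parameter `eps` are illustrative choices of ours.

```python
import numpy as np

def indicator_matrix(labels, c):
    """Hard indicator matrix: Y[i, labels[i]] = 1, all other entries 0."""
    n = len(labels)
    Y = np.zeros((n, c))
    Y[np.arange(n), labels] = 1.0
    return Y

def relax(Y, eps=0.05):
    """Blend the one-hot rows with the uniform distribution so each row
    is strictly positive and still sums to 1 (a point on the simplex
    interior). This is a generic relaxation, not the paper's RIM set."""
    c = Y.shape[1]
    return (1.0 - eps) * Y + eps / c

# 4 samples assigned to 3 clusters
Y = indicator_matrix(np.array([0, 1, 1, 2]), 3)
F = relax(Y)
assert np.allclose(F.sum(axis=1), 1.0)  # rows stay on the simplex
assert (F > 0).all()                    # but are no longer binary
```

Optimizing over the binary matrices `Y` directly is the NP-hard problem; the relaxed matrices `F` form a smooth set on which gradient-based (here, Riemannian) methods can run.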