Manifold learning and optimization using tangent space proxies

📅 2025-01-22
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Differential-geometric operators, such as tangent spaces, logarithmic maps, and vector transports, are computationally expensive and difficult to scale to implicit manifolds (e.g., point clouds) in manifold learning. Method: This paper proposes the *atlas graph* representation framework, which constructs a graph structure from overlapping coordinate charts to enable scalable approximations of key Riemannian operators, giving a unified treatment of both parametric manifolds (e.g., Grassmann manifolds) and parameterization-free implicit manifolds. Contribution/Results: The framework enables, for the first time, parameterization-free Riemannian optimization and downstream modeling (e.g., Riemannian SVM). Experiments demonstrate: (i) accelerated first-order optimization on the Grassmann manifold; (ii) geometrically consistent atlas-graph reconstruction from a point cloud of high-contrast image patches; and (iii) significantly improved performance and robustness of Riemannian SVM on high-dimensional, high-noise data.
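The summary above describes covering a manifold-structured point cloud with overlapping coordinate charts and connecting charts that overlap into a graph. The paper's actual construction is not reproduced here; the following is only a minimal illustrative sketch of that idea, in which each chart is a local PCA tangent frame around a sampled center and graph edges record chart overlaps (the function name `build_atlas_graph` and all parameter choices are hypothetical):

```python
import numpy as np

def build_atlas_graph(points, n_charts=8, k=20, dim=2, seed=0):
    """Illustrative sketch only: cover a point cloud with overlapping
    local-PCA charts and link charts whose neighborhoods intersect."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(len(points), size=n_charts, replace=False)
    charts = []
    for c in centers:
        # k nearest neighbors of the chart center (brute-force distances)
        d = np.linalg.norm(points - points[c], axis=1)
        nbrs = np.argsort(d)[:k]
        local = points[nbrs] - points[nbrs].mean(axis=0)
        # top principal directions give an approximate tangent frame
        _, _, vt = np.linalg.svd(local, full_matrices=False)
        charts.append({"center": int(c), "nbrs": set(map(int, nbrs)),
                       "frame": vt[:dim]})
    # edge between charts whose neighborhoods share at least one point;
    # overlaps are where transition maps between charts could be estimated
    edges = [(i, j)
             for i in range(n_charts) for j in range(i + 1, n_charts)
             if charts[i]["nbrs"] & charts[j]["nbrs"]]
    return charts, edges
```

In this toy version the chart count, neighborhood size, and intrinsic dimension are fixed by hand; the paper's contribution is precisely to make such a representation support scalable approximations of Riemannian operators.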

📝 Abstract
We present a framework for efficiently approximating differential-geometric primitives on arbitrary manifolds via construction of an atlas graph representation, which leverages the canonical characterization of a manifold as a finite collection, or atlas, of overlapping coordinate charts. We first show the utility of this framework in a setting where the manifold is expressed in closed form, specifically, a runtime advantage, compared with state-of-the-art approaches, for first-order optimization over the Grassmann manifold. Moreover, using point cloud data for which a complex manifold structure was previously established, i.e., high-contrast image patches, we show that an atlas graph with the correct geometry can be directly learned from the point cloud. Finally, we demonstrate that learning an atlas graph enables key downstream machine learning tasks. In particular, we implement a Riemannian generalization of support vector machines that uses the learned atlas graph to approximate complex differential-geometric primitives, including Riemannian logarithms and vector transports. These settings suggest the potential of this framework for even more complex settings, where ambient dimension and noise levels may be much higher.
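The abstract's Riemannian SVM relies on the Riemannian logarithm, which linearizes data into a tangent space where a Euclidean classifier can operate. The paper approximates such logarithms from the learned atlas graph; as a point of reference only, here is the closed-form logarithm on the unit sphere, one of the few manifolds where it is available exactly (the function name `sphere_log` is my own):

```python
import numpy as np

def sphere_log(base, x):
    """Riemannian logarithm on the unit sphere: returns the tangent
    vector at `base` whose geodesic reaches `x` (closed form)."""
    base = base / np.linalg.norm(base)
    x = x / np.linalg.norm(x)
    cos_t = np.clip(base @ x, -1.0, 1.0)
    v = x - cos_t * base              # component of x tangent to base
    nv = np.linalg.norm(v)
    if nv < 1e-12:                    # x == base (up to tolerance)
        return np.zeros_like(base)
    return np.arccos(cos_t) * v / nv  # rescale to geodesic distance
```

A tangent-space SVM would apply such a map at a base point to all samples and train a linear classifier on the resulting vectors; the atlas-graph framework targets the general case, where no closed-form expression like the one above exists.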
Problem

Research questions and friction points this paper is trying to address.

Manifold Learning
Grassmann Manifolds
Support Vector Machines
Innovation

Methods, ideas, or system contributions that make the work stand out.

Manifold Learning
Grassmann Shapes
Point Cloud Processing
Ryan A. Robinett
Department of Computer Science, University of Chicago, Chicago, IL
Lorenzo Orecchia
University of Chicago, Computer Science
Algorithms · Optimization · Machine Learning
Samantha J. Riesenfeld
Pritzker School of Molecular Engineering, University of Chicago, Chicago, IL; Department of Medicine, University of Chicago, Chicago, IL; Committee on Immunology, University of Chicago, Chicago, IL; Institute for Biophysical Dynamics, University of Chicago, Chicago, IL; CZ Biohub Chicago, LLC, Chicago, Illinois 60642; NSF-Simons National Institute for Theory and Mathematics in Biology, Chicago, IL 60611