🤖 AI Summary
This work addresses the performance degradation of disk-based high-dimensional graph indexing caused by the Euclidean–geodesic mismatch. To mitigate this issue, the authors propose a geometry-aware approximate nearest neighbor search method that integrates local intrinsic dimensionality (LID) into both graph construction and dynamic beam search. This approach adaptively aligns with the underlying data manifold without requiring any preset hyperparameters. The method substantially enhances topological connectivity and retrieval accuracy for high-dimensional data: on GIST1M, it achieves 5.8× higher throughput than DiskANN at 95% recall, and on the billion-scale SIFT1B dataset, it reduces query latency by 3× under high-recall conditions, while maintaining comparable performance on low-dimensional data.
📝 Abstract
Graph-based Approximate Nearest Neighbor (ANN) search often suffers from performance degradation in high-dimensional spaces due to the ``Euclidean--Geodesic mismatch,'' where greedy routing diverges from the underlying data manifold. To address this, we propose Manifold-Consistent Graph Indexing (MCGI), a geometry-aware, disk-resident indexing method that leverages Local Intrinsic Dimensionality (LID) to dynamically adapt search strategies to the data's intrinsic geometry. Unlike standard algorithms that treat all dimensions uniformly, MCGI modulates its beam search budget based on in situ geometric analysis, eliminating dependency on static hyperparameters. Theoretical analysis confirms that MCGI enables improved approximation guarantees by preserving manifold-consistent topological connectivity. Empirically, MCGI achieves 5.8$\times$ higher throughput at 95\% recall on high-dimensional GIST1M compared to state-of-the-art DiskANN. On the billion-scale SIFT1B dataset, MCGI further validates its scalability by reducing high-recall query latency by 3$\times$, while maintaining performance parity on standard lower-dimensional datasets.
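To make the LID-driven adaptation concrete, here is a minimal sketch of the general idea: estimating LID from a point's nearest-neighbor distances (via the well-known Levina–Bickel maximum-likelihood estimator) and using that estimate to scale a beam-search budget. The abstract does not specify MCGI's actual estimator or modulation rule, so both `lid_mle` and the exponential scaling in `adaptive_beam_width` (including `base_width`, `max_width`, and the pivot dimension of 8) are illustrative assumptions, not the authors' method.

```python
import numpy as np

def lid_mle(dists):
    """Levina-Bickel MLE of local intrinsic dimensionality from the
    sorted distances of a query point to its k nearest neighbors
    (dists must be ascending; dists[-1] is the k-th neighbor)."""
    k = len(dists)
    w = dists[-1]
    # Guard against zero distances from duplicate points.
    d = np.maximum(dists[:-1], 1e-12)
    return -(k - 1) / np.sum(np.log(d / w))

def adaptive_beam_width(lid, base_width=8, max_width=256):
    """Hypothetical budget rule: keep the base beam in low-LID regions,
    grow it exponentially once estimated LID exceeds a pivot of 8,
    and cap it at max_width. MCGI's real rule is not given here."""
    return int(min(max_width, base_width * 2 ** max(0.0, lid - 8)))
```

For points sampled along a 1-D curve, `lid_mle` returns a value near 1 and the beam stays at `base_width`; for a high estimated LID the budget saturates at `max_width`, mimicking the "spend more search effort where the manifold is locally high-dimensional" behavior the abstract describes.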