🤖 AI Summary
This work addresses classification in high-dimensional Hilbert metric spaces by proposing the first support vector machine (SVM) algorithm for this setting with polynomial time complexity. By combining linear programming with geometric modeling based on the Hilbert and Funk metrics, the method efficiently solves both the hard-margin and soft-margin SVM problems, as well as nearest-neighbor classification tasks. In contrast to prior approaches, which either lack theoretical runtime guarantees or require exponential time, this algorithm runs in time polynomial in the number of samples, the ambient dimension, and the number of facets of the underlying convex body. This advance substantially improves the scalability and practicality of classification in high-dimensional non-Euclidean spaces.
📝 Abstract
Classifying points in high-dimensional spaces is a fundamental geometric problem in machine learning. In this paper, we address classifying points under the $d$-dimensional Hilbert polygonal metric. The Hilbert metric is a generalization of the Cayley-Klein hyperbolic distance to arbitrary convex bodies and has a diverse range of applications in machine learning and convex geometry. We first present an efficient LP-based algorithm in this metric for the large-margin SVM problem. Our algorithm runs in time polynomial in the number of points, the number of bounding facets, and the dimension. This is a significant improvement over previous works, which either provide no theoretical guarantees on running time or suffer from exponential runtime. We further consider the closely related Funk metric. Finally, we present efficient algorithms for the soft-margin SVM problem and for nearest-neighbor-based classification in the Hilbert metric.
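For intuition about the underlying distance, the Hilbert metric between two interior points $p, q$ of a convex body is defined via the cross-ratio with the two points where the chord through $p$ and $q$ meets the boundary: $d_H(p,q) = \tfrac{1}{2}\ln\frac{\lVert a-q\rVert\,\lVert b-p\rVert}{\lVert a-p\rVert\,\lVert b-q\rVert}$. The sketch below (not the paper's algorithm; `hilbert_distance` is a hypothetical helper) computes this distance inside a polytope $\{x : Ax \le b\}$ by intersecting the chord with each facet, which is why the facet count enters the runtime bounds:

```python
import numpy as np

def hilbert_distance(A, b, p, q, eps=1e-12):
    """Hilbert distance between interior points p, q of the polytope {x : Ax <= b}.

    Parametrize the chord as x(t) = p + t(q - p), so p is at t = 0 and q at
    t = 1. Its two boundary hits at t_min < 0 and t_max > 1 yield the
    cross-ratio formula
        d(p, q) = 1/2 * ln( (1 - t_min) * t_max / ((-t_min) * (t_max - 1)) ).
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    if np.allclose(p, q):
        return 0.0
    d = q - p
    t_min, t_max = -np.inf, np.inf
    for a_i, b_i in zip(A, b):
        slope = a_i @ d
        offset = b_i - a_i @ p           # positive, since p is strictly interior
        if abs(slope) < eps:
            continue                     # facet hyperplane parallel to the chord
        t = offset / slope
        if slope > 0:
            t_max = min(t_max, t)        # constraint bounds t from above
        else:
            t_min = max(t_min, t)        # constraint bounds t from below
    return 0.5 * np.log((1 - t_min) * t_max / ((-t_min) * (t_max - 1)))
```

On the square $[-1,1]^2$ with both points on the $x$-axis, this reduces to the one-dimensional case $d_H(p,q) = |\operatorname{artanh}(q) - \operatorname{artanh}(p)|$, which serves as a quick sanity check.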