🤖 AI Summary
This paper addresses out-of-distribution (OOD) detection for high-dimensional data (e.g., CIFAR-100) under *no distributional assumptions*. It proposes a geometric modeling framework based on *adaptive hyperconic clusters*: hyperconic structures are introduced, for the first time, to quantify the *local angular boundaries* of the in-distribution (ID) support in feature space, without requiring OOD supervision or prior distributional knowledge, and to adapt dynamically to the geometry of the underlying data manifold. The method combines *k-nearest-neighbor angular distance maximization*, *hyperconic space partitioning*, and *silhouette-based boundary estimation* into a compact geometric characterization of ID regions. On the CIFAR-100 benchmark, it achieves state-of-the-art Near-OOD and Far-OOD detection, attaining the lowest FPR@95 and highest AUROC, while being trained *exclusively on ID data*, with *zero OOD samples used during training*.
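The hypercone idea described above can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes L2-normalized feature vectors, uses each ID point as a cone axis, and sets the cone's half-angle to the largest angle among that point's k nearest neighbors (one plain reading of "maximizing the angular distance to neighbors in a data-point's vicinity"). All function names and the scoring rule are hypothetical.

```python
import numpy as np

def build_hypercones(id_feats, k=10):
    """Sketch: one hypercone per ID point (features assumed L2-normalized).
    Axis = the point itself; half-angle = the largest angular distance
    among its k nearest neighbors. Not the paper's exact procedure."""
    sims = id_feats @ id_feats.T          # pairwise cosine similarities
    np.fill_diagonal(sims, -np.inf)       # exclude self-similarity
    topk = np.sort(sims, axis=1)[:, -k:]  # k most similar neighbors per point
    # smallest similarity among the top-k = largest angle among the k neighbors
    half_angles = np.arccos(np.clip(topk[:, 0], -1.0, 1.0))
    return id_feats, half_angles

def ood_score(x, axes, half_angles):
    """Hypothetical score: margin to the nearest hypercone boundary.
    A point inside some hypercone (angle <= half-angle) scores <= 0 (ID-like);
    larger positive values suggest OOD."""
    angles = np.arccos(np.clip(axes @ x, -1.0, 1.0))
    return float(np.min(angles - half_angles))
```

Under this reading, the half-angles adapt to local data density (tight cones in dense regions, wide cones in sparse ones), which is one way the construction could adapt to the data's distribution without any parametric assumption.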
📝 Abstract
Recent advances in out-of-distribution (OOD) detection have placed great emphasis on learning representations better suited to this task. While distance-based approaches exist, awareness of the data's distribution has seldom been exploited for better performance. We present HAC$_k$-OOD, a novel OOD detection method that makes no distributional assumptions about the data but automatically adapts to its distribution. Specifically, HAC$_k$-OOD constructs a set of hypercones by maximizing the angular distance to neighbors in a given data point's vicinity, approximating the contour within which in-distribution (ID) data points lie. Experimental results show state-of-the-art FPR@95 and AUROC performance on both Near-OOD and Far-OOD detection on the challenging CIFAR-100 benchmark, without explicitly training for OOD performance.