📝 Abstract
It is well-known that, in Gaussian two-group separation, the optimally discriminating projection direction can be estimated without any knowledge of the group labels. In this work, we gather several such unsupervised estimators based on skewness and derive their limiting distributions. As one of our main results, we show that all affine equivariant estimators of the optimal direction have proportional asymptotic covariance matrices, making their comparison straightforward. Two of our four estimators are novel and two have been proposed earlier. We use simulations to verify our results and to inspect the finite-sample behavior of the estimators.
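To illustrate the idea behind skewness-based unsupervised discrimination, the following sketch implements one classical projection-pursuit variant: whiten the data, then search for the unit direction whose projection has maximal absolute skewness. This is a minimal illustration of the general approach, not a reproduction of the paper's four estimators; the group proportions, separation, and grid search are assumptions chosen for the demonstration (skewness vanishes for equal group proportions, so an unbalanced mixture is used).

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-group Gaussian mixture in 2D; groups differ in mean along (1, 0).
# Unequal proportions (30/70) give the mixture nonzero skewness.
n = 2000
in_group_two = rng.random(n) < 0.3
X = rng.standard_normal((n, 2))
X[in_group_two] += np.array([4.0, 0.0])   # shift one group along the true direction

# Whiten the data (standardize to identity covariance); by affine
# equivariance, estimation can be done in whitened coordinates.
Xc = X - X.mean(axis=0)
L = np.linalg.cholesky(np.linalg.inv(np.cov(Xc.T)))  # L @ L.T = inverse covariance
Z = Xc @ L

def proj_skewness(u):
    """Third moment of the (approximately standardized) projection Z @ u."""
    p = Z @ u
    return np.mean(p ** 3)

# Projection pursuit over a fine grid of unit directions in 2D.
angles = np.linspace(0.0, np.pi, 1800, endpoint=False)
dirs = np.column_stack([np.cos(angles), np.sin(angles)])
best = dirs[np.argmax([abs(proj_skewness(u)) for u in dirs])]

# Map the whitened-space direction back to the original coordinates.
est = L @ best
est /= np.linalg.norm(est)
true_dir = np.array([1.0, 0.0])
print(abs(est @ true_dir))   # near 1 when the direction is recovered (sign is unidentifiable)
```

In higher dimensions the grid search would be replaced by a proper optimizer over the unit sphere, but the two-dimensional version already shows the key point: labels are never used, yet the skewness-maximizing direction aligns with the optimal discriminant direction.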