Building a stable classifier with the inflated argmax

📅 2024-05-22
🏛️ Neural Information Processing Systems
📈 Citations: 2
Influential: 0
🤖 AI Summary
Multi-class classifiers suffer from poor stability because the discrete argmax decision rule makes them highly sensitive to small perturbations of the training data. Method: the paper proposes a stabilization pipeline combining bagging with an "inflated argmax": resampling-based ensembling first produces stable continuous scores, and a relaxed, set-valued inflated argmax then converts those scores into a set of candidate labels. Contribution/Results: the authors establish a distribution-free stability guarantee for multi-class classification that depends on neither the number of classes nor the dimensionality of the inputs and holds for any base classifier. The inflated argmax serves as a provably stable surrogate for the discontinuous argmax. Experiments on a standard benchmark dataset demonstrate substantial stability improvement without loss of accuracy.

📝 Abstract
We propose a new framework for algorithmic stability in the context of multiclass classification. In practice, classification algorithms often operate by first assigning a continuous score (for instance, an estimated probability) to each possible label, then taking the maximizer -- i.e., selecting the class that has the highest score. A drawback of this type of approach is that it is inherently unstable, meaning that it is very sensitive to slight perturbations of the training data, since taking the maximizer is discontinuous. Motivated by this challenge, we propose a pipeline for constructing stable classifiers from data, using bagging (i.e., resampling and averaging) to produce stable continuous scores, and then using a stable relaxation of argmax, which we call the "inflated argmax," to convert these scores to a set of candidate labels. The resulting stability guarantee places no distributional assumptions on the data, does not depend on the number of classes or dimensionality of the covariates, and holds for any base classifier. Using a common benchmark data set, we demonstrate that the inflated argmax provides necessary protection against unstable classifiers, without loss of accuracy.
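The pipeline described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: the base classifier is a toy nearest-centroid scorer (an assumption made here for self-containedness), and the `inflated_argmax` function is a simplified margin-style relaxation -- return every label whose bagged score comes within `eps` of the maximum -- which conveys the spirit of the paper's set-valued operator rather than its exact definition.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def centroid_scores(X, y, x_new, classes):
    """Toy base scorer: softmax of negative distance to each class centroid."""
    d = np.array([np.linalg.norm(x_new - X[y == c].mean(axis=0)) for c in classes])
    return softmax(-d)

def bagged_scores(X, y, x_new, n_bags=100, seed=0):
    """Average the base scores over bootstrap resamples of the training data."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    total, used = np.zeros(len(classes)), 0
    for _ in range(n_bags):
        idx = rng.integers(0, len(y), size=len(y))  # bootstrap resample
        if len(np.unique(y[idx])) < len(classes):
            continue  # skip resamples that miss a class entirely
        total += centroid_scores(X[idx], y[idx], x_new, classes)
        used += 1
    return total / used

def inflated_argmax(scores, eps=0.05):
    """Set-valued relaxation of argmax: all labels within eps of the top score."""
    return set(int(k) for k in np.flatnonzero(scores >= scores.max() - eps))

# Demo on synthetic 3-class data.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=c, size=(30, 2)) for c in range(3)])
y = np.repeat(np.arange(3), 30)
s = bagged_scores(X, y, X[0])
print(inflated_argmax(s))
```

Because the output is a *set*, a tiny perturbation of the training data can only add or drop borderline labels rather than flip a single hard prediction, which is what makes this rule continuous where plain argmax is not.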
Problem

Research questions and friction points this paper is trying to address.

Addressing instability in multiclass classification algorithms
Proposing stable classifiers via bagging and inflated argmax
Ensuring stability without accuracy loss in classification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bagging for stable continuous scores
Inflated argmax for stable label conversion
No distributional assumptions on data