🤖 AI Summary
This work addresses the challenge of accurately decomposing aleatoric and epistemic uncertainty at the sample level under limited ensemble sizes or distributional shift. The authors propose a differentiable framework that couples decision margins with predictive variance through a signal-to-noise-ratio gating mechanism, making uncertainty estimates sensitive to epistemic effects. Key contributions include the variance-gating mechanism itself (the VGMU score and the VGN normalization layer), closed-form vector-Jacobian products, and per-class learnable normalization of ensemble member probabilities, which together address the limitations of the traditional additive decomposition. The method supports end-to-end training, matches or exceeds existing information-theoretic baselines, and delivers a computationally efficient, scalable approach to epistemic-aware uncertainty estimation.
📝 Abstract
Machine learning applications require fast and reliable per-sample uncertainty estimation. A common approach is to use predictive distributions from Bayesian or approximate inference methods and additively decompose uncertainty into aleatoric (i.e., data-related) and epistemic (i.e., model-related) components. However, additive decomposition has recently been questioned, with evidence that it breaks down under finite-ensemble sampling and/or mismatched predictive distributions. This paper introduces Variance-Gated Ensembles (VGE), an intuitive, differentiable framework that injects epistemic sensitivity via a signal-to-noise gate computed from ensemble statistics. VGE provides: (i) a Variance-Gated Margin Uncertainty (VGMU) score that couples decision margins with ensemble predictive variance; and (ii) a Variance-Gated Normalization (VGN) layer that generalizes the variance-gated uncertainty mechanism to training via per-class, learnable normalization of ensemble member probabilities. We derive closed-form vector-Jacobian products enabling end-to-end training through the ensemble sample mean and variance. VGE matches or exceeds state-of-the-art information-theoretic baselines while remaining computationally efficient. As a result, VGE provides a practical and scalable approach to epistemic-aware uncertainty estimation in ensemble models. An open-source implementation is available at: https://github.com/nextdevai/vge.
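To make the gating idea concrete, here is a minimal sketch of how a signal-to-noise gate could couple a decision margin with ensemble variance. This is an illustration of the general mechanism only, not the paper's actual VGMU definition; the function name, the gate form `1 / (1 + SNR)`, and the blending rule are all assumptions for exposition.

```python
import numpy as np

def variance_gated_margin_uncertainty(member_probs, eps=1e-8):
    """Hypothetical sketch of a variance-gated margin uncertainty score.

    member_probs: array of shape (M, C) holding M ensemble members'
    class-probability vectors. The exact VGMU formula is defined in the
    paper; this only illustrates coupling a decision margin (aleatoric
    signal) with ensemble variance (epistemic signal) via an SNR gate.
    """
    mean = member_probs.mean(axis=0)   # predictive mean per class
    var = member_probs.var(axis=0)     # ensemble variance per class
    top2 = np.sort(mean)[-2:]          # two largest mean probabilities
    margin = top2[1] - top2[0]         # decision margin of the mean prediction
    top = mean.argmax()
    snr = mean[top] / np.sqrt(var[top] + eps)  # signal-to-noise of the top class
    gate = 1.0 / (1.0 + snr)           # low SNR (high variance) -> gate near 1
    # High variance pushes the score toward 1 (epistemic); otherwise the
    # score reduces to margin-based (aleatoric) uncertainty.
    return gate + (1.0 - gate) * (1.0 - margin)
```

Under this toy construction, a confident and consistent ensemble yields a low score, while a disagreeing ensemble (high variance, small margin) yields a score near 1.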