Variance-Gated Ensembles: An Epistemic-Aware Framework for Uncertainty Estimation

📅 2026-02-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of accurately decomposing aleatoric and epistemic uncertainty at the sample level under small ensembles or distributional shift. The authors propose a differentiable framework that couples decision margins with predictive variance through a signal-to-noise-ratio gating mechanism, making the estimate sensitive to epistemic uncertainty. Key innovations include the variance-gating mechanism (comprising the VGMU score and the VGN normalization layer), closed-form computation of vector-Jacobian products, and per-class learnable probabilistic normalization, which together overcome the limitations of the traditional additive decomposition. The method supports end-to-end training, matches or exceeds existing information-theoretic baselines, and delivers a computationally efficient, scalable approach to epistemic-aware uncertainty estimation.

📝 Abstract
Machine learning applications require fast and reliable per-sample uncertainty estimation. A common approach is to use predictive distributions from Bayesian or approximation methods and additively decompose uncertainty into aleatoric (i.e., data-related) and epistemic (i.e., model-related) components. However, additive decomposition has recently been questioned, with evidence that it breaks down when using finite-ensemble sampling and/or mismatched predictive distributions. This paper introduces Variance-Gated Ensembles (VGE), an intuitive, differentiable framework that injects epistemic sensitivity via a signal-to-noise gate computed from ensemble statistics. VGE provides: (i) a Variance-Gated Margin Uncertainty (VGMU) score that couples decision margins with ensemble predictive variance; and (ii) a Variance-Gated Normalization (VGN) layer that generalizes the variance-gated uncertainty mechanism to training via per-class, learnable normalization of ensemble member probabilities. We derive closed-form vector-Jacobian products enabling end-to-end training through ensemble sample mean and variance. VGE matches or exceeds state-of-the-art information-theoretic baselines while remaining computationally efficient. As a result, VGE provides a practical and scalable approach to epistemic-aware uncertainty estimation in ensemble models. An open-source implementation is available at: https://github.com/nextdevai/vge.
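To make the abstract's gating idea concrete, here is a minimal illustrative sketch of a variance-gated margin uncertainty score. The paper's actual VGMU formula is not reproduced on this page, so the function name `vgmu_score`, the top-1-minus-top-2 margin, the per-class SNR, and the final coupling `1 - margin * (1 - gate)` are all assumptions for illustration; only the ingredients (decision margin, ensemble predictive variance, signal-to-noise gate) come from the abstract.

```python
import numpy as np

def vgmu_score(member_probs, eps=1e-8):
    """Illustrative sketch of a variance-gated margin uncertainty score.

    NOTE: hypothetical formula, not the paper's. It only uses the
    ingredients named in the abstract: a decision margin from the
    ensemble-mean probabilities and a signal-to-noise gate from the
    ensemble predictive variance.
    """
    p = np.asarray(member_probs, dtype=float)  # shape (M, C): M members, C classes
    mean_p = p.mean(axis=0)
    var_p = p.var(axis=0)

    # Decision margin: top-1 minus top-2 of the mean prediction.
    top2 = np.sort(mean_p)[-2:]
    margin = top2[1] - top2[0]

    # Signal-to-noise ratio of the predicted class; member disagreement
    # (high variance) drives the SNR down and the gate toward 1.
    pred = int(mean_p.argmax())
    snr = mean_p[pred] / (np.sqrt(var_p[pred]) + eps)
    gate = 1.0 / (1.0 + snr)  # in (0, 1): near 0 when members agree

    # Hypothetical coupling: pure margin uncertainty when the gate is
    # closed, saturating toward maximal uncertainty as variance grows.
    return 1.0 - margin * (1.0 - gate)
```

With this coupling, an ensemble whose members agree on a confident prediction scores low, while a disagreeing ensemble scores high even if the mean prediction happens to have a usable margin.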
Problem

Research questions and friction points this paper is trying to address.

uncertainty estimation
epistemic uncertainty
aleatoric uncertainty
ensemble methods
predictive variance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Variance-Gated Ensembles
Epistemic Uncertainty
Ensemble Learning
Uncertainty Estimation
Differentiable Framework
H. Martin Gillis
Faculty of Computer Science, Dalhousie University, Halifax, NS
Isaac Xu
Faculty of Computer Science, Dalhousie University, Halifax, NS
Thomas Trappenberg
Professor of Computer Science, Dalhousie University
Computational Neuroscience · Machine Learning · Neurocognitive Robotics