Uncertainty Estimation using Variance-Gated Distributions

📅 2025-09-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Accurately quantifying sample-wise uncertainty in neural networks for high-stakes applications remains challenging, particularly when disentangling epistemic from aleatoric uncertainty; existing additive decomposition methods often fail to do so reliably. Method: We propose a novel uncertainty estimation and decomposition framework grounded in the signal-to-noise ratio (SNR) of class probability distributions. It introduces a variance-gating mechanism to dynamically model prediction reliability and an ensemble-derived confidence factor to adaptively scale probabilistic outputs. Contribution/Results: The approach reinterprets the "committee diversity collapse" phenomenon and addresses fundamental limitations of additive decomposition, enabling a finer-grained, well-calibrated separation of epistemic and aleatoric uncertainty. Evaluation across multiple benchmark datasets demonstrates improved discriminability and calibration of uncertainty estimates, supporting both the theoretical understanding and the practical deployment of trustworthy AI systems.

📝 Abstract
Evaluation of per-sample uncertainty quantification from neural networks is essential for decision-making involving high-risk applications. A common approach is to use the predictive distribution from Bayesian or approximation models and decompose the corresponding predictive uncertainty into epistemic (model-related) and aleatoric (data-related) components. However, additive decomposition has recently been questioned. In this work, we propose an intuitive framework for uncertainty estimation and decomposition based on the signal-to-noise ratio of class probability distributions across different model predictions. We introduce a variance-gated measure that scales predictions by a confidence factor derived from ensembles. We use this measure to discuss the existence of a collapse in the diversity of committee machines.
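The "additive decomposition" the abstract questions is the standard entropy-based split computed from an ensemble of class-probability predictions: total uncertainty is the entropy of the averaged prediction, aleatoric uncertainty is the average per-member entropy, and epistemic uncertainty is their difference (the mutual information). A minimal sketch, with illustrative shapes and names not taken from the paper:

```python
import numpy as np

def additive_decomposition(probs, eps=1e-12):
    """probs: (n_members, n_classes) class probabilities for one sample."""
    mean_p = probs.mean(axis=0)
    # Total uncertainty: entropy of the ensemble-averaged prediction.
    total = -np.sum(mean_p * np.log(mean_p + eps))
    # Aleatoric: average entropy of the individual member predictions.
    aleatoric = -np.mean(np.sum(probs * np.log(probs + eps), axis=1))
    # Epistemic: the remainder, i.e. the mutual information between
    # the prediction and the (implicit) model posterior.
    epistemic = total - aleatoric
    return total, aleatoric, epistemic

# Toy ensemble of three members over three classes.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.5, 0.3, 0.2],
                  [0.9, 0.05, 0.05]])
total, alea, epi = additive_decomposition(probs)
```

By construction the two components sum exactly to the total, which is precisely the additivity assumption the paper argues is unreliable for separating the two sources of uncertainty.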
Problem

Research questions and friction points this paper is trying to address.

Evaluating per-sample uncertainty quantification in neural networks
Decomposing predictive uncertainty into epistemic and aleatoric components
Addressing recent criticism of additive uncertainty decomposition
Innovation

Methods, ideas, or system contributions that make the work stand out.

Variance-gated confidence scaling method
Signal-to-noise ratio based decomposition framework
Ensemble-derived uncertainty estimation technique
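The innovations above can be illustrated with a small sketch. The paper's exact gating rule is not reproduced here; the code below only shows the general idea under stated assumptions: compute a per-class signal-to-noise ratio (mean over standard deviation of class probabilities across ensemble members), squash it into a confidence factor in (0, 1), and use that factor to down-weight classes on which the ensemble disagrees. The squashing function and renormalization are illustrative choices, not the authors' method.

```python
import numpy as np

def variance_gated_confidence(probs, eps=1e-12):
    """probs: (n_members, n_classes). Returns a gated class distribution."""
    mean_p = probs.mean(axis=0)
    std_p = probs.std(axis=0)
    snr = mean_p / (std_p + eps)      # per-class signal-to-noise ratio
    gate = snr / (1.0 + snr)          # illustrative squashing into (0, 1)
    gated = mean_p * gate             # variance-gated scaling of predictions
    return gated / gated.sum()        # renormalize to a valid distribution

# Toy ensemble: members agree on class 0 but with varying confidence.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.5, 0.3, 0.2],
                  [0.9, 0.05, 0.05]])
gated = variance_gated_confidence(probs)
```

High inter-member variance shrinks a class's gate toward zero, so disagreement among committee members directly reduces the reported confidence rather than being averaged away.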
H. Martin Gillis
Faculty of Computer Science, Dalhousie University, Halifax, NS Canada
Isaac Xu
Faculty of Computer Science, Dalhousie University, Halifax, NS Canada
Thomas Trappenberg
Professor of Computer Science, Dalhousie University
Computational Neuroscience, Machine Learning, Neurocognitive Robotics