Extending confidence calibration to generalised measures of variation

📅 2026-02-13
📈 Citations: 0
Influential: 0
📝 Abstract
We propose the Variation Calibration Error (VCE), a metric for assessing the calibration of machine learning classifiers. The metric can be viewed as an extension of the well-known Expected Calibration Error (ECE), which assesses the calibration of the maximum probability, or confidence. Other measures of the variation of a probability distribution exist which have the advantage of taking the full probability distribution into account, for example the Shannon entropy. We show how the ECE approach can be extended from assessing confidence calibration to assessing the calibration of any measure of variation. We present numerical examples on synthetic predictions which are perfectly calibrated by design, demonstrating that, in this scenario, the VCE has the desired property of approaching zero as the number of data samples increases, in contrast to another entropy-based calibration metric (the UCE) which has been proposed in the literature.
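The ECE baseline that the abstract extends can be sketched as follows. This is an illustrative implementation, not the paper's exact procedure: the equal-width binning scheme, the number of bins, and the synthetic perfectly-calibrated setup (labels sampled from the predicted distribution itself) are assumptions made for the example.

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """Standard ECE sketch: bin predictions by their maximum probability
    (confidence), then average |accuracy - mean confidence| over the bins,
    weighted by the fraction of samples falling in each bin."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            gap = abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
            ece += in_bin.mean() * gap
    return ece

# Perfectly calibrated synthetic predictions: each label is drawn from the
# predicted distribution itself, so confidence matches accuracy on average
# and the ECE should shrink towards zero as the sample size grows.
rng = np.random.default_rng(0)
n, k = 50_000, 3
logits = rng.normal(size=(n, k))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
labels = np.array([rng.choice(k, p=p) for p in probs])
ece = expected_calibration_error(probs, labels)
print(ece)  # small for calibrated predictions
```

The VCE proposed in the paper replaces the per-sample confidence with a more general measure of variation of the full predicted distribution, such as the Shannon entropy, while keeping this bin-and-compare structure.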
Problem

Research questions and friction points this paper is trying to address.

confidence calibration, Expected Calibration Error, Variation Calibration Error, probability distribution, entropy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Variation Calibration Error, Expected Calibration Error, confidence calibration, probability distribution, Shannon entropy
Andrew Thompson
National Physical Laboratory
sparse estimation, data science, signal processing

Vivek Desai
National Physical Laboratory, Hampton Road, Teddington, TW11 0LW, United Kingdom