🤖 AI Summary
This paper addresses the degradation of probabilistic forecast calibration in dynamic data streams caused by distribution shift, feedback loops, and adversarial perturbations. The authors propose the first general online calibration framework grounded in Blackwell approachability, a theoretically rigorous foundation for sequential decision-making under uncertainty. The method provides strong calibration guarantees over compact output spaces (e.g., classification and bounded regression) and enables lossless post-hoc recalibration of arbitrary pre-trained predictors. Technically, it unifies insights from Blackwell approachability theory, online optimization, and gradient-based updates, and introduces efficient task-specific algorithms for both classification and regression. Empirical evaluation demonstrates substantial improvements in calibration quality for energy-system forecasting, with marked gains in robustness and practical utility for downstream decision-making tasks.
📄 Abstract
Real-world data streams can change unpredictably due to distribution shifts, feedback loops, and adversarial actors, which challenges the validity of forecasts. We present a forecasting framework that ensures valid uncertainty estimates regardless of how the data evolves. Leveraging the concept of Blackwell approachability from game theory, the framework guarantees calibrated uncertainties for outcomes in any compact space (e.g., classification or bounded regression). We extend it to recalibrate existing forecasters, guaranteeing calibration without sacrificing predictive performance. We implement both general-purpose gradient-based algorithms and algorithms optimized for popular special cases of our framework. Empirically, our algorithms improve calibration and downstream decision-making for energy systems.
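The abstract does not spell out the paper's algorithms, but the quantity such a framework aims to control can be illustrated concretely. The sketch below (illustrative only, not the paper's method) computes a standard binned expected calibration error (ECE) for binary probabilistic forecasts: a forecaster is well calibrated when, among rounds where it predicts probability near p, the event actually occurs about a p fraction of the time. The function name and binning scheme are assumptions for illustration.

```python
import numpy as np

def expected_calibration_error(probs, outcomes, n_bins=10):
    """Binned calibration error for binary probabilistic forecasts.

    probs: predicted probabilities of the positive outcome.
    outcomes: observed binary outcomes (0/1).
    Illustrative metric only; not the algorithm from the paper.
    """
    probs = np.asarray(probs, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        # include the right endpoint only in the last bin
        mask = (probs >= lo) & ((probs < hi) if hi < 1.0 else (probs <= hi))
        if mask.any():
            # |mean forecast - empirical frequency|, weighted by bin mass
            ece += mask.mean() * abs(probs[mask].mean() - outcomes[mask].mean())
    return ece

# Perfectly calibrated forecasts: predicting 0.25 when 1 of 4 outcomes is positive.
print(expected_calibration_error([0.25] * 4, [1, 0, 0, 0]))  # → 0.0
# Badly miscalibrated: confident predictions of 0.9 when the event never occurs.
print(expected_calibration_error([0.9] * 3, [0, 0, 0]))      # → 0.9
```

An online calibration guarantee of the kind described above would drive such an error toward zero as the number of rounds grows, even when the outcome sequence is chosen adversarially.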