🤖 AI Summary
To address the low efficiency and poor interpretability of uncertainty modeling in large-scale forecasting tasks, this paper proposes the Generative Quantile Bayesian Prediction (GQBP) framework. GQBP departs from conventional density estimation paradigms by directly learning a generative mapping from inputs to quantiles, marking the first integration of generative modeling into quantile forecasting and combining Bayesian inference with quantile regression. Unlike conformal prediction and likelihood-based approaches, GQBP imposes no parametric distributional assumptions, flexibly supports arbitrary quantile levels, and scales well. Empirical evaluation on normal-normal learning and causal inference tasks demonstrates that GQBP achieves superior uncertainty calibration and higher computational efficiency, thereby enhancing predictive reliability and practical utility.
📝 Abstract
Prediction is a central task of machine learning. Our goal is to solve large-scale prediction problems using Generative Quantile Bayesian Prediction (GQBP). By directly learning predictive quantiles rather than densities, we achieve a number of theoretical and practical advantages. We contrast our approach with state-of-the-art methods, including conformal prediction, fiducial prediction, and marginal likelihood. The distinguishing feature of our method is the use of generative methods for predictive quantile maps. We illustrate our methodology on normal-normal learning and causal inference. Finally, we conclude with directions for future research.
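As a rough illustration of the quantile-learning idea in the normal-normal setting mentioned above, the sketch below estimates a single predictive quantile by stochastic subgradient descent on the pinball (check) loss, the standard objective for quantile regression. The model, sample size, and step-size schedule are our own illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Normal-normal model: theta ~ N(0, 1), y | theta ~ N(theta, 1),
# so marginally y ~ N(0, 2) and its tau-quantile is sqrt(2) * Phi^{-1}(tau).
theta = rng.normal(size=20000)
y = theta + rng.normal(size=20000)

tau = 0.9   # target quantile level
q, c = 0.0, 0.1
for t, yi in enumerate(y, start=1):
    # Subgradient of the pinball loss rho_tau(yi - q) with respect to q:
    # -tau if yi >= q, else (1 - tau). One SGD step with decaying rate.
    grad = -tau if yi >= q else (1.0 - tau)
    q -= (c / np.sqrt(t)) * grad

print(q)                     # SGD estimate of the 0.9 predictive quantile
print(np.quantile(y, tau))   # empirical quantile for comparison
```

A full quantile map would replace the scalar `q` with a learned function of the input and the quantile level, trained under the same loss; this one-dimensional version only shows that minimizing the pinball loss recovers the target quantile without any density estimation.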