🤖 AI Summary
Conformal prediction (CP) builds its validity guarantees on conformal p-values, which limits how predictors can be combined and adapted, especially in small-sample or non-stationary settings. Method: This paper introduces conformal e-prediction (CEP), a framework that replaces p-values with e-values within the CP paradigm, yielding model-free calibration at arbitrary confidence levels under the sole assumption of exchangeability. Because e-values compose naturally (e.g., by averaging), CEP supports composable, cumulative online inference without parametric assumptions. Contribution/Results: The paper formalizes conditional and cross-conformal e-predictors and establishes their statistical validity with rigorous theoretical guarantees. Empirically, CEP demonstrates greater robustness and adaptability under limited data and distribution shift. By decoupling uncertainty quantification from p-value-based hypothesis testing, CEP provides a more flexible, scalable, and assumption-light foundation for trustworthy AI systems.
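As a rough illustration of the p-value-to-e-value swap (a minimal sketch, not the paper's exact construction; function names and the nonconformity-score interface are assumptions), a standard conformal e-value normalizes a candidate's nonconformity score by the total score mass, and Markov's inequality then gives the coverage guarantee:

```python
def conformal_e_value(cal_scores, test_score):
    # Conformal e-value: (n+1) * test score / total score mass.
    # Under exchangeability, each of the n+1 scores contributes on
    # average 1/(n+1) of the total, so the e-value has expectation <= 1.
    scores = list(cal_scores) + [test_score]
    total = sum(scores)
    if total == 0.0:
        return 1.0  # degenerate case: all scores zero, uninformative
    return len(scores) * test_score / total

def e_prediction_set(cal_scores, candidate_scores, alpha=0.1):
    # Markov's inequality gives P(e-value >= 1/alpha) <= alpha, so
    # keeping labels whose e-value stays below 1/alpha yields a
    # prediction set with coverage at least 1 - alpha.
    return [label for label, s in candidate_scores.items()
            if conformal_e_value(cal_scores, s) < 1.0 / alpha]
```

Unlike conformal p-values, e-values of this form remain valid under averaging across folds, which is what makes the cross-conformal e-predictors mentioned above straightforward to define.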