🤖 AI Summary
Long-range weather forecasting faces two challenges: error accumulation in autoregressive prediction and the scarcity of slowly varying climate signals in reanalysis data. To address these, we propose a long-horizon knowledge distillation framework: a short-timestep autoregressive deep learning Earth system model (DLESyM) generates high-fidelity, millennial-scale synthetic climate data, which is then used to distill a single-step probabilistic generative model that produces subseasonal-to-seasonal (S2S) forecasts directly. We demonstrate for the first time that AI-synthesized data at this unprecedented scale significantly enhances long-range forecast skill. Our method compresses hundreds of autoregressive steps into a single inference step, with performance improving monotonically as the volume of synthetic data increases. In perfect-model experiments, the distilled model surpasses climatological baselines and approaches the accuracy of its teacher. After fine-tuning on ERA5, it achieves S2S forecast skill competitive with the ECMWF ensemble prediction system in realistic settings.
📝 Abstract
Accurate long-range weather forecasting remains a major challenge for AI models, both because errors accumulate over autoregressive rollouts and because the reanalysis datasets used for training offer a limited sample of the slow modes of climate variability that underpin predictability. Most AI weather models are autoregressive, producing short-lead forecasts that must be applied repeatedly to reach subseasonal-to-seasonal (S2S) or seasonal lead times, often resulting in instability and calibration issues. Long-timestep probabilistic models that generate long-range forecasts in a single step offer an attractive alternative, but training them on the 40-year reanalysis record leads to overfitting, suggesting that orders of magnitude more training data are required. We introduce long-range distillation, a method that trains a long-timestep probabilistic "student" model to forecast directly at long range using a huge synthetic training dataset generated by a short-timestep autoregressive "teacher" model. Using the Deep Learning Earth System Model (DLESyM) as the teacher, we generate over 10,000 years of simulated climate to train distilled student models for forecasting across a range of timescales. In perfect-model experiments, the distilled models outperform climatology and approach the skill of their autoregressive teacher while replacing hundreds of autoregressive steps with a single timestep. In the real world, after fine-tuning on ERA5, they achieve S2S forecast skill comparable to that of the ECMWF ensemble forecast. The skill of our distilled models scales with increasing synthetic training data, even when that data is orders of magnitude larger than ERA5. This represents the first demonstration that AI-generated synthetic training data can be used to scale long-range forecast skill.
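To make the training recipe concrete, here is a minimal, self-contained PyTorch sketch of the distillation loop. Everything below is an illustrative assumption rather than the paper's implementation: `TeacherAR` stands in for the frozen DLESyM teacher, `StudentOneStep` for the distilled probabilistic student, the state is a flat vector rather than gridded atmospheric fields, and a simple MSE loss replaces whatever probabilistic training objective the actual model uses.

```python
# Hypothetical sketch of long-range distillation: a short-timestep
# autoregressive teacher generates synthetic long-lead targets, and a
# single-step student learns to map x(t) directly to x(t + lead).
import torch
import torch.nn as nn


class TeacherAR(nn.Module):
    """Stand-in for a short-timestep autoregressive teacher (e.g. DLESyM)."""

    def __init__(self, dim=64):
        super().__init__()
        self.step = nn.Linear(dim, dim)  # one short-timestep update

    @torch.no_grad()
    def rollout(self, x0, n_steps):
        """Advance the state autoregressively over n_steps short timesteps."""
        x = x0
        for _ in range(n_steps):
            x = x + 0.01 * torch.tanh(self.step(x))  # toy stand-in dynamics
        return x


class StudentOneStep(nn.Module):
    """Single-step student: maps x(t) plus noise directly to x(t + lead)."""

    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, 256), nn.SiLU(), nn.Linear(256, dim)
        )

    def forward(self, x0, noise):
        # Conditioning on noise lets the student produce ensemble members;
        # the paper's actual generative formulation is not reproduced here.
        return self.net(torch.cat([x0, noise], dim=-1))


dim, lead_steps = 64, 120  # e.g. 120 short steps ~ one S2S lead time
teacher, student = TeacherAR(dim), StudentOneStep(dim)
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

for it in range(100):  # the paper uses >10,000 years of synthetic data
    x0 = torch.randn(32, dim)                 # synthetic initial conditions
    target = teacher.rollout(x0, lead_steps)  # long teacher rollout
    pred = student(x0, torch.randn_like(x0))  # single-step long-range forecast
    loss = ((pred - target) ** 2).mean()      # placeholder objective
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In practice, the teacher rollouts would presumably be generated once, offline, into the multi-millennial synthetic archive, with the student trained on that archive and only afterward fine-tuned on ERA5; the loop above folds these stages together purely for brevity.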