Long-Range Distillation: Distilling 10,000 Years of Simulated Climate into Long Timestep AI Weather Models

📅 2025-12-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
Long-range weather forecasting faces two challenges: error accumulation in autoregressive prediction and the scarcity of slowly varying climate signals in reanalysis data. To address these, we propose a long-horizon knowledge distillation framework: a short-timestep autoregressive deep learning Earth system model (DLESyM) generates high-fidelity, millennial-scale synthetic climate data, which is then used to distill a single-step, probabilistic, long-timestep generative model that produces subseasonal-to-seasonal (S2S) forecasts directly. We demonstrate for the first time that AI-synthesized data at this scale significantly enhances long-range forecast skill. Our method compresses hundreds of autoregressive steps into a single inference step, with performance improving monotonically as the volume of synthetic training data increases. In perfect-model experiments, the distilled model surpasses climatological baselines and approaches the accuracy of its teacher. After fine-tuning on ERA5, it achieves S2S forecast skill competitive with the ECMWF ensemble prediction system in realistic settings.

📝 Abstract
Accurate long-range weather forecasting remains a major challenge for AI models, both because errors accumulate over autoregressive rollouts and because reanalysis datasets used for training offer a limited sample of the slow modes of climate variability underpinning predictability. Most AI weather models are autoregressive, producing short lead forecasts that must be repeatedly applied to reach subseasonal-to-seasonal (S2S) or seasonal lead times, often resulting in instability and calibration issues. Long-timestep probabilistic models that generate long-range forecasts in a single step offer an attractive alternative, but training on the 40-year reanalysis record leads to overfitting, suggesting orders of magnitude more training data are required. We introduce long-range distillation, a method that trains a long-timestep probabilistic "student" model to forecast directly at long-range using a huge synthetic training dataset generated by a short-timestep autoregressive "teacher" model. Using the Deep Learning Earth System Model (DLESyM) as the teacher, we generate over 10,000 years of simulated climate to train distilled student models for forecasting across a range of timescales. In perfect-model experiments, the distilled models outperform climatology and approach the skill of their autoregressive teacher while replacing hundreds of autoregressive steps with a single timestep. In the real world, they achieve S2S forecast skill comparable to the ECMWF ensemble forecast after ERA5 fine-tuning. The skill of our distilled models scales with increasing synthetic training data, even when that data is orders of magnitude larger than ERA5. This represents the first demonstration that AI-generated synthetic training data can be used to scale long-range forecast skill.
Problem

Research questions and friction points this paper is trying to address.

Develops long-timestep AI models for stable long-range weather forecasting
Addresses overfitting in training due to limited real climate data
Uses synthetic climate data to scale forecast skill beyond reanalysis limits
Innovation

Methods, ideas, or system contributions that make the work stand out.

Using teacher-student distillation for long-range forecasting
Generating 10,000 years of synthetic climate data for training
Replacing hundreds of autoregressive steps with a single timestep
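The core distillation recipe above can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: the "teacher" here is a hypothetical damped linear map standing in for DLESyM's short-timestep dynamics, and the "student" is an ordinary least-squares fit standing in for the probabilistic deep generative model. All names and dynamics are invented for illustration; only the structure (roll the teacher out many short steps to synthesize long-lead training pairs, then fit a single-step student on those pairs) reflects the method described.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the short-timestep autoregressive teacher:
# a damped linear map plus small stochastic forcing (placeholder dynamics).
A = 0.99 * np.eye(4)

def teacher_step(x):
    return A @ x + 0.01 * rng.standard_normal(x.shape)

def rollout(x0, n_steps):
    """Roll the teacher forward n_steps short timesteps to synthesize data."""
    x = x0
    for _ in range(n_steps):
        x = teacher_step(x)
    return x

# Synthesize a (initial state, long-lead state) training set by rolling the
# teacher out many short steps per sample -- the analogue of generating
# thousands of years of simulated climate.
n_samples, lead_steps = 1000, 100
X0 = rng.standard_normal((n_samples, 4))
Y = np.stack([rollout(x, lead_steps) for x in X0])

# "Student": a single linear map, fit by least squares, that jumps the full
# lead time in one step (the real student is a probabilistic deep model).
W, *_ = np.linalg.lstsq(X0, Y, rcond=None)

# One student inference now replaces lead_steps teacher steps.
pred = X0 @ W
mse = float(np.mean((pred - Y) ** 2))
print(mse)  # small residual: student matches the teacher's long-lead output
```

The point of the sketch is the data flow, not the models: the student never sees real observations during distillation, only teacher-generated pairs, which is why the synthetic dataset size (here `n_samples`) can be scaled far beyond the reanalysis record.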
Scott A. Martin
NVIDIA Research, Santa Clara, CA, USA; School of Oceanography, University of Washington, Seattle, WA, USA
Noah Brenowitz
NVIDIA
Climate Science · Applied Mathematics · Machine Learning
Dale Durran
NVIDIA Research, Santa Clara, CA, USA; Department of Atmospheric & Climate Science, University of Washington, Seattle, WA, USA
Mike Pritchard
NVIDIA Research, Santa Clara, CA, USA