Generate the Forest before the Trees -- A Hierarchical Diffusion model for Climate Downscaling

📅 2025-06-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional climate downscaling methods incur high computational costs; while recent AI models—such as diffusion models—enable ensemble generation and mitigate over-smoothing, they remain computationally intensive. This paper proposes the Hierarchical Diffusion Downscaling (HDD) model, which introduces a coarse-to-fine hierarchical sampling mechanism and a multi-scale downsampling architecture to substantially reduce per-pixel computation. HDD employs hierarchical denoising training and multi-resolution inputs, preserving 0.25° spatial fidelity while reducing pixel-level computational load by over 50%. Crucially, HDD achieves zero-shot transfer across CMIP6 models—the first such demonstration—enabling seamless multi-model adaptation and scalable ensemble forecasting. Evaluated on ERA5 and CMIP6 data, HDD delivers high fidelity, computational efficiency, and strong generalization, offering a scalable generative solution for local climate planning and impact assessment.

📝 Abstract
Downscaling is essential for generating the high-resolution climate data needed for local planning, but traditional methods remain computationally demanding. Recent years have seen impressive results from AI downscaling models, particularly diffusion models, which have attracted attention due to their ability to generate ensembles and overcome the smoothing problem common in other AI methods. However, these models typically remain computationally intensive. We introduce a Hierarchical Diffusion Downscaling (HDD) model, which introduces an easily-extensible hierarchical sampling process to the diffusion framework. A coarse-to-fine hierarchy is imposed via a simple downsampling scheme. HDD achieves competitive accuracy on ERA5 reanalysis datasets and CMIP6 models, significantly reducing computational load by running on up to half as many pixels with competitive results. Additionally, a single model trained at 0.25° resolution transfers seamlessly across multiple CMIP6 models with much coarser resolution. HDD thus offers a lightweight alternative for probabilistic climate downscaling, facilitating affordable large-ensemble high-resolution climate projections. See a full code implementation at: https://github.com/HDD-Hierarchical-Diffusion-Downscaling/HDD-Hierarchical-Diffusion-Downscaling.
Problem

Research questions and friction points this paper is trying to address.

Reducing computational load in climate downscaling models
Overcoming smoothing issues in AI-based downscaling methods
Enabling high-resolution climate data for local planning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hierarchical Diffusion Downscaling (HDD) model
Coarse-to-fine hierarchical sampling process
Lightweight probabilistic climate downscaling
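The computational saving described above comes from running the early, noisy denoising steps on a coarsened grid and only the final steps at full resolution. The sketch below illustrates that idea in minimal form; it is not the authors' implementation (see their GitHub repository for that). The two-stage loop, the placeholder denoising update, and all function names here are illustrative assumptions — a real HDD model would replace the placeholder with a trained score network.

```python
import numpy as np

def downsample(x, factor):
    """Average-pool a 2D field by an integer factor (simple coarsening)."""
    h, w = x.shape
    return x.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def upsample(x, factor):
    """Nearest-neighbour upsampling back to the finer grid."""
    return np.kron(x, np.ones((factor, factor)))

def hierarchical_sample(shape=(64, 64), steps=50, split=0.5, factor=2, seed=0):
    """Run the first `split` fraction of denoising steps on a coarsened grid,
    then upsample and finish the remaining steps at full resolution.
    The 'denoiser' is a placeholder (mild shrinkage toward zero); it stands in
    for a trained diffusion model. Returns the sample plus pixel-op counts."""
    rng = np.random.default_rng(seed)
    coarse_steps = int(steps * split)
    fine_steps = steps - coarse_steps
    pixel_ops = 0

    # Coarse stage: start from noise on the downsampled grid.
    x = rng.standard_normal((shape[0] // factor, shape[1] // factor))
    for _ in range(coarse_steps):
        x = 0.9 * x                      # placeholder denoising update
        pixel_ops += x.size

    # Fine stage: lift to the full grid, re-noise lightly, keep denoising.
    x = upsample(x, factor) + 0.1 * rng.standard_normal(shape)
    for _ in range(fine_steps):
        x = 0.9 * x
        pixel_ops += x.size

    baseline_ops = steps * shape[0] * shape[1]   # every step at full resolution
    return x, pixel_ops, baseline_ops
```

With half the steps at a 2x-coarsened grid, pixel operations drop to 0.5 + 0.5/4 = 62.5% of the full-resolution baseline; pushing more steps into the coarse stage (e.g. `split=0.7`) brings the count below half, consistent with the paper's "up to half as many pixels" claim.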
Declan J. Curran
School of Computer Science and Engineering, University of New South Wales, Sydney, New South Wales, Australia
Sanaa Hobeichi
The University of New South Wales, Sydney Australia
Applied Machine Learning · Climate Change · Climate Extremes · Water Cycle
Hira Saleem
The University of New South Wales
Weather-Climate Modelling · Physics Informed Learning · Explainable AI · Spatio-Temporal Forecasting
Hao Xue
University of New South Wales
Human Mobility · Spatio-Temporal Data Mining
Flora D. Salim
School of Computer Science and Engineering, University of New South Wales, Sydney, New South Wales, Australia