๐ค AI Summary
Accurate spatiotemporal prediction of internal ice-layer thickness from radar imagery is critical for improving ice-sheet dynamic monitoring and climate modeling. To address this, we propose GRIT (Graph-based Radar Ice Transformer), the first model to tightly integrate geometric graph learning with the Transformer architecture: an inductive graph neural network explicitly encodes the spatial topological structure of ice layers, while self-attention mechanisms capture long-range spatiotemporal dependencies across depth levelsโthereby effectively modeling the coupling between near-surface snow accumulation and deep-ice flow. Evaluated on a real-world radar imaging dataset, GRIT achieves a 18.7% reduction in mean absolute thickness prediction error over baseline graph neural networks, demonstrating its superiority in modeling the complex, multiscale dynamics of cryospheric systems.
๐ Abstract
Gaining a deeper understanding of the thickness and variability of internal ice layers in Radar imagery is essential in monitoring the snow accumulation, better evaluating ice dynamics processes, and minimizing uncertainties in climate models. Radar sensors, capable of penetrating ice, capture detailed radargram images of internal ice layers. In this work, we introduce GRIT, graph transformer for ice layer thickness. GRIT integrates an inductive geometric graph learning framework with an attention mechanism, designed to map the relationships between shallow and deeper ice layers. Compared to baseline graph neural networks, GRIT demonstrates consistently lower prediction errors. These results highlight the attention mechanism's effectiveness in capturing temporal changes across ice layers, while the graph transformer combines the strengths of transformers for learning long-range dependencies with graph neural networks for capturing spatial patterns, enabling robust modeling of complex spatiotemporal dynamics.