SatDepth: A Novel Dataset for Satellite Image Matching

πŸ“… 2025-03-17
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This work addresses pixel-level correspondence estimation for satellite images under large viewpoint changes, multi-temporal acquisitions, and strong rotational differences. To this end, the authors introduce SatDepth, a novel dataset providing dense ground-truth correspondences designed specifically for satellite imagery, generated by registering images from multiple revisits over a region. Methodologically, they propose a dataset-balancing strategy built on a novel image rotation augmentation procedure, which allows corresponding pixels to be discovered even when the images differ by large rotations. They benchmark four existing deep image matching frameworks, including SuperPoint and LoFTR, and carry out an ablation study. The results show that models trained on SatDepth with rotation augmentation outperform models trained on ground-based image datasets by up to a 40% increase in precision, especially under large rotational differences. SatDepth thus establishes a new benchmark and a practical pathway for satellite image matching.
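The reported gain is measured as matching precision. A common way to define this for dense-matching benchmarks is the fraction of predicted correspondences that land within a pixel tolerance of the ground truth; the sketch below illustrates that idea (the function name and tolerance are illustrative, not taken from the paper's evaluation code):

```python
import numpy as np

def matching_precision(pred, gt, tol=3.0):
    """Fraction of predicted matches within `tol` pixels of ground truth.

    pred, gt: (N, 2) arrays giving, for the same N query pixels, the
    predicted and ground-truth (x, y) locations in the target image.
    Illustrative metric sketch, not the paper's exact evaluation code.
    """
    err = np.linalg.norm(np.asarray(pred, dtype=float)
                         - np.asarray(gt, dtype=float), axis=1)
    return float(np.mean(err <= tol))

# One match 1 px off (counts as correct), one 10 px off (counts as wrong).
p = matching_precision([[0, 0], [10, 10]], [[1, 0], [10, 20]])  # β†’ 0.5
```

Under this definition, "up to 40% increase in precision" means 40% more of the predicted matches fall within the tolerance.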

πŸ“ Abstract
Recent advances in deep-learning based methods for image matching have demonstrated their superiority over traditional algorithms, enabling correspondence estimation in challenging scenes with significant differences in viewing angles, illumination and weather conditions. However, the existing datasets, learning frameworks, and evaluation metrics for the deep-learning based methods are limited to ground-based images recorded with pinhole cameras and have not been explored for satellite images. In this paper, we present "SatDepth", a novel dataset that provides dense ground-truth correspondences for training image matching frameworks meant specifically for satellite images. Satellites capture images from various viewing angles and tracks through multiple revisits over a region. To manage this variability, we propose a dataset balancing strategy through a novel image rotation augmentation procedure. This procedure allows for the discovery of corresponding pixels even in the presence of large rotational differences between the images. We benchmark four existing image matching frameworks using our dataset and carry out an ablation study that confirms that the models trained with our dataset with rotation augmentation outperform (up to 40% increase in precision) the models trained with other datasets, especially when there exist large rotational differences between the images.
Problem

Research questions and friction points this paper is trying to address.

Lack of datasets for satellite image matching
Challenges in handling rotational differences in satellite images
Need for improved deep-learning frameworks for satellite imagery
Innovation

Methods, ideas, or system contributions that make the work stand out.

Introduces SatDepth dataset for satellite image matching
Proposes rotation augmentation for handling image variability
Benchmarks four matching frameworks; rotation-augmented training yields up to 40% higher precision
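The core of the rotation augmentation idea is that when a training image is rotated, its ground-truth correspondences must be rotated with it so that pixel-level supervision stays valid. A minimal sketch of that coordinate transform, assuming rotation about the image center (the helper below is hypothetical, not from the SatDepth codebase):

```python
import numpy as np

def rotate_points(points, angle_deg, image_shape):
    """Rotate (x, y) pixel coordinates about the image center.

    When an image of shape `image_shape` is rotated by `angle_deg`,
    ground-truth correspondence points must be mapped through the same
    rotation. Illustrative helper; the paper's actual augmentation
    pipeline may differ (e.g., in padding and resampling choices).
    """
    h, w = image_shape[:2]
    center = np.array([(w - 1) / 2.0, (h - 1) / 2.0])
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    pts = np.asarray(points, dtype=float) - center
    return pts @ rot.T + center

# The center pixel is a fixed point of any rotation about the center.
fixed = rotate_points([[49.5, 49.5]], 90, (100, 100))
```

Training on pairs augmented this way exposes the model to the large rotational differences that arise between satellite revisits on different tracks.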