Not every day is a sunny day: Synthetic cloud injection for deep land cover segmentation robustness evaluation across data sources

📅 2025-10-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Frequent cloud cover in tropical regions severely limits the availability of optical remote sensing imagery, while deep learning models often lose critical spatial and spectral details during downsampling. Method: This paper proposes (1) a lightweight Normalized Difference Index (NDI) injection technique integrated at the decoder’s end to preserve key spatial features; (2) a physically constrained, realistic cloud synthesis and injection framework to systematically evaluate model robustness under cloud occlusion; and (3) a multimodal fusion strategy combining Sentinel-1 SAR and Sentinel-2 optical data. Results: On the DFC2020 dataset, NDI injection improves mIoU by 1.99% and 2.78% for U-Net and DeepLabV3, respectively, under cloud-free conditions. Under cloudy conditions, radar–optical fusion significantly outperforms optical-only input, demonstrating the effectiveness and generalizability of the proposed approach under complex meteorological conditions.

📝 Abstract
Supervised deep learning for land cover semantic segmentation (LCS) relies on labeled satellite data. However, most existing Sentinel-2 datasets are cloud-free, which limits their usefulness in tropical regions where clouds are common. To properly evaluate the extent of this problem, we developed a cloud injection algorithm that simulates realistic cloud cover, allowing us to test how Sentinel-1 radar data can fill in the gaps caused by cloud-obstructed optical imagery. We also tackle the issue of losing spatial and/or spectral details during encoder downsampling in deep networks. To mitigate this loss, we propose a lightweight method that injects Normalized Difference Indices (NDIs) into the final decoding layers, enabling the model to retain key spatial features with minimal additional computation. Injecting NDIs enhanced land cover segmentation performance on the DFC2020 dataset, yielding improvements of 1.99% for U-Net and 2.78% for DeepLabV3 on cloud-free imagery. Under cloud-covered conditions, incorporating Sentinel-1 data led to significant performance gains across all models compared to using optical data alone, highlighting the effectiveness of radar-optical fusion in challenging atmospheric scenarios.
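The NDI injection described above can be illustrated with a minimal sketch: an NDI is a simple band ratio of the form (a − b) / (a + b), and "injection" amounts to appending it as an extra channel to the decoder's final feature map. This is an illustrative toy example, not the authors' implementation; the band choices, array shapes, and concatenation point are assumptions.

```python
import numpy as np

def ndi(band_a: np.ndarray, band_b: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Normalized Difference Index: (a - b) / (a + b), values in (-1, 1)."""
    return (band_a - band_b) / (band_a + band_b + eps)

# Toy 4x4 reflectance patch: NIR (Sentinel-2 B8) and Red (B4).
rng = np.random.default_rng(0)
nir = rng.uniform(0.2, 0.6, (4, 4))
red = rng.uniform(0.05, 0.2, (4, 4))

ndvi = ndi(nir, red)  # NDVI is the NDI of (NIR, Red)

# "Injection": concatenate the index as an extra channel onto the
# decoder's final feature map (here a dummy 8-channel map).
decoder_features = rng.normal(size=(8, 4, 4))
augmented = np.concatenate([decoder_features, ndvi[None]], axis=0)
print(augmented.shape)
```

Because the index is computed directly from the input bands, this adds almost no parameters or compute while reintroducing spectral detail that downsampling may have discarded.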
Problem

Research questions and friction points this paper is trying to address.

Evaluating land cover segmentation robustness under cloud-covered conditions
Addressing information loss during encoder downsampling in deep networks
Developing a cloud injection algorithm to simulate realistic atmospheric conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Simulates realistic cloud cover for robustness testing
Injects NDIs into decoding layers to retain features
Fuses Sentinel-1 radar data with optical imagery
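The cloud synthesis idea can be sketched as: generate a smooth, low-frequency opacity mask and alpha-blend a bright cloud layer over the optical image. This is a simplified stand-in (thresholded smoothed noise) for the paper's physically constrained synthesis; all function names, parameters, and the blending model are illustrative assumptions.

```python
import numpy as np

def synth_cloud_mask(h: int, w: int, seed: int = 0, threshold: float = 0.55) -> np.ndarray:
    """Low-frequency noise, box-blurred and thresholded into a soft opacity mask in [0, 1]."""
    rng = np.random.default_rng(seed)
    coarse = rng.random((h // 8 + 1, w // 8 + 1))
    mask = np.kron(coarse, np.ones((8, 8)))[:h, :w]  # upsample by repetition
    pad = np.pad(mask, 2, mode="edge")
    smooth = np.zeros_like(mask)
    for i in range(h):                               # 5x5 box blur for soft edges
        for j in range(w):
            smooth[i, j] = pad[i:i + 5, j:j + 5].mean()
    return np.clip((smooth - threshold) / (1 - threshold), 0.0, 1.0)

def inject_clouds(image: np.ndarray, alpha: np.ndarray, cloud_reflectance: float = 0.9) -> np.ndarray:
    """Alpha-blend a bright synthetic cloud layer over an optical image of shape (C, H, W)."""
    return image * (1 - alpha) + cloud_reflectance * alpha

img = np.full((3, 32, 32), 0.2)        # flat toy "optical" patch
alpha = synth_cloud_mask(32, 32, seed=1)
cloudy = inject_clouds(img, alpha)
print(cloudy.shape)
```

Applying such masks to cloud-free test imagery lets one measure, on the same labels, how segmentation accuracy degrades with occlusion and how much SAR input recovers.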
Sara Mobsite
ESPACE-DEV, French National Research Institute for Sustainable Development (IRD)
Renaud Hostache
Institut de Recherche pour le Développement, UMR Espace-Dev
Floods, Droughts, Remote Sensing, Hydrodynamic Modelling, Hydrological Modelling, Data Assimilation
Laure Berti-Équille
ESPACE-DEV, French National Research Institute for Sustainable Development (IRD)
Emmanuel Roux
ESPACE-DEV, French National Research Institute for Sustainable Development (IRD)
Joris Guérin
ESPACE-DEV, French National Research Institute for Sustainable Development (IRD)