BRIGHT: A globally distributed multimodal building damage assessment dataset with very-high-resolution for all-weather disaster response

📅 2025-01-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing disaster assessment methods fail under adverse conditions such as nighttime and cloud/fog cover, severely hindering post-disaster response efficiency. To address this, we introduce the first open-source, globally distributed, multimodal (optical + SAR), very-high-resolution (0.3–1 m) benchmark dataset for building damage assessment, covering five natural and two anthropogenic disaster types, with a particular focus on developing countries, to fill the critical gap in all-weather, AI-driven disaster evaluation data. Leveraging this dataset, we systematically evaluate seven state-of-the-art models, rigorously assessing their cross-event generalization and robustness. Experimental results demonstrate that multimodal fusion substantially improves damage detection accuracy under cloud/fog and nighttime conditions. The dataset, along with associated code, is publicly released and serves as the official benchmark for the 2025 IEEE GRSS Data Fusion Contest.

📝 Abstract
Disaster events occur around the world and cause significant damage to human life and property. Earth observation (EO) data enables rapid and comprehensive building damage assessment (BDA), an essential capability in the aftermath of a disaster to reduce human casualties and to inform disaster relief efforts. Recent research focuses on the development of AI models to achieve accurate mapping of unseen disaster events, mostly using optical EO data. However, solutions based on optical data are limited to clear skies and daylight hours, preventing a prompt response to disasters. Integrating multimodal (MM) EO data, particularly the combination of optical and SAR imagery, makes it possible to provide all-weather, day-and-night disaster responses. Despite this potential, the development of robust multimodal AI models has been constrained by the lack of suitable benchmark datasets. In this paper, we present a BDA dataset using veRy-hIGH-resoluTion optical and SAR imagery (BRIGHT) to support AI-based all-weather disaster response. To the best of our knowledge, BRIGHT is the first open-access, globally distributed, event-diverse MM dataset specifically curated to support AI-based disaster response. It covers five types of natural disasters and two types of man-made disasters across 12 regions worldwide, with a particular focus on developing countries where external assistance is most needed. The optical and SAR imagery in BRIGHT, with a spatial resolution between 0.3 and 1 meters, provides detailed representations of individual buildings, making it ideal for precise BDA. In our experiments, we tested seven advanced AI models trained on BRIGHT to validate their transferability and robustness. The dataset and code are available at https://github.com/ChenHongruixuan/BRIGHT. BRIGHT also serves as the official dataset for the 2025 IEEE GRSS Data Fusion Contest.
Problem

Research questions and friction points this paper is trying to address.

Disaster Assessment
All-Weather
Building Damage Evaluation
Innovation

Methods, ideas, or system contributions that make the work stand out.

BRIGHT Dataset
Multi-source Earth Observation
Disaster Damage Assessment