Real-Time Wildfire Localization on the NASA Autonomous Modular Sensor Using Deep Learning

πŸ“… 2026-01-08
πŸ›οΈ AIAA SCITECH 2026 Forum
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This study addresses the scarcity and high cost of high-altitude multispectral aerial imagery, which have hindered the development of effective wildfire detection algorithms. To overcome this limitation, the authors construct a 12-channel multispectral wildfire dataset based on NASA’s AMS sensor and propose the first end-to-end real-time deep learning model tailored for mid- to high-altitude multispectral aerial imagery. The model integrates shortwave infrared, infrared, and thermal infrared bands to simultaneously perform image classification and pixel-level segmentation. It achieves robust wildfire boundary localization under challenging conditions such as nighttime and obscuration by smoke or clouds, while effectively suppressing false positives. Experimental results demonstrate a classification accuracy of 96%, an intersection-over-union (IoU) of 74%, and a recall of 84%, significantly outperforming existing approaches based on satellite data or conventional color-based rules.

πŸ“ Abstract
High-altitude, multi-spectral aerial imagery is scarce and expensive to acquire, yet it is necessary for algorithmic advances and the application of machine learning models to high-impact problems such as wildfire detection. We introduce a human-annotated dataset from the NASA Autonomous Modular Sensor (AMS) consisting of 12-channel, medium- to high-altitude (3–50 km) aerial wildfire images similar to those used in current US wildfire missions. Our dataset combines spectral data from 12 different channels, including infrared (IR), short-wave IR (SWIR), and thermal. We take imagery from 20 wildfire missions and randomly sample small patches to generate over 4000 images with high variability, including occlusions by smoke/clouds, easily confused false positives, and nighttime imagery. We demonstrate results from a deep-learning model to automate the human-intensive process of fire perimeter determination. We train two deep neural networks, one for image classification and the other for pixel-level segmentation. The networks are combined into a unique real-time segmentation model to efficiently localize active wildfire in an incoming image feed. Our model achieves 96% classification accuracy, 74% Intersection-over-Union (IoU), and 84% recall, surpassing past methods, including models trained on satellite data and classical color-rule algorithms. By leveraging a multi-spectral dataset, our model is able to detect active wildfire at nighttime and behind clouds, while distinguishing between false positives. We find that data from the SWIR, IR, and thermal bands are the most important for distinguishing fire perimeters. Our code and dataset can be found here: https://github.com/nasa/Autonomous-Modular-Sensor-Wildfire-Segmentation/tree/main and https://drive.google.com/drive/folders/1-u4vs9rqwkwgdeeeoUhftCxrfe_4QPTn?=usp=drive_link
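The abstract describes a two-stage cascade: a cheap whole-patch classifier gates a more expensive pixel-level segmentation network on the incoming image feed. The sketch below illustrates only that cascade structure, not the authors' networks; the band indices, thresholds, and threshold-based stand-ins for the two neural networks are all hypothetical placeholders.

```python
import numpy as np

# Hypothetical indices for the SWIR, IR, and thermal bands in a
# 12-channel AMS patch of shape (H, W, 12); the real band ordering
# is defined by the dataset, not by this sketch.
SWIR, IR, THERMAL = 9, 10, 11


def classify_patch(patch, threshold=0.6):
    """Stage 1: fast whole-patch fire/no-fire decision.

    Stands in for the paper's classification network: the patch is
    flagged if any pixel's mean over the hot bands exceeds a threshold.
    """
    hot = patch[..., [SWIR, IR, THERMAL]].mean(axis=-1)
    return hot.max() > threshold


def segment_patch(patch, threshold=0.6):
    """Stage 2: pixel-level fire mask, run only on positive patches.

    Stands in for the paper's segmentation network: per-pixel
    thresholding of the same hot-band average.
    """
    hot = patch[..., [SWIR, IR, THERMAL]].mean(axis=-1)
    return hot > threshold


def localize(stream):
    """Cascade over an image feed: classify every patch, segment
    only the positives. Returns (patch_index, mask) pairs."""
    results = []
    for i, patch in enumerate(stream):
        if classify_patch(patch):  # cheap gate skips fire-free patches
            results.append((i, segment_patch(patch)))
    return results


# Toy feed: one cold patch and one patch with a small hot region.
cold = np.zeros((8, 8, 12))
hot = np.zeros((8, 8, 12))
hot[2:4, 2:4, [SWIR, IR, THERMAL]] = 1.0  # synthetic "fire" pixels
detections = localize([cold, hot])        # only the hot patch survives
```

The gating step is what makes the pipeline real-time friendly: segmentation cost is paid only for patches the classifier already believes contain fire.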
Problem

Research questions and friction points this paper is trying to address.

wildfire detection
real-time localization
multi-spectral imagery
fire perimeter determination
aerial remote sensing
Innovation

Methods, ideas, or system contributions that make the work stand out.

multi-spectral imagery
real-time wildfire segmentation
deep learning
NASA Autonomous Modular Sensor
fire perimeter detection
Yajvan Ravan
OSTEM Internship Program, Langley Research Center
Aref Malek
OSTEM Internship Program, Langley Research Center
Chester V. Dolph
Aerospace Engineer, Aeronautics Systems Engineering Branch, AIAA Member.
Nikhil Behari
Massachusetts Institute of Technology
computer vision
machine learning
artificial intelligence