Title: Wildland Fire Detection and Monitoring Using a Drone-Collected RGB/IR Image Dataset
Current forest monitoring technologies, including satellite remote sensing, manned/piloted aircraft, and observation towers, leave uncertainties about a wildfire's extent, behavior, and the conditions in its near environment, particularly during its early growth. Rapid mapping and real-time fire monitoring can inform timely intervention and management solutions that maximize beneficial fire outcomes. Drone systems' unique features of 3D mobility, low flight altitude, and fast, easy deployment make them a valuable tool for early detection and assessment of wildland fires, especially in remote forests that are not easily accessible by ground vehicles. At the same time, the lack of abundant, well-annotated aerial datasets (due in part to unmanned aerial vehicles' (UAVs') flight restrictions during prescribed burns and wildfires) has limited research advances in reliable data-driven fire detection and modeling techniques. While existing wildland fire datasets often include either color or thermal fire images, here we present (1) a multi-modal UAV-collected dataset of dual-feed, side-by-side videos including both RGB and thermal images of a prescribed fire in an open-canopy pine forest in Northern Arizona and (2) a deep learning-based methodology for detecting fire and smoke pixels with accuracy much higher than is achievable from the usual single-channel video feeds. The collected images are labeled as "fire" or "no-fire" frames by two human experts, who use the side-by-side RGB and thermal images to determine each label. To provide context for the main dataset's aerial imagery, an included supplementary dataset provides a georeferenced pre-burn point cloud, an RGB orthomosaic, weather information, a burn plan, and other burn information. By using and expanding on this guide dataset, researchers can develop new data-driven fire detection, fire segmentation, and fire modeling techniques.
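As an illustration of how the paired RGB/thermal feeds described above might be consumed downstream, the sketch below shows a minimal early-fusion fire/no-fire classifier that stacks a registered RGB frame and its thermal counterpart into a four-channel input. This is not the authors' released code; the model name (FireFusionNet), the layer sizes, and the assumption that the two frames are co-registered are illustrative only.

# Minimal sketch: early fusion of a registered RGB/thermal frame pair for
# binary fire / no-fire classification. All names (FireFusionNet, tensor
# shapes) are hypothetical; the dataset's layout and the paper's model differ.
import torch
import torch.nn as nn

class FireFusionNet(nn.Module):
    """Tiny CNN over a 4-channel (R, G, B, thermal) input."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)  # logits for {no-fire, fire}

    def forward(self, rgb, thermal):
        # rgb: (N, 3, H, W), thermal: (N, 1, H, W); assumed co-registered
        x = torch.cat([rgb, thermal], dim=1)
        x = self.features(x).flatten(1)
        return self.classifier(x)

if __name__ == "__main__":
    model = FireFusionNet()
    rgb = torch.rand(2, 3, 254, 254)      # placeholder RGB frames
    thermal = torch.rand(2, 1, 254, 254)  # placeholder thermal frames
    print(model(rgb, thermal).shape)      # -> torch.Size([2, 2])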
Award ID(s):
2120485
PAR ID:
10467303
Author(s) / Creator(s):
Publisher / Repository:
IEEE Access
Date Published:
Volume:
10
Page Range / eLocation ID:
121301-121317
Subject(s) / Keyword(s):
autonomous aerial vehicles; deep learning (artificial intelligence); forestry; geophysical image processing; image colour analysis; image fusion; infrared imaging; object detection; remote sensing; robot vision; video signal processing; wildfires
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Drone-based wildfire detection and modeling methods enable high-precision, real-time fire monitoring that is not provided by traditional remote fire monitoring systems such as satellite imaging. Precise, real-time information enables rapid, effective wildfire intervention and management strategies. Drone systems' ease of deployment, omnidirectional maneuverability, and robust sensing capabilities make them effective tools for early wildfire detection and evaluation, particularly in environments that are difficult for humans or terrestrial vehicles to reach. Development of emerging drone-based fire monitoring systems has been inhibited by a lack of well-annotated, high-quality aerial wildfire datasets, largely as a result of UAV flight regulations for prescribed burns and wildfires. The included dataset provides a collection of side-by-side infrared and visible-spectrum video pairs taken by drones during an open-canopy prescribed fire in Northern Arizona in 2021. Each frame pair has been assigned two binary labels by two independent classifiers (the labeling rule is sketched in the first example after this list). The Fire label is applied when the classifiers visually observe indications of fire in either the RGB or the IR frame of a pair. The Smoke label is applied when the classifiers visually estimate that at least 50% of the RGB frame is filled with smoke. To provide additional context to the main dataset's aerial imagery, the provided supplementary dataset includes weather information, the prescribed burn plan, a geo-referenced RGB point cloud of the pre-burn area, an RGB orthomosaic of the pre-burn area, and links to further information.
  2. Wildfires are among the deadliest and most dangerous natural disasters in the world. They burn vast areas of forest and put the lives of humans and animals in danger. Predicting fire behavior can help firefighters manage and schedule resources for future incidents, and it also reduces the risk to firefighters' lives. Recent advances in aerial imaging show that it can be beneficial in wildfire studies. Among the available aerial imaging technologies, unmanned aerial vehicles (UAVs) and drones are well suited to collecting information about a fire. This study provides an aerial imagery dataset collected by drones during a prescribed pile fire in Northern Arizona, USA. The dataset consists of several repositories, including raw aerial videos recorded by the drones' cameras and raw heatmap footage recorded by an infrared thermal camera. To help researchers, two well-known tasks, fire classification and fire segmentation, are defined on the dataset. For fire classification with approaches such as neural networks (NNs), 39,375 frames are labeled ("Fire" vs. "Non-Fire") for the training phase, and another 8,617 labeled frames are provided for testing (a minimal data-loading sketch for this split appears after this list). For fire segmentation, 2,003 frames are selected, and 2,003 corresponding ground-truth masks with pixel-wise annotation are generated.
  3. This data set contains the raw files from flight RU_ALN_TR1_FL007R. The remote sensing imagery was collected using uncrewed aerial vehicles at a series of fire perimeters in larch forests located in northeastern Siberia in 2018 and 2019. Images were collected using visible sensors (blue, green, and red wavelengths) and multispectral sensors (green, red, red-edge, and near-infrared wavelengths). The data were collected perpendicular to fire perimeter boundaries in order to characterize variation in vegetation composition and structure between burned and unburned forests, and as a function of distance from the unburned forest edge. The resulting images are co-located with field observations of ecosystem properties collected as part of this project and posted in a related data set (Alexander et al., 2018). Heather Alexander, Jennie DeMarco, Rebecca Hewitt, Jeremy Lichstein, Michael Loranty, et al. 2018. Fire influences on forest recovery and associated climate feedbacks in Siberian Larch Forests, Russia, June-July 2018. Arctic Data Center. urn:uuid:a5de1514-78d3-449f-aad1-2ff8f8d0fb27.
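As a concrete illustration of the two-label scheme described in record 1 above, the following minimal sketch encodes the stated rule: a frame pair receives the Fire label when fire is visible in either the RGB or the IR frame, and the Smoke label when at least 50% of the RGB frame is judged to be filled with smoke. The function and field names are hypothetical stand-ins for the human classifiers' judgments, not part of the dataset's tooling.

# Illustrative sketch of the labeling rule from record 1 above. The inputs
# stand in for the two human classifiers' visual judgments; they are not
# part of the released dataset or its tooling.
from dataclasses import dataclass

@dataclass
class FramePairLabels:
    fire: bool   # fire indications visible in either the RGB or IR frame
    smoke: bool  # at least 50% of the RGB frame visually filled with smoke

def label_frame_pair(fire_in_rgb: bool, fire_in_ir: bool,
                     rgb_smoke_fraction: float) -> FramePairLabels:
    """Apply the two binary labels to one RGB/IR frame pair."""
    return FramePairLabels(
        fire=fire_in_rgb or fire_in_ir,
        smoke=rgb_smoke_fraction >= 0.5,
    )

# Example: fire visible only in the IR frame, smoke covering ~30% of the RGB frame.
print(label_frame_pair(False, True, 0.3))  # FramePairLabels(fire=True, smoke=False)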
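Similarly, for the classification split described in record 2 (39,375 training frames and 8,617 test frames labeled "Fire" vs. "Non-Fire"), a minimal PyTorch-style loading sketch might look like the following. The directory layout with per-class subfolders is an assumption made for illustration; the dataset's actual organization may differ.

# Minimal sketch of loading a fire / no-fire classification split such as the
# one in record 2. The "Fire/" and "Non-Fire/" subfolders of per-split
# directories are assumed for illustration, not documented dataset structure.
from pathlib import Path
from PIL import Image
from torch.utils.data import Dataset

class FireFrameDataset(Dataset):
    """Binary fire-classification frames read from class-named subfolders."""
    CLASSES = {"Non-Fire": 0, "Fire": 1}

    def __init__(self, root: str, transform=None):
        self.transform = transform
        self.samples = [
            (path, label)
            for name, label in self.CLASSES.items()
            for path in sorted(Path(root, name).glob("*.jpg"))
        ]

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        path, label = self.samples[idx]
        image = Image.open(path).convert("RGB")
        if self.transform is not None:
            image = self.transform(image)
        return image, label

# Hypothetical usage:
# train_set = FireFrameDataset("frames/train")   # expected ~39,375 samples
# test_set  = FireFrameDataset("frames/test")    # expected ~8,617 samples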