Title: FLAME 3 - Radiometric Thermal UAV Imagery for Wildfire Management
FLAME 3 is the third dataset in the FLAME series of aerial UAV-collected side-by-side multi-spectral wildland fire imagery (see FLAME 1 and FLAME 2). This set contains a single-burn subset of the larger FLAME 3 dataset, focused specifically on computer vision tasks such as fire detection and segmentation. Included are 622 image quartets labeled Fire and 116 image quartets labeled No Fire. The No Fire images show the forestry surrounding the prescribed burn plot. Each image quartet is composed of four images: a raw RGB image, a raw thermal image, a corrected-FOV RGB image, and a radiometric thermal TIFF. Each of the four data types is detailed in Table 1 of the dataset documentation. More information on data collection methods, data processing procedures, and data labeling can be found at https://arxiv.org/abs/2412.02831.

This dataset also contains a NADIR Thermal Fire set, providing georeferenced overhead thermal imagery captured by UAV every 3-5 seconds and focused on monitoring fire progression and burn behavior over time. When processed, this data enables centimeter-grade measurements of fire spread and energy release over time. Pre-, post-, and during-burn imagery are included, along with ground control point (GCP) data.

This dataset is based on the research conducted in the paper "FLAME 3 Dataset: Unleashing the Power of Radiometric Thermal UAV Imagery for Wildfire Management" and provides detailed insights and analysis related to forest fire monitoring and modeling. If you use this dataset in your research or projects, please cite the original paper as follows:

APA: Hopkins, B., ONeill, L., Marinaccio, M., Rowell, E., Parsons, R., Flanary, S., Nazim, I., Seielstad, C., & Afghah, F. (2024). FLAME 3 Dataset: Unleashing the Power of Radiometric Thermal UAV Imagery for Wildfire Management. arXiv preprint arXiv:2412.02831.

BibTeX:
@misc{hopkins2024flame3datasetunleashing,
  title={FLAME 3 Dataset: Unleashing the Power of Radiometric Thermal UAV Imagery for Wildfire Management},
  author={Bryce Hopkins and Leo ONeill and Michael Marinaccio and Eric Rowell and Russell Parsons and Sarah Flanary and Irtija Nazim and Carl Seielstad and Fatemeh Afghah},
  year={2024},
  eprint={2412.02831},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2412.02831},
}
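For orientation, here is a minimal sketch of how one might load a FLAME 3 image quartet and read per-pixel values from the radiometric thermal TIFF. The directory layout, file names, and the assumption that the TIFF stores per-pixel temperature values are illustrative placeholders, not the dataset's documented structure.

```python
# Minimal sketch: load one FLAME 3 image quartet (paths and file names are assumptions,
# not the dataset's documented layout).
from pathlib import Path

import numpy as np
from PIL import Image
import tifffile  # handles high-bit-depth radiometric TIFFs


def load_quartet(quartet_dir: Path) -> dict:
    """Load the four co-collected images of a single FLAME 3 quartet."""
    raw_rgb   = np.asarray(Image.open(quartet_dir / "raw_rgb.jpg"))        # full-FOV RGB frame
    raw_therm = np.asarray(Image.open(quartet_dir / "raw_thermal.jpg"))    # visualized thermal frame
    corr_rgb  = np.asarray(Image.open(quartet_dir / "corrected_rgb.jpg"))  # RGB cropped to the thermal FOV
    therm_tif = tifffile.imread(quartet_dir / "thermal.tiff")              # radiometric thermal data
    return {"raw_rgb": raw_rgb, "raw_thermal": raw_therm,
            "corrected_rgb": corr_rgb, "thermal_tiff": therm_tif}


quartet = load_quartet(Path("Fire/quartet_0001"))
# If the TIFF stores per-pixel temperature (an assumption here), simple statistics
# such as the hottest pixel can be pulled directly from the array.
print("max thermal value:", quartet["thermal_tiff"].max())
```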
Award ID(s):
2204445
PAR ID:
10588422
Author(s) / Creator(s):
Publisher / Repository:
IEEE DataPort
Date Published:
Format(s):
Medium: X
Right(s):
Creative Commons Attribution 4.0 International
Sponsoring Org:
National Science Foundation
More Like this
  1. Drone-based wildfire detection and modeling methods enable high-precision, real-time fire monitoring that is not provided by traditional remote fire monitoring systems, such as satellite imaging. Precise, real-time information enables rapid, effective wildfire intervention and management strategies. Drone systems' ease of deployment, omnidirectional maneuverability, and robust sensing capabilities make them effective tools for early wildfire detection and evaluation, particularly in environments that are inconvenient for humans and/or terrestrial vehicles. Development of emerging drone-based fire monitoring systems has been inhibited by a lack of well-annotated, high-quality aerial wildfire datasets, largely as a result of UAV flight regulations for prescribed burns and wildfires. The included dataset provides a collection of side-by-side infrared and visible spectrum video pairs taken by drones during an open canopy prescribed fire in Northern Arizona in 2021. The frames have been annotated by two independent classifiers with two binary labels. The Fire label is applied when the classifiers visually observe indications of fire in either the RGB or IR frame of a frame pair. The Smoke label is applied when the classifiers visually estimate that at least 50% of the RGB frame is filled with smoke. To provide additional context to the main dataset's aerial imagery, the provided supplementary dataset includes weather information, the prescribed burn plan, a geo-referenced RGB point cloud of the preburn area, an RGB orthomosaic of the preburn area, and links to further information. 
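As a concrete illustration of the frame-pair labeling rule described in the abstract above, the sketch below turns two annotators' observations into Fire and Smoke labels. The data structures, and the choice to count a label if either annotator applies it, are assumptions for illustration rather than the dataset's released tooling.

```python
# Hypothetical sketch of the Fire/Smoke frame-pair labeling rule (not the dataset's actual tooling).
from dataclasses import dataclass


@dataclass
class Annotation:
    fire_in_rgb: bool       # annotator saw indications of fire in the RGB frame
    fire_in_ir: bool        # annotator saw indications of fire in the IR frame
    smoke_fraction: float   # annotator's estimate of the RGB frame covered by smoke (0.0-1.0)


def label_frame_pair(a1: Annotation, a2: Annotation) -> dict:
    """Fire if fire is seen in either modality; Smoke if at least 50% of the RGB frame is smoke."""
    fire = any(a.fire_in_rgb or a.fire_in_ir for a in (a1, a2))
    smoke = any(a.smoke_fraction >= 0.5 for a in (a1, a2))
    return {"Fire": fire, "Smoke": smoke}


print(label_frame_pair(Annotation(False, True, 0.2), Annotation(False, False, 0.6)))
# -> {'Fire': True, 'Smoke': True}
```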
  2. Current forest monitoring technologies, including satellite remote sensing, manned/piloted aircraft, and observation towers, leave uncertainties about a wildfire's extent, behavior, and conditions in the fire's near environment, particularly during its early growth. Rapid mapping and real-time fire monitoring can inform timely intervention or management solutions to maximize beneficial fire outcomes. Drone systems' unique features of 3D mobility, low flight altitude, and fast and easy deployment make them a valuable tool for early detection and assessment of wildland fires, especially in remote forests that are not easily accessible by ground vehicles. In addition, the lack of abundant, well-annotated aerial datasets – in part due to unmanned aerial vehicles' (UAVs') flight restrictions during prescribed burns and wildfires – has limited research advances in reliable data-driven fire detection and modeling techniques. While existing wildland fire datasets often include either color or thermal fire images, here we present (1) a multi-modal UAV-collected dataset of dual-feed side-by-side videos including both RGB and thermal images of a prescribed fire in an open canopy pine forest in Northern Arizona and (2) a deep learning-based methodology for detecting fire and smoke pixels with accuracy much higher than is achievable with the usual single-channel video feeds. The collected images are labeled as "fire" or "no-fire" frames by two human experts using side-by-side RGB and thermal images to determine the label. To provide context to the main dataset's aerial imagery, the included supplementary dataset provides a georeferenced pre-burn point cloud, an RGB orthomosaic, weather information, a burn plan, and other burn information. By using and expanding on this guide dataset, researchers can develop new data-driven fire detection, fire segmentation, and fire modeling techniques. 
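The abstract above describes a deep learning method that fuses RGB and thermal feeds; the sketch below shows one generic way a dual-input fire/no-fire classifier could be wired up in PyTorch. The architecture, layer sizes, and input resolution are placeholders and are not the network proposed in the paper.

```python
# Generic dual-branch RGB + thermal fire/no-fire classifier (illustrative, not the paper's model).
import torch
import torch.nn as nn


class DualStreamFireNet(nn.Module):
    def __init__(self):
        super().__init__()

        def branch(in_ch: int) -> nn.Sequential:
            # Small convolutional feature extractor, one per modality.
            return nn.Sequential(
                nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )

        self.rgb_branch = branch(3)       # 3-channel visible-spectrum input
        self.thermal_branch = branch(1)   # 1-channel thermal input
        self.head = nn.Linear(64, 2)      # concatenated features -> fire / no-fire logits

    def forward(self, rgb: torch.Tensor, thermal: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([self.rgb_branch(rgb), self.thermal_branch(thermal)], dim=1)
        return self.head(feats)


model = DualStreamFireNet()
logits = model(torch.randn(4, 3, 224, 224), torch.randn(4, 1, 224, 224))
print(logits.shape)  # torch.Size([4, 2])
```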
  3. Wildfires are among the deadliest and most dangerous natural disasters in the world. They burn vast areas of forest and endanger the lives of humans and animals. Predicting fire behavior can help firefighters manage and schedule responses to future incidents, and it also reduces the risk to their lives. Recent advances in aerial imaging show that it can be beneficial in wildfire studies. Among the different methods and technologies for aerial imaging, Unmanned Aerial Vehicles (UAVs) and drones are well suited to collecting information about a fire. This study provides an aerial imagery dataset collected using drones during a prescribed pile fire in Northern Arizona, USA. The dataset consists of several repositories, including raw aerial videos recorded by drone cameras and raw heatmap footage recorded by an infrared thermal camera. To help researchers, two well-known tasks, fire classification and fire segmentation, are defined on the dataset. For approaches such as Neural Networks (NNs) applied to fire classification, 39,375 frames are labeled ("Fire" vs "Non-Fire") for the training phase, and another 8,617 labeled frames are provided as test data. For fire segmentation, 2,003 frames are provided along with 2,003 corresponding pixel-wise ground-truth masks. 
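For the segmentation task described above, the sketch below pairs a frame with its pixel-wise ground-truth mask and scores a predicted mask with fire-class IoU. The paths, file naming, and mask encoding are assumptions for illustration, not the dataset's documented layout.

```python
# Illustrative sketch: pair segmentation frames with ground-truth masks and compute fire IoU.
# Directory names and file naming are assumptions, not the dataset's documented layout.
from pathlib import Path

import numpy as np
from PIL import Image


def load_pair(frames_dir: Path, masks_dir: Path, name: str):
    frame = np.asarray(Image.open(frames_dir / f"{name}.jpg"))
    mask = np.asarray(Image.open(masks_dir / f"{name}.png")) > 0  # boolean fire mask
    return frame, mask


def fire_iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """Intersection-over-union of the fire class for two boolean masks."""
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return float(inter) / float(union) if union else 1.0


frame, truth = load_pair(Path("Segmentation/Images"), Path("Segmentation/Masks"), "frame_000001")
pred = np.zeros_like(truth)  # placeholder prediction from some segmentation model
print("fire IoU:", fire_iou(pred, truth))
```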
  4. Circum-boreal and -tundra systems are crucial carbon pools that are experiencing amplified warming and are at risk of increasing wildfire activity. Changes in wildfire activity have broad implications for vegetation dynamics, underlying permafrost soils, and ultimately, carbon cycling. However, understanding wildfire effects on biophysical processes across eastern Siberian taiga and tundra remains challenging because of the lack of an easily accessible annual fire perimeter database and underestimation of area burned by MODIS satellite imagery. To better understand wildfire dynamics over the last 20 years in this region, we mapped area burned, generated a fire perimeter database, and characterized fire regimes across eight ecozones spanning 7.8 million km² of eastern Siberian taiga and tundra from ∼61–72.5° N and 100° E–176° W using long-term satellite data from Landsat, processed via Google Earth Engine. We generated composite images for the annual growing season (May–September), which allowed mitigation of missing data from snow cover, cloud cover, and the Landsat 7 scan line error. We used annual composites to calculate the differenced Normalized Burn Ratio (dNBR) for each year. The annual dNBR images were converted to binary burned or unburned imagery that was used to vectorize fire perimeters. We mapped 22,091 fires burning 152 million hectares (Mha) over 20 years. Although 2003 was the largest fire year on record, 2020 was an exceptional fire year for four of the northeastern ecozones, resulting in substantial increases in fire activity above the Arctic Circle. Increases in fire extent, severity, and frequency with continued climate warming will impact vegetation and permafrost dynamics, with increased likelihood of irreversible permafrost thaw that leads to increased carbon release and/or conversion of forest to shrublands. 
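To make the dNBR step above concrete, the sketch below shows the standard NBR/dNBR arithmetic on NIR and SWIR2 reflectance arrays and a simple threshold to a binary burned/unburned mask. The synthetic band arrays and the 0.1 threshold are illustrative assumptions; the study's actual processing was performed in Google Earth Engine with its own calibrated thresholds.

```python
# Illustrative NBR / dNBR computation on reflectance arrays (a numpy stand-in for the
# study's Google Earth Engine workflow; the 0.1 burned/unburned threshold is an
# assumed placeholder, not the paper's calibrated value).
import numpy as np


def nbr(nir: np.ndarray, swir2: np.ndarray) -> np.ndarray:
    """Normalized Burn Ratio: (NIR - SWIR2) / (NIR + SWIR2)."""
    return (nir - swir2) / (nir + swir2 + 1e-9)  # small epsilon avoids division by zero


# Annual growing-season composites for consecutive years (synthetic data for illustration).
rng = np.random.default_rng(0)
nir_pre, swir2_pre = rng.uniform(0.2, 0.5, (100, 100)), rng.uniform(0.05, 0.2, (100, 100))
nir_post, swir2_post = rng.uniform(0.1, 0.4, (100, 100)), rng.uniform(0.1, 0.3, (100, 100))

dnbr = nbr(nir_pre, swir2_pre) - nbr(nir_post, swir2_post)  # pre-fire minus post-fire NBR
burned = dnbr > 0.1  # binary burned/unburned image, ready to be vectorized into perimeters
print("burned pixels:", int(burned.sum()))
```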
  5. The paper is under review and is available on arXiv: https://arxiv.org/abs/2104.05957