Title: Using Object-Oriented Classification for Coastal Management in the East Central Coast of Florida: A Quantitative Comparison between UAV, Satellite, and Aerial Data
High-resolution mapping of coastal habitats is invaluable for resource inventory, change detection, and aquaculture applications. However, coastal areas, especially the interior of mangroves, are often difficult to access. An Unmanned Aerial Vehicle (UAV) equipped with a multispectral sensor offers an opportunity to improve upon satellite imagery for coastal management because of its very high spatial resolution, multispectral capability, and ability to collect real-time observations. Despite the recent and rapid development of UAV mapping applications, few articles have quantitatively compared the improvement UAV multispectral mapping methods offer over more conventional remote sensing data such as satellite imagery. The objective of this paper is to quantitatively demonstrate the improvements that a higher-resolution multispectral UAV mapping technique provides for mapping and assessing coastal land cover. We performed multispectral UAV mapping fieldwork trials over the Indian River Lagoon along the central Atlantic coast of Florida. Ground Control Points (GCPs) were collected to generate a rigorously geo-referenced dataset of UAV imagery and to support comparison with geo-referenced satellite and aerial imagery. Multispectral satellite imagery (Sentinel-2) was also acquired to map land cover for the same region. NDVI and object-oriented classification methods were used to compare UAV and satellite mapping capabilities. Compared with aerial images acquired from the Florida Department of Environmental Protection, the UAV multispectral mapping method used in this study provided more detailed information on the physical conditions of the study area, improved land feature delineation, and a significantly better mapping product than coarser-resolution satellite imagery. The study demonstrates a replicable UAV multispectral mapping method useful for study sites that lack high-quality data.
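The abstract names NDVI as one of the comparison methods. As a minimal sketch of that step, the following assumes the red and near-infrared bands are available as separate GeoTIFF files (the file names and band layout are hypothetical) and uses rasterio and NumPy; it is illustrative, not the authors' processing chain.

```python
# Minimal NDVI sketch for a multispectral raster (UAV or Sentinel-2).
# File names and band order are hypothetical; adapt to the actual sensor layout.
import numpy as np
import rasterio

def compute_ndvi(red_path, nir_path, out_path):
    """Compute NDVI = (NIR - Red) / (NIR + Red) and write it as a single-band GeoTIFF."""
    with rasterio.open(red_path) as red_src, rasterio.open(nir_path) as nir_src:
        red = red_src.read(1).astype("float32")
        nir = nir_src.read(1).astype("float32")
        profile = red_src.profile

    # Avoid division by zero where both bands are zero (e.g., nodata areas).
    denom = nir + red
    ndvi = np.where(denom == 0, np.nan, (nir - red) / denom)

    profile.update(dtype="float32", count=1, nodata=np.nan)
    with rasterio.open(out_path, "w", **profile) as dst:
        dst.write(ndvi.astype("float32"), 1)
    return ndvi

# Example (hypothetical file names):
# ndvi = compute_ndvi("uav_red.tif", "uav_nir.tif", "uav_ndvi.tif")
```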
Award ID(s):
1829890
PAR ID:
10111963
Author(s) / Creator(s):
Date Published:
Journal Name:
Drones
Volume:
3
Issue:
3
ISSN:
2504-446X
Page Range / eLocation ID:
60
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. State-of-the-art deep learning technology has been successfully applied to relatively small selected areas of very high spatial resolution (0.15 and 0.25 m) optical aerial imagery acquired by a fixed-wing aircraft to automatically characterize ice-wedge polygons (IWPs) in the Arctic tundra. However, any mapping of IWPs at regional to continental scales requires images acquired on different sensor platforms (particularly satellite) and a refined understanding of the performance stability of the method across sensor platforms through reliable evaluation assessments. In this study, we examined the transferability of a deep learning Mask Region-Based Convolutional Neural Network (Mask R-CNN) model for mapping IWPs in satellite remote sensing imagery (~0.5 m) covering 272 km2 and unmanned aerial vehicle (UAV) imagery (0.02 m) covering 0.32 km2. Multispectral images were obtained from the WorldView-2 satellite sensor and pan-sharpened to ~0.5 m, and from a 20 MP CMOS sensor camera onboard a UAV, respectively. The training dataset included 25,489 and 6022 manually delineated IWPs from satellite and fixed-wing aircraft aerial imagery near the Arctic Coastal Plain, northern Alaska. Quantitative assessments showed that individual IWPs were correctly detected at up to 72% and 70%, and delineated at up to 73% and 68%, F1 score accuracy levels for satellite and UAV images, respectively. Expert-based qualitative assessments showed that IWPs were correctly detected at good (40–60%) and excellent (80–100%) accuracy levels for satellite and UAV images, respectively, and delineated at the excellent (80–100%) level for both. We found that (1) regardless of spatial resolution and spectral bands, the deep learning Mask R-CNN model effectively mapped IWPs in both satellite and UAV images; (2) the model achieved better detection accuracy with finer image resolution, such as UAV imagery, yet better delineation accuracy with coarser image resolution, such as satellite imagery; (3) increasing the amount of training data with resolutions that differ between the training and application imagery does not necessarily improve the performance of Mask R-CNN in IWP mapping; and (4) overall, the model underestimates the total number of IWPs, particularly disjoint/incomplete IWPs.
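The detection and delineation accuracies reported above are object-level F1 scores. The sketch below is an illustrative implementation, not the authors' exact protocol: it greedily matches predicted instance masks to reference masks by intersection-over-union and computes precision, recall, and F1; the 0.5 IoU threshold and the binary-mask representation are assumptions.

```python
# Hedged sketch: object-level F1 from binary instance masks via greedy IoU matching.
# The 0.5 IoU cutoff and mask format are assumptions, not the paper's evaluation code.
import numpy as np

def mask_iou(a, b):
    """Intersection-over-union of two boolean masks."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 0.0

def detection_f1(pred_masks, ref_masks, iou_thresh=0.5):
    """Greedy one-to-one matching of predicted to reference instances."""
    matched_refs = set()
    tp = 0
    for pm in pred_masks:
        best_iou, best_j = 0.0, None
        for j, rm in enumerate(ref_masks):
            if j in matched_refs:
                continue
            iou = mask_iou(pm, rm)
            if iou > best_iou:
                best_iou, best_j = iou, j
        if best_j is not None and best_iou >= iou_thresh:
            matched_refs.add(best_j)
            tp += 1
    fp = len(pred_masks) - tp
    fn = len(ref_masks) - tp
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1
```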
  2. Arctic landscapes are rapidly changing with climate warming. Vegetation communities are restructuring, which in turn impacts wildlife, permafrost, carbon cycling and climate feedbacks. Accurately monitoring vegetation change is thus crucial, but notable mismatches in scale occur between current field and satellite-based monitoring. Remote sensing from unmanned aerial vehicles (UAVs) has emerged as a bridge between field data and satellite imagery mapping. In this work we assess the viability of using high-resolution UAV imagery (RGB and multispectral), along with UAV-derived Structure from Motion (SfM), to predict cover, height and above-ground biomass of common Arctic plant functional types (PFTs) across a wide range of vegetation community types. We collected field data and UAV imagery from 45 sites across Alaska and northwest Canada. We then classified UAV imagery by PFT, estimated cover and height, and modeled biomass from UAV-derived volume estimates. Here we present datasets summarizing these data.
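One generic way to derive a volume estimate from SfM products (a sketch under assumptions, not necessarily this dataset's workflow; the file names are hypothetical) is to difference a digital surface model and a digital terrain model into a canopy height model and sum it over pixel area:

```python
# Hedged sketch: per-site vegetation volume from SfM-derived surface and terrain models.
# Assumes co-registered DSM/DTM GeoTIFFs in a projected CRS with metric units (hypothetical names).
import numpy as np
import rasterio

def sfm_volume(dsm_path, dtm_path):
    """Return (canopy height model array, total volume in m^3)."""
    with rasterio.open(dsm_path) as dsm_src, rasterio.open(dtm_path) as dtm_src:
        dsm = dsm_src.read(1, masked=True).astype("float64")
        dtm = dtm_src.read(1, masked=True).astype("float64")
        # Pixel area from the affine transform (assumes square, metric pixels).
        px_area = abs(dsm_src.transform.a * dsm_src.transform.e)

    chm = np.maximum(dsm - dtm, 0)               # treat negative heights as ground
    volume_m3 = float(np.sum(chm.filled(0)) * px_area)
    return chm, volume_m3

# Example (hypothetical): chm, vol = sfm_volume("site01_dsm.tif", "site01_dtm.tif")
```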
  3. Arctic vegetation communities are rapidly changing with climate warming, which impacts wildlife, carbon cycling and climate feedbacks. Accurately monitoring vegetation change is thus crucial, but scale mismatches between field and satellite-based monitoring cause challenges. Remote sensing from unmanned aerial vehicles (UAVs) has emerged as a bridge between field data and satellite-based mapping. We assess the viability of using high-resolution UAV imagery and UAV-derived Structure from Motion (SfM) to predict cover, height and aboveground biomass (henceforth biomass) of Arctic plant functional types (PFTs) across a range of vegetation community types. We classified imagery by PFT, estimated cover and height, and modeled biomass from UAV-derived volume estimates. Predicted values were compared to field estimates to assess results. Cover was estimated with a root-mean-square error (RMSE) of 6.29-14.2% and height with an RMSE of 3.29-10.5 cm, depending on the PFT. Total aboveground biomass was predicted with an RMSE of 220.5 g m-2, and per-PFT RMSE ranged from 17.14 to 164.3 g m-2. Deciduous and evergreen shrub biomass was predicted most accurately, followed by lichen, graminoid, and forb biomass. Our results demonstrate the effectiveness of using UAVs to map PFT biomass, which provides a link towards improved mapping of PFTs across large areas using earth observation satellite imagery.
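The RMSE values above summarize how closely UAV-derived predictions match field estimates. A minimal sketch of that comparison follows; the input arrays are placeholders and the paper's exact per-PFT aggregation is not reproduced.

```python
# Minimal RMSE sketch for comparing UAV-predicted values to field estimates.
# Input values below are made up for illustration only.
import numpy as np

def rmse(predicted, observed):
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

# Example with hypothetical shrub biomass values (g m^-2):
pred = [210.0, 480.5, 95.2]
obs = [250.0, 450.0, 120.0]
print(f"RMSE = {rmse(pred, obs):.1f} g m^-2")
```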
  4. Over the last century, direct human modification has been a major driver of coastal wetland degradation, resulting in widespread losses of wetland vegetation and a transition to open water. High-resolution satellite imagery is widely available for monitoring changes in present-day wetlands; however, understanding the rates of wetland vegetation loss over the last century depends on the use of historical panchromatic aerial photographs. In this study, we compared manual image thresholding and an automated machine learning (ML) method in detecting wetland vegetation and open water from historical panchromatic photographs in the Florida Everglades, a subtropical wetland landscape. We compared the same classes delineated in the historical photographs to 2012 multispectral satellite imagery and assessed the accuracy of detecting vegetation loss over a 72-year timescale (1940 to 2012) for a range of minimum mapping units (MMUs). Overall, classification accuracies were >95% across the historical photographs and satellite imagery, regardless of the classification method and MMU. We detected a 2.3–2.7 ha increase in open water pixels across all change maps (overall accuracies >95%). Our analysis demonstrated that ML classification methods can be used to delineate wetland vegetation from open water in low-quality, panchromatic aerial photographs and that a combination of images with different resolutions is compatible with change detection. The study also highlights how evaluating a range of MMUs can identify the effect of scale on detection accuracy and change class estimates, and can help determine the most relevant scale of analysis for the process of interest.
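As an illustration of the image-thresholding route mentioned above (a generic sketch, not the study's parameterization; the file name is hypothetical), a global Otsu threshold can separate dark open water from brighter vegetation in a single-band photograph:

```python
# Hedged sketch: global Otsu threshold on a panchromatic aerial photograph
# to separate open water from wetland vegetation.
# Whether water is darker than vegetation depends on the scene; check and invert if needed.
import numpy as np
from skimage import io
from skimage.filters import threshold_otsu

def threshold_water(photo_path):
    img = io.imread(photo_path, as_gray=True)   # single-band grayscale array
    t = threshold_otsu(img)
    water_mask = img < t                        # assumption: water appears darker
    vegetation_mask = ~water_mask
    return water_mask, vegetation_mask, t

# Example (hypothetical file): water, veg, t = threshold_water("everglades_1940.tif")
```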
  5. In September of 2017, Hurricane Irma made landfall within the Rookery Bay National Estuarine Research Reserve of southwest Florida (USA) as a category 3 storm with winds in excess of 200 km h−1. We mapped the extent of the hurricane’s impact on coastal land cover with a seasonal time series of satellite imagery. Very high-resolution (i.e., <5 m pixel) satellite imagery has proven effective to map wetland ecosystems, but challenges in data acquisition and storage, algorithm training, and image processing have prevented large-scale and time-series mapping of these data. We describe our approach to address these issues to evaluate Rookery Bay ecosystem damage and recovery using 91 WorldView-2 satellite images collected between 2010 and 2018, mapped using automated techniques and validated with a field campaign. Land cover was classified seasonally at 2 m resolution (i.e., healthy mangrove, degraded mangrove, upland, soil, and water) with an overall accuracy of 82%. Digital change detection methods show that hurricane-related degradation was 17% of mangrove forest (~5 km2). Approximately 35% (1.7 km2) of this loss recovered one year after Hurricane Irma. The approach completed the mapping approximately 200 times faster than existing methods, illustrating the ease with which regional high-resolution mapping may be accomplished efficiently.
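A minimal sketch of the post-classification change step implied above follows; the class codes and input arrays are hypothetical (only the 2 m resolution comes from the description), and this is not the authors' pipeline. It counts pixels that move from healthy to degraded mangrove between two dates and converts the count to area.

```python
# Hedged sketch: post-classification change detection between two co-registered class maps.
# Class codes are hypothetical; the 2 m pixel size follows the description above.
import numpy as np

HEALTHY_MANGROVE, DEGRADED_MANGROVE = 1, 2   # hypothetical class codes
PIXEL_AREA_M2 = 2.0 * 2.0                    # 2 m resolution

def mangrove_degradation_area(before, after):
    """Area (km^2) classified as healthy mangrove before and degraded mangrove after."""
    before = np.asarray(before)
    after = np.asarray(after)
    changed = (before == HEALTHY_MANGROVE) & (after == DEGRADED_MANGROVE)
    return changed.sum() * PIXEL_AREA_M2 / 1e6

# Example with tiny made-up maps:
pre = np.array([[1, 1], [2, 3]])
post = np.array([[2, 1], [2, 3]])
print(mangrove_degradation_area(pre, post), "km^2")
```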