

Title: Multisensor UAS mapping of Plant Species and Plant Functional Types in Midwestern Grasslands
Uncrewed aerial systems (UASs) have emerged as powerful ecological observation platforms capable of filling critical spatial and spectral observation gaps in plant physiological and phenological traits that have been difficult to measure from space-borne sensors. Despite recent technological advances, the high cost of drone-borne sensors limits the widespread application of UAS technology across scientific disciplines. Here, we evaluate the tradeoffs between off-the-shelf and sophisticated drone-borne sensors for mapping plant species and plant functional types (PFTs) within a diverse grassland. Specifically, we compared species and PFT mapping accuracies derived from hyperspectral, multispectral, and RGB imagery fused with light detection and ranging (LiDAR) or structure-from-motion (SfM)-derived canopy height models (CHMs). Sensor–data fusions were evaluated using either a single observation period or near-monthly observation frequencies that integrate phenological information (i.e., phenometrics). Results indicate that overall classification accuracies for plant species and PFTs were highest for hyperspectral and LiDAR-CHM fusions (78% and 89%, respectively), followed by multispectral and phenometric–SfM–CHM fusions (52% and 60%, respectively) and RGB and SfM–CHM fusions (45% and 47%, respectively). Our findings demonstrate clear tradeoffs in mapping accuracies between economical and high-cost sensor configurations but highlight that off-the-shelf multispectral sensors may achieve accuracies comparable to those of sophisticated UAS sensors by integrating phenometrics into machine learning image classifiers.
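As a rough illustration of the fusion approach described in the abstract, the sketch below stacks per-pixel multispectral bands, phenometrics, and canopy height into one feature table and trains a Random Forest classifier, one common choice of machine learning image classifier. The feature names, array shapes, random placeholder data, and use of scikit-learn are assumptions for illustration only, not the authors' actual pipeline.

```python
# Hedged sketch: per-pixel sensor-data fusion features -> Random Forest classifier.
# All data here are random placeholders; only the structure of the workflow matters.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_pixels = 2000

X = np.hstack([
    rng.random((n_pixels, 5)),        # 5 multispectral reflectance bands (placeholder)
    rng.random((n_pixels, 3)),        # 3 phenometrics, e.g. green-up date, peak NDVI, senescence date (assumed)
    rng.random((n_pixels, 1)) * 2.0,  # canopy height model value in metres (placeholder)
])
y = rng.integers(0, 6, size=n_pixels)  # 6 hypothetical species / PFT labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_train, y_train)
print("overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

With real imagery, the same feature table would be built by flattening co-registered rasters, either from a single observation date or from a multi-date phenometric stack.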
Award ID(s):
1928048
NSF-PAR ID:
10388607
Author(s) / Creator(s):
Date Published:
Journal Name:
Remote Sensing
Volume:
14
Issue:
14
ISSN:
2072-4292
Page Range / eLocation ID:
3453
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Arctic landscapes are rapidly changing with climate warming. Vegetation communities are restructuring, which in turn impacts wildlife, permafrost, carbon cycling and climate feedbacks. Accurately monitoring vegetation change is thus crucial, but notable mismatches in scale occur between current field and satellite-based monitoring. Remote sensing from unmanned aerial vehicles (UAVs) has emerged as a bridge between field data and satellite imagery mapping. In this work we assess the viability of using high-resolution UAV imagery (RGB and multispectral), along with UAV-derived Structure from Motion (SfM), to predict cover, height and above-ground biomass of common Arctic plant functional types (PFTs) across a wide range of vegetation community types. We collected field data and UAV imagery from 45 sites across Alaska and northwest Canada. We then classified UAV imagery by PFT, estimated cover and height, and modeled biomass from UAV-derived volume estimates. Here we present datasets summarizing these data.
  2. Mapping invasive vegetation species in arid regions is a critical task for managing water resources and understanding threats to ecosystem services. Traditional remote sensing platforms, such as Landsat and MODIS, are ill-suited for distinguishing native and non-native vegetation species in arid regions due to their large pixels compared to plant sizes. Unmanned aircraft systems, or UAS, offer the potential to capture the high spatial resolution imagery needed to differentiate species. However, in order to extract the most benefit from these platforms, there is a need to develop more efficient and effective workflows. This paper presents an integrated spectral–structural workflow for classifying invasive vegetation species in the Lower Salt River region of Arizona, which has been the site of fires and flooding, leading to a proliferation of invasive vegetation species. Visible (RGB) and multispectral images were captured and processed following a typical structure from motion workflow, and the derived datasets were used as inputs in two machine learning classifications: one incorporating only spectral information and one utilizing both spectral data and structural layers (e.g., digital terrain model (DTM) and canopy height model (CHM)). Results show that including structural layers in the classification improved overall accuracy from 80% to 93% compared to the spectral-only model. The most important features for classification were the CHM and DTM, with the blue band and two spectral indices (normalized difference water index (NDWI) and normalized difference salinity index (NDSI)) contributing important spectral information to both models.
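A minimal sketch of the spectral–structural stacking step, assuming co-registered NumPy rasters: spectral indices are computed from the multispectral bands and combined with the DTM and CHM into per-pixel feature tables for the two classifications. The band names, raster sizes, and the exact NDWI/NDSI formulations are illustrative assumptions, not the paper's actual inputs.

```python
# Hedged sketch: build spectral-only and spectral + structural feature stacks.
import numpy as np

rng = np.random.default_rng(1)
rows, cols = 100, 100

# Placeholder co-registered rasters (reflectance in 0-1, elevations/heights in metres)
blue, green, nir = (rng.random((rows, cols)) for _ in range(3))
dtm = rng.random((rows, cols)) * 400.0  # digital terrain model
chm = rng.random((rows, cols)) * 10.0   # canopy height model

# One common NDWI formulation: (green - NIR) / (green + NIR); the NDSI band
# pairing below is an assumption for illustration.
ndwi = (green - nir) / (green + nir + 1e-9)
ndsi = (blue - nir) / (blue + nir + 1e-9)

spectral_only = np.stack([blue, green, nir, ndwi, ndsi], axis=-1).reshape(-1, 5)
spectral_structural = np.stack([blue, green, nir, ndwi, ndsi, dtm, chm], axis=-1).reshape(-1, 7)
print(spectral_only.shape, spectral_structural.shape)  # (10000, 5) (10000, 7)
```

Each table would then feed its corresponding classification, allowing the spectral-only and spectral + structural models to be compared directly.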
  3. Accurately mapping tree species composition and diversity is a critical step towards spatially explicit and species-specific ecological understanding. The National Ecological Observatory Network (NEON) is a valuable source of open ecological data across the United States. Freely available NEON data include in-situ measurements of individual trees, including stem locations, species, and crown diameter, along with the NEON Airborne Observation Platform (AOP) airborne remote sensing imagery, including hyperspectral, multispectral, and light detection and ranging (LiDAR) data products. An important aspect of predicting species using remote sensing data is creating high-quality training sets for optimal classification purposes. Ultimately, manually creating training data is an expensive and time-consuming task that relies on human analyst decisions and may require external data sets or information. We combine in-situ and airborne remote sensing NEON data to evaluate the impact of automated training set preparation and a novel data preprocessing workflow on classifying the four dominant subalpine coniferous tree species at the Niwot Ridge Mountain Research Station forested NEON site in Colorado, USA. We trained pixel-based Random Forest (RF) machine learning models using a series of training data sets along with remote sensing raster data as descriptive features. The highest classification accuracies, 69% and 60% based on internal RF error assessment and an independent validation set, respectively, were obtained using circular tree crown polygons created with half the maximum crown diameter per tree. LiDAR-derived data products were the most important features for species classification, followed by vegetation indices. This work contributes to the open development of well-labeled training data sets for forest composition mapping using openly available NEON data without requiring external data collection, manual delineation steps, or site-specific parameters.
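The automated training-set preparation described above can be sketched as a simple buffering step: each mapped stem is buffered by half its maximum crown diameter to produce a circular crown polygon for pixel sampling. The species codes, coordinates, column names, and use of geopandas/shapely below are hypothetical placeholders, not NEON's actual field schema.

```python
# Hedged sketch: circular crown polygons from stem points and crown diameters.
import geopandas as gpd
from shapely.geometry import Point

# Hypothetical in-situ tree records: projected stem coordinates (m),
# species label, and maximum crown diameter (m)
trees = gpd.GeoDataFrame(
    {
        "species": ["PIEN", "ABLA", "PIFL", "PICO"],
        "max_crown_diameter_m": [3.2, 2.5, 4.1, 3.0],
    },
    geometry=[Point(453100, 4431050), Point(453120, 4431070),
              Point(453140, 4431055), Point(453160, 4431080)],
    crs="EPSG:32613",
)

# Buffer each stem by half the maximum crown diameter to get a circular
# crown polygon usable as a training region for pixel-based classification
trees["crown"] = [
    stem.buffer(diameter / 2.0)
    for stem, diameter in zip(trees.geometry, trees["max_crown_diameter_m"])
]
training_polygons = trees.set_geometry("crown")
print(training_polygons[["species", "crown"]])
```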
  4. Airborne remote sensing offers unprecedented opportunities to efficiently monitor vegetation, but methods to delineate and classify individual plant species using the collected data are still actively being developed and improved. The Integrating Data science with Trees and Remote Sensing (IDTReeS) plant identification competition openly invited scientists to create and compare individual tree mapping methods. Participants were tasked with training taxon identification algorithms on two sites and then transferring their methods to a third, unseen site, using field-based plant observations in combination with airborne remote sensing image data products from the National Ecological Observatory Network (NEON). These data were captured by a high-resolution digital camera sensitive to red, green and blue (RGB) light, a hyperspectral imaging spectrometer spanning the visible to shortwave infrared wavelengths, and lidar systems to capture the spectral and structural properties of vegetation. As participants in the IDTReeS competition, we developed a two-stage deep learning approach to integrate NEON remote sensing data from all three sensors and classify individual plant species and genera. The first stage was a convolutional neural network that generates taxon probabilities from RGB images, and the second stage was a fusion neural network that "learns" how to combine these probabilities with hyperspectral and lidar data. Our two-stage approach leverages the ability of neural networks to flexibly and automatically extract descriptive features from complex image data with high dimensionality. Our method achieved an overall classification accuracy of 0.51 based on the training set, and 0.32 based on the test set, which contained data from an unseen site with unknown taxa classes. Although transferability of classification algorithms to unseen sites with unknown species and genus classes proved to be a challenging task, developing methods with openly available NEON data that will be collected in a standardized format for 30 years allows for continual improvements and major gains for members of the computational ecology community. We outline promising directions related to data preparation and processing techniques for further investigation, and provide our code to contribute to open reproducible science efforts.
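A minimal sketch of a two-stage design in the spirit of the approach described above, assuming PyTorch: a small CNN produces taxon probabilities from RGB patches, and a fusion network combines those probabilities with hyperspectral and lidar summary features. The layer sizes, feature counts, and number of taxa are placeholders, not the competition entry's actual architecture.

```python
# Hedged sketch: stage 1 (RGB CNN -> taxon probabilities), stage 2 (fusion MLP).
import torch
import torch.nn as nn

N_TAXA = 10  # placeholder number of species/genus classes

class RGBStage(nn.Module):
    def __init__(self, n_taxa=N_TAXA):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_taxa)

    def forward(self, rgb):                        # rgb: (B, 3, H, W)
        x = self.features(rgb).flatten(1)
        return torch.softmax(self.head(x), dim=1)  # taxon probabilities

class FusionStage(nn.Module):
    def __init__(self, n_taxa=N_TAXA, n_hsi=20, n_lidar=5):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(n_taxa + n_hsi + n_lidar, 64), nn.ReLU(),
            nn.Linear(64, n_taxa),
        )

    def forward(self, rgb_probs, hsi, lidar):
        # Concatenate stage-1 probabilities with hyperspectral and lidar features
        return self.mlp(torch.cat([rgb_probs, hsi, lidar], dim=1))  # class logits

# Toy forward pass with random tensors standing in for real crown data
rgb = torch.rand(4, 3, 32, 32)
hsi = torch.rand(4, 20)   # e.g., hyperspectral band summaries per crown (assumed)
lidar = torch.rand(4, 5)  # e.g., height percentiles per crown (assumed)
logits = FusionStage()(RGBStage()(rgb), hsi, lidar)
print(logits.shape)       # torch.Size([4, 10])
```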
  5. Arctic vegetation communities are rapidly changing with climate warming, which impacts wildlife, carbon cycling and climate feedbacks. Accurately monitoring vegetation change is thus crucial, but scale mismatches between field and satellite-based monitoring cause challenges. Remote sensing from unmanned aerial vehicles (UAVs) has emerged as a bridge between field data and satellite-based mapping. We assess the viability of using high-resolution UAV imagery and UAV-derived Structure from Motion (SfM) to predict cover, height and aboveground biomass (henceforth biomass) of Arctic plant functional types (PFTs) across a range of vegetation community types. We classified imagery by PFT, estimated cover and height, and modeled biomass from UAV-derived volume estimates. Predicted values were compared to field estimates to assess results. Cover was estimated with a root-mean-square error (RMSE) of 6.29-14.2% and height with an RMSE of 3.29-10.5 cm, depending on the PFT. Total aboveground biomass was predicted with an RMSE of 220.5 g m⁻², and per-PFT RMSE ranged from 17.14 to 164.3 g m⁻². Deciduous and evergreen shrub biomass was predicted most accurately, followed by lichen, graminoid, and forb biomass. Our results demonstrate the effectiveness of using UAVs to map PFT biomass, which provides a link towards improved mapping of PFTs across large areas using earth observation satellite imagery.
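The volume-to-biomass modelling step can be sketched as a simple calibration regression, assuming scikit-learn and made-up plot values: UAV-derived canopy volume is regressed against field-measured biomass for a single PFT, and the fitted model then predicts biomass elsewhere. Both the linear form and the numbers are illustrative assumptions only.

```python
# Hedged sketch: calibrate a volume-to-biomass model for one PFT.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical calibration plots: UAV-derived canopy volume (m^3 m^-2)
# and field-measured aboveground biomass (g m^-2)
volume = np.array([[0.02], [0.05], [0.08], [0.12], [0.18], [0.25]])
biomass = np.array([40.0, 95.0, 160.0, 230.0, 350.0, 480.0])

model = LinearRegression().fit(volume, biomass)
pred = model.predict(volume)
rmse = np.sqrt(np.mean((biomass - pred) ** 2))  # RMSE against the calibration data
print(f"slope={model.coef_[0]:.1f} g m^-2 per unit volume, RMSE={rmse:.1f} g m^-2")
```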