The use of small unmanned aerial system (UAS)-based structure-from-motion (SfM; photogrammetry) and LiDAR point clouds has been widely discussed in the remote sensing community. Here, we compared multiple aspects of SfM and LiDAR point clouds, collected concurrently during five UAS flights over experimental fields of a short crop (snap bean), in order to explore how well the SfM approach performs compared with LiDAR for crop phenotyping. The main methods include calculating cloud-to-mesh (C2M) distance maps between the preprocessed point clouds, as well as computing multiscale model-to-model cloud comparison (M3C2) distance maps between the derived digital elevation models (DEMs) and crop height models (CHMs). We also evaluated crop height and row width from the CHMs and compared them with field measurements for one of the data sets. Both SfM and LiDAR point clouds achieved an average RMSE of ~0.02 m for crop height and an average RMSE of ~0.05 m for row width. The qualitative and quantitative analyses showed that the SfM approach is comparable to LiDAR under the same UAS flight settings; however, its altimetric accuracy largely relied on the number and distribution of ground control points.
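As a rough illustration of the height-evaluation step, the sketch below (Python, NumPy only) derives a crop height model the common way, as the per-cell difference between a digital surface model and a digital elevation model, and computes an RMSE against field measurements. The arrays, placeholder values, and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def crop_height_model(dsm, dem):
    """Illustrative CHM: per-cell difference between a digital surface model
    (canopy top) and a digital elevation model (bare ground)."""
    return np.clip(dsm - dem, 0.0, None)  # clamp negative heights to zero

def rmse(predicted, observed):
    """Root-mean-square error between extracted and field-measured values."""
    predicted, observed = np.asarray(predicted), np.asarray(observed)
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

# Placeholder 3x3 rasters (metres) standing in for gridded SfM/LiDAR products.
dem = np.full((3, 3), 100.0)                 # bare-ground elevation
dsm = dem + np.array([[0.30, 0.32, 0.29],    # canopy-top elevation
                      [0.31, 0.28, 0.33],
                      [0.27, 0.35, 0.30]])
chm = crop_height_model(dsm, dem)

# Compare per-plot heights extracted from the CHM against field measurements.
extracted = [chm[0].mean(), chm[1].mean(), chm[2].mean()]
measured  = [0.31, 0.30, 0.32]
print("crop-height RMSE (m):", rmse(extracted, measured))
```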
Canopy Roughness: A New Phenotypic Trait to Estimate Aboveground Biomass from Unmanned Aerial System
Cost-effective phenotyping methods are urgently needed to advance crop genetics in order to meet the food, fuel, and fiber demands of the coming decades. Characterizing plot-level traits in the field is of particular interest. Recent developments in high-resolution imaging sensors for unmanned aerial systems (UAS), aimed at collecting detailed phenotypic measurements, are a potential solution. We introduce canopy roughness as a new plot-level plant trait and test its usability for biomass estimation in soybean using optical data collected from a UAS. We validate canopy roughness on a panel of 108 soybean [Glycine max (L.) Merr.] recombinant inbred lines in a multienvironment trial during the R2 growth stage. Aerial images were obtained with a senseFly eBee UAS platform carrying a senseFly S.O.D.A. compact digital camera. Using a structure-from-motion (SfM) technique, we reconstructed 3D point clouds of the soybean experiment. A novel pipeline for feature extraction was developed to compute canopy roughness from the point clouds. We used regression analysis to correlate canopy roughness with field-measured aboveground biomass (AGB), with leave-one-out cross-validation. Overall, our models achieved a coefficient of determination (R²) greater than 0.5 in all trials. Moreover, we found that canopy roughness can discern AGB variations among different genotypes. Our test trials demonstrate the potential of canopy roughness as a reliable trait for high-throughput phenotyping to estimate AGB. As such, canopy roughness provides practical information to breeders in order to select phenotypes on the basis of UAS data.
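The abstract does not give the exact formulation of canopy roughness; one plausible interpretation, sketched below in Python with NumPy, measures the spread of canopy-point heights around a least-squares plane fitted to each plot's point cloud. The function name and the synthetic plot data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def canopy_roughness(points):
    """One plausible roughness metric: standard deviation of canopy-point
    heights around a least-squares plane fitted to the plot's point cloud.
    `points` is an (N, 3) array of x, y, z coordinates."""
    xy1 = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(xy1, points[:, 2], rcond=None)
    residuals = points[:, 2] - xy1 @ coeffs
    return float(np.std(residuals))

# Hypothetical plot: a gently tilted canopy surface plus random structure.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 2, size=(5000, 2))
z = 0.05 * xy[:, 0] + 0.4 + rng.normal(0, 0.03, size=5000)
print("roughness (m):", canopy_roughness(np.column_stack([xy, z])))
```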
- Award ID(s): 1845760
- PAR ID: 10207514
- Journal Name: Plant Phenomics
- Volume: 2020
- ISSN: 2643-6515
- Page Range / eLocation ID: 1 to 10
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Soybean (Glycine max [L.] Merr.) production is susceptible to biotic and abiotic stresses, exacerbated by extreme weather events. Water-limiting stress, that is, drought, emerges as a significant risk for soybean production, underscoring the need for advancements in stress monitoring for crop breeding and production. This project combined multi-modal information to identify the most effective and efficient automated methods to study drought response. We investigated a set of diverse soybean accessions using multiple sensors in a time-series high-throughput phenotyping manner to: (1) develop a pipeline for rapid classification of soybean drought stress symptoms, and (2) investigate methods for early detection of drought stress. We utilized high-throughput time-series phenotyping with unmanned aerial vehicles and sensors in conjunction with machine learning analytics, which offered a swift and efficient means of phenotyping. The visible bands were most effective in classifying the severity of canopy wilting stress after symptom emergence. Non-visual bands in the near-infrared and short-wave infrared regions contributed to differentiating susceptible and tolerant soybean accessions prior to visual symptom development. We report pre-visual detection of soybean wilting using a combination of different vegetation indices and spectral bands, especially in the red edge. These results can contribute to early stress detection methodologies and rapid classification of drought responses for breeding and production applications.
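As a hedged illustration of the kind of spectral features mentioned above, the snippet below computes two common vegetation indices, NDVI and the red-edge NDRE, from per-plot band reflectances. The specific indices, band values, and function names are examples chosen here, not necessarily those used in the study.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red + 1e-9)

def ndre(nir, red_edge):
    """Normalized difference red-edge index, often used as an earlier
    indicator of canopy stress than visible-band indices."""
    return (nir - red_edge) / (nir + red_edge + 1e-9)

# Hypothetical per-plot band reflectances (placeholders, not study data).
nir      = np.array([0.45, 0.40, 0.32])
red      = np.array([0.08, 0.10, 0.14])
red_edge = np.array([0.20, 0.22, 0.26])
print("NDVI:", ndvi(nir, red))
print("NDRE:", ndre(nir, red_edge))
```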
Advances in imaging hardware allow high-throughput capture of the detailed three-dimensional (3D) structure of plant canopies. The point cloud data are typically post-processed to extract coarse-scale geometric features (such as volume, surface area, and height) for downstream analysis. We extend feature extraction from 3D point cloud data to various additional features, which we denote as 'canopy fingerprints'. This is motivated by the successful application of the fingerprint concept to molecular fingerprints in chemistry and acoustic fingerprints in sound engineering. We developed an end-to-end pipeline to generate canopy fingerprints from three-dimensional point clouds of soybean [Glycine max (L.) Merr.] canopies grown in hill plots and captured by a terrestrial laser scanner (TLS). The pipeline includes noise removal, registration, and plot extraction, followed by canopy fingerprint generation. The canopy fingerprints are generated by splitting the data into multiple sub-canopy-scale components and extracting sub-canopy-scale geometric features. The generated canopy fingerprints are interpretable and can assist in identifying patterns in a database of canopies, querying similar canopies, or identifying canopies with a certain shape. The framework can be extended to other modalities (for instance, hyperspectral point clouds) and tuned to find the most informative fingerprint representation for downstream tasks. These canopy fingerprints can aid in the utilization of canopy traits at previously unutilized scales, and therefore have applications in plant breeding and resilient crop production.
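A minimal sketch of the sub-canopy splitting idea is shown below, assuming the fingerprint is built by slicing a plot's point cloud into horizontal height bins and concatenating simple geometric features per slice; the slice count, chosen features, and function names are illustrative assumptions rather than the published pipeline.

```python
import numpy as np

def canopy_fingerprint(points, n_slices=5):
    """Illustrative fingerprint: split a plot's point cloud into horizontal
    height slices and concatenate simple geometric features per slice.
    `points` is an (N, 3) array of x, y, z coordinates."""
    z = points[:, 2]
    edges = np.linspace(z.min(), z.max(), n_slices + 1)
    features = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sl = points[(z >= lo) & (z <= hi)]
        if len(sl) == 0:
            features.extend([0.0, 0.0])
            continue
        frac = len(sl) / len(points)                          # point-density fraction
        footprint = float(np.ptp(sl[:, 0]) * np.ptp(sl[:, 1]))  # horizontal extent
        features.extend([frac, footprint])
    return np.array(features)

# Hypothetical plot cloud (placeholder data, not TLS measurements).
rng = np.random.default_rng(1)
cloud = rng.uniform([0, 0, 0], [1, 1, 0.6], size=(2000, 3))
print(canopy_fingerprint(cloud))
```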
Automated canopy stress classification for field crops has traditionally relied on single-perspective, two-dimensional (2D) photographs, usually obtained through top-view imaging with unmanned aerial vehicles (UAVs). However, this approach may fail to capture the full extent of plant stress symptoms, which can manifest throughout the canopy. Recent advancements in LiDAR technologies have enabled the acquisition of high-resolution 3D point cloud data for the entire canopy, offering new possibilities for more accurate plant stress identification and rating. This study explores the potential of leveraging 3D point cloud data for improved plant stress assessment. We utilized a dataset of RGB 3D point clouds of 700 soybean plants from a diversity panel exposed to iron deficiency chlorosis (IDC) stress. From this unique set of 700 canopies exhibiting varying levels of IDC, we extracted several representations, including (a) handcrafted IDC symptom-specific features, (b) canopy fingerprints, and (c) latent features. We then trained several classification models to predict plant stress severity from these representations, exhaustively investigating combinations of stress representations and models for the 3D data. We also compared the performance of these classification models against similar models trained only on the associated top-view 2D RGB image for each plant. Among the feature-model combinations tested, the 3D canopy fingerprint features trained with a support vector machine yielded the best performance, achieving higher classification accuracy than the best-performing 2D model built using convolutional neural networks. Our findings demonstrate the utility of color canopy fingerprinting and underscore the importance of considering 3D data to assess plant stress in agricultural applications.
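For context, a generic version of the feature-plus-SVM step could look like the sketch below (Python with scikit-learn), where placeholder feature vectors stand in for canopy fingerprints and random severity ratings stand in for IDC labels; the data, split, and hyperparameters are illustrative and not the study's settings.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical inputs: one feature vector per canopy and a severity rating
# per plant (placeholder random data, not the study's features or labels).
rng = np.random.default_rng(2)
X = rng.normal(size=(700, 15))      # e.g., canopy fingerprint features
y = rng.integers(1, 6, size=700)    # severity ratings 1-5

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# Standardize features, then fit an RBF-kernel support vector classifier.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```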
Non-forest ecosystems, dominated by shrubs, grasses and herbaceous plants, provide ecosystem services including carbon sequestration and forage for grazing, and are highly sensitive to climatic changes. Yet these ecosystems are poorly represented in remotely sensed biomass products and are undersampled by in situ monitoring. Current global change threats emphasize the need for new tools to capture biomass change in non-forest ecosystems at appropriate scales. Here we developed and deployed a new protocol for photogrammetric height using unoccupied aerial vehicle (UAV) images to test its capability for delivering standardized measurements of biomass across a globally distributed field experiment. We assessed whether canopy height inferred from UAV photogrammetry allows the prediction of aboveground biomass (AGB) across low-stature plant species by conducting 38 photogrammetric surveys over 741 harvested plots to sample 50 species. We found mean canopy height was strongly predictive of AGB across species, with a median adjusted R² of 0.87 (ranging from 0.46 to 0.99) and a median prediction error from leave-one-out cross-validation of 3.9%. Biomass per unit of height was similar within, but different among, plant functional types. We found that photogrammetric reconstructions of canopy height were sensitive to wind speed but not to sun elevation during surveys. We demonstrated that our photogrammetric approach produced generalizable measurements across growth forms and environmental settings and yielded accuracies as good as those obtained from in situ approaches. We demonstrate that using a standardized approach for UAV photogrammetry can deliver accurate AGB estimates across a wide range of dynamic and heterogeneous ecosystems. Many academic and land management institutions have the technical capacity to deploy these approaches over extents of 1–10 ha. Photogrammetric approaches could provide much-needed information required to calibrate and validate the vegetation models and satellite-derived biomass products that are essential to understand vulnerable and understudied non-forested ecosystems around the globe.
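A small sketch of the height-to-AGB regression with leave-one-out cross-validation is given below (Python with scikit-learn); the per-plot heights, biomass values, and error summary are placeholders meant only to show the procedure, not the study's data or results.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Hypothetical per-plot data: mean photogrammetric canopy height (m) and
# harvested aboveground biomass (g m^-2); values are placeholders.
height = np.array([[0.12], [0.20], [0.28], [0.35], [0.41], [0.50], [0.62]])
agb    = np.array([ 110,    190,    260,    340,    400,    480,    600 ])

# Leave-one-out cross-validated predictions from a simple linear model.
model = LinearRegression()
pred = cross_val_predict(model, height, agb, cv=LeaveOneOut())
rel_err = np.abs(pred - agb) / agb * 100
print("median LOO prediction error (%):", np.median(rel_err))
print("adjusted fit on all data (R^2):", model.fit(height, agb).score(height, agb))
```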