This content will become publicly available on July 1, 2025

Title: Detection of Individual Corn Crop and Canopy Delineation from Unmanned Aerial Vehicle Imagery
Precise monitoring of individual crop growth and health status is crucial for precision agriculture practices. However, traditional inspection methods are time-consuming, labor-intensive, prone to human error, and may not provide the comprehensive coverage required for the detailed analysis of crop variability across an entire field. This research addresses the need for efficient and high-resolution crop monitoring by leveraging Unmanned Aerial Vehicle (UAV) imagery and advanced computational techniques. The primary goal was to develop a methodology for the precise identification, extraction, and monitoring of individual corn crops throughout their growth cycle. This involved integrating UAV-derived data with image processing, computational geometry, and machine learning techniques. Bi-weekly UAV imagery was captured at altitudes of 40 m and 70 m from 30 April to 11 August, covering the entire growth cycle of the corn crop from planting to harvest. A time-series Canopy Height Model (CHM) was generated by analyzing the differences between the Digital Terrain Model (DTM) and the Digital Surface Model (DSM) derived from the UAV data. To ensure the accuracy of the elevation data, the DSM was validated against Ground Control Points (GCPs), adhering to standard practices in remote sensing data verification. Local spatial analysis and image processing techniques were employed to determine the local maximum height of each crop. Subsequently, a Voronoi data model was developed to delineate individual crop canopies, successfully identifying 13,000 out of 13,050 corn crops in the study area. To enhance accuracy in canopy size delineation, vegetation indices were incorporated into the Voronoi model segmentation, refining the initial canopy area estimates by eliminating interference from soil and shadows. The proposed methodology enables the precise estimation and monitoring of crop canopy size, height, biomass reduction, lodging, and stunted growth over time by incorporating advanced image processing techniques and integrating metrics for quantitative assessment of fields. Additionally, machine learning models were employed to determine relationships between canopy size, crop height, and the normalized difference vegetation index, with Polynomial Regression recording the highest R-squared (11%) among the models compared. This work contributes to the scientific community by demonstrating the potential of integrating UAV technology, computational geometry, and machine learning for accurate and efficient crop monitoring at the individual plant level.
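The workflow described above lends itself to a compact illustration. The sketch below is a minimal Python example, not the authors' implementation: it assumes gridded DSM/DTM arrays are already in hand, and the moving-window size, minimum-height threshold, and synthetic rasters are placeholder assumptions chosen only to make the example runnable.

```python
import numpy as np
from scipy.ndimage import maximum_filter
from scipy.spatial import Voronoi

def canopy_height_model(dsm, dtm):
    """CHM as the per-pixel difference between surface and terrain models."""
    chm = dsm - dtm
    return np.clip(chm, 0, None)  # negative differences treated as ground noise

def local_maxima(chm, window=11, min_height=0.3):
    """Candidate plant apexes: pixels that are the tallest within a moving window."""
    peaks = (chm == maximum_filter(chm, size=window)) & (chm > min_height)
    return np.argwhere(peaks)  # (row, col) of each detected crop apex

def delineate_canopies(peak_coords):
    """Voronoi tessellation assigns every location to its nearest crop apex."""
    return Voronoi(peak_coords)

# Synthetic stand-ins for UAV-derived DSM/DTM rasters (illustrative only)
rng = np.random.default_rng(0)
dtm = rng.normal(100.0, 0.05, (200, 200))
dsm = dtm + rng.gamma(2.0, 0.2, (200, 200))
chm = canopy_height_model(dsm, dtm)
peaks = local_maxima(chm)
vor = delineate_canopies(peaks)
print(f"{len(peaks)} candidate crops, {len(vor.regions)} Voronoi regions")
```

In practice, each Voronoi cell would then be intersected with a vegetation-index mask, as the abstract describes, so that soil and shadow pixels are excluded from the per-plant canopy area estimate.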
Award ID(s):
1800768 1832110
PAR ID:
10530683
Author(s) / Creator(s):
;
Publisher / Repository:
Remote Sensing
Date Published:
Journal Name:
Remote Sensing
Volume:
16
Issue:
14
ISSN:
2072-4292
Page Range / eLocation ID:
2679
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Over the last decade, the use of unmanned aerial vehicles (UAVs) for plant phenotyping and field crop monitoring has significantly evolved and expanded. These technologies have been particularly valuable for monitoring crop growth and health and for managing abiotic and biotic stresses such as drought, fertilization deficiencies, disease, and bioaggressors. This paper provides a comprehensive review of the progress in UAV‐based plant phenotyping, with a focus on the current use and application of drone technology to gain information on plant growth, development, adaptation, and yield. We reviewed over 200 research articles and discuss the best tools and methodologies for different research purposes, the challenges that need to be overcome, and the major research gaps that remain. First, the review offers a critical focus on elucidating the distinct characteristics of UAV platforms, highlighting the diverse sensor technologies employed and shedding light on the nuances of UAV data acquisition and processing methodologies. Second, it presents a comprehensive analysis of the multiple applications of UAVs in field phenotyping, underscoring the transformative potential of integrating machine learning techniques for plant analysis. Third, it delves into the realm of machine learning applications for plant phenotyping, emphasizing its role in enhancing data analysis and interpretation. Furthermore, the paper extensively examines the open issues and research challenges within the domain, addressing the complexities and limitations faced during data acquisition, processing, and interpretation. Finally, it outlines the future trends and emerging technologies in the field of UAV‐based plant phenotyping, paving the way for innovative advancements and methodologies. 
  2. In recent years, the utilization of machine learning algorithms and advancements in unmanned aerial vehicle (UAV) technology have caused significant shifts in remote sensing practices. In particular, the integration of machine learning with physical models and their application in UAV–satellite data fusion have emerged as two prominent approaches for the estimation of vegetation biochemistry. This study evaluates the performance of five machine learning regression algorithms (MLRAs) for the mapping of crop canopy chlorophyll at the Kellogg Biological Station (KBS) in Michigan, USA, across three scenarios: (1) application to Landsat 7, RapidEye, and PlanetScope satellite images; (2) application to UAV–satellite data fusion; and (3) integration with the PROSAIL radiative transfer model (hybrid methods PROSAIL + MLRAs). The results indicate that the majority of the five MLRAs utilized in UAV–satellite data fusion perform better than the five PROSAIL + MLRAs. The general trend suggests that the integration of satellite data with UAV-derived information, including the normalized difference red-edge index (NDRE), canopy height model, and leaf area index (LAI), significantly enhances the performance of MLRAs. The UAV–RapidEye dataset exhibits the highest coefficient of determination (R2) and the lowest root mean square errors (RMSE) when employing kernel ridge regression (KRR) and Gaussian process regression (GPR) (R2 = 0.89 and 0.89 and RMSE = 8.99 µg/cm2 and 9.65 µg/cm2, respectively). Similar performance is observed for the UAV–Landsat and UAV–PlanetScope datasets (R2 = 0.86 and 0.87 for KRR, respectively). For the hybrid models, the maximum performance is attained with the Landsat data using KRR and GPR (R2 = 0.77 and 0.51 and RMSE = 33.10 µg/cm2 and 42.91 µg/cm2, respectively), followed by R2 = 0.75 and RMSE = 39.78 µg/cm2 for the PlanetScope data upon integrating partial least squares regression (PLSR) into the hybrid model. Across all hybrid models, the RapidEye data yield the most stable performance, with the R2 ranging from 0.45 to 0.71 and RMSE ranging from 19.16 µg/cm2 to 33.07 µg/cm2. The study highlights the importance of synergizing UAV and satellite data, which enables the effective monitoring of canopy chlorophyll in small agricultural lands. 
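As a rough illustration of the regression step described above, the sketch below fits kernel ridge regression and Gaussian process regression to a synthetic fused feature table; the feature count, hyperparameters, and simulated chlorophyll response are assumptions for illustration only and do not reproduce the study's datasets or results.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a fused UAV-satellite feature table:
# satellite band reflectances plus UAV-derived NDRE, canopy height, and LAI.
rng = np.random.default_rng(1)
n = 150
X = rng.uniform(0, 1, (n, 7))                              # hypothetical fused predictors
chl = 40 * X[:, 4] + 10 * X[:, 5] + rng.normal(0, 2, n)    # simulated chlorophyll, ug/cm^2

models = {
    "KRR": KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5),
    "GPR": GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, chl, cv=5, scoring="r2")
    print(f"{name}: mean cross-validated R^2 = {r2.mean():.2f}")
```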
  3. Understanding the interactions among agricultural processes, soil, and plants is necessary for optimizing crop yield and productivity. This study focuses on developing effective monitoring and analysis methodologies that estimate key soil and plant properties. These methodologies include data acquisition and processing approaches that use unmanned aerial vehicles (UAVs) and surface geophysical techniques. In particular, we applied these approaches to a soybean farm in Arkansas to characterize the soil–plant coupled spatial and temporal heterogeneity, as well as to identify key environmental factors that influence plant growth and yield. UAV-based multitemporal acquisition of high-resolution RGB (red–green–blue) imagery and direct measurements were used to monitor plant height and photosynthetic activity. We present an algorithm that efficiently exploits the high-resolution UAV images to estimate plant spatial abundance and plant vigor throughout the growing season. Such plant characterization is extremely important for the identification of anomalous areas, providing easily interpretable information that can be used to guide near-real-time farming decisions. Additionally, high-resolution multitemporal surface geophysical measurements of apparent soil electrical conductivity were used to estimate the spatial heterogeneity of soil texture. By integrating the multiscale multitype soil and plant datasets, we identified the spatiotemporal co-variance between soil properties and plant development and yield. Our novel approach for early season monitoring of plant spatial abundance identified areas of low productivity controlled by soil clay content, while temporal analysis of geophysical data showed the impact of soil moisture and irrigation practice (controlled by topography) on plant dynamics. Our study demonstrates the effective coupling of UAV data products with geophysical data to extract critical information for farm management.
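The abstract does not spell out the plant-abundance algorithm, so the sketch below shows one common stand-in: thresholding an Excess Green index on an RGB orthomosaic and aggregating the vegetation mask into block-level abundance values. The threshold, block size, and synthetic tile are illustrative assumptions, not the authors' method.

```python
import numpy as np

def excess_green(rgb):
    """Excess Green index (2G - R - B) on a band-normalized RGB image."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = np.clip(r + g + b, 1e-6, None)
    r, g, b = r / total, g / total, b / total
    return 2 * g - r - b

def plant_abundance(rgb, threshold=0.1, block=50):
    """Fraction of vegetated pixels per block as a simple spatial-abundance map."""
    mask = excess_green(rgb) > threshold
    h, w = mask.shape
    h, w = h - h % block, w - w % block
    blocks = mask[:h, :w].reshape(h // block, block, w // block, block)
    return blocks.mean(axis=(1, 3))  # one abundance value per block

# Synthetic RGB tile standing in for a UAV orthomosaic chip
rng = np.random.default_rng(2)
tile = rng.uniform(0, 1, (500, 500, 3))
print(plant_abundance(tile).shape)  # -> (10, 10) block-level abundance grid
```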
  4. Timely and accurate monitoring has the potential to streamline crop management, harvest planning, and processing in the growing table beet industry of New York state. We used an unmanned aerial system (UAS) combined with a multispectral imager to monitor table beet (Beta vulgaris ssp. vulgaris) canopies in New York during the 2018 and 2019 growing seasons. We assessed the optimal pairing of a reflectance band or vegetation index with canopy area to predict table beet yield components of small sample plots using leave-one-out cross-validation. The most promising models were for table beet root count and mass using imagery taken during emergence and canopy closure, respectively. We created augmented plots, composed of random combinations of the study plots, to further exploit the importance of early canopy growth area. We achieved an R2 = 0.70 and root mean squared error (RMSE) of 84 roots (~24%) for root count, using 2018 emergence imagery. The same model resulted in an RMSE of 127 roots (~35%) when tested on the unseen 2019 data. Harvested root mass was best modeled with canopy closing imagery, with an R2 = 0.89 and RMSE = 6700 kg/ha using 2018 data. We applied the model to the 2019 full-field imagery and found an average yield of 41,000 kg/ha (~40,000 kg/ha average for upstate New York). This study demonstrates the potential for table beet yield models using a combination of radiometric and canopy structure data obtained at early growth stages. Additional imagery of these early growth stages is vital to develop a robust and generalized model of table beet root yield that can handle imagery captured at slightly different growth stages between seasons.
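A minimal sketch of the leave-one-out evaluation described above is given below; the plot-level features, simulated root counts, and linear model are illustrative assumptions rather than the study's actual data or best-performing model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

# Hypothetical plot-level table: [canopy area, vegetation index] -> root count
rng = np.random.default_rng(3)
n_plots = 30
canopy_area = rng.uniform(0.5, 3.0, n_plots)   # m^2, emergence-stage canopy
vi = rng.uniform(0.2, 0.8, n_plots)            # e.g. a normalized vegetation index
roots = 120 * canopy_area + 50 * vi + rng.normal(0, 20, n_plots)

X = np.column_stack([canopy_area, vi])
pred = cross_val_predict(LinearRegression(), X, roots, cv=LeaveOneOut())
rmse = mean_squared_error(roots, pred) ** 0.5
print(f"LOOCV R^2 = {r2_score(roots, pred):.2f}, RMSE = {rmse:.1f} roots")
```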
  5. Large-scale continuous crop monitoring systems (CMS) are key to detect and manage agricultural production anomalies. Current CMS exploit meteorological and crop growth models, and satellite imagery, but have underutilized legacy sources of information such as operational crop expert surveys with long and uninterrupted records. We argue that crop expert assessments, despite their subjective and categorical nature, capture the complexities of assessing the “status” of a crop better than any model or remote sensing retrieval. This is because crop rating data naturally encapsulates the broad expert knowledge of many individual surveyors spread throughout the country, constituting a sophisticated network of “people as sensors” that provide consistent and accurate information on crop progress. We analyze data from the US Department of Agriculture (USDA) Crop Progress and Condition (CPC) survey between 1987 and 2019 for four major crops across the US, and show how to transform the original qualitative data into a continuous, probabilistic variable better suited to quantitative analysis. Although the CPC reflects the subjective perception of many surveyors at different locations, the underlying models that describe the reported crop status are statistically robust and maintain similar characteristics across different crops, exhibit long-term stability, and have nation-wide validity. We discuss the origin and interpretation of existing spatial and temporal biases in the survey data. Finally, we propose a quantitative Crop Condition Index based on the CPC survey and demonstrate how this index can be used to monitor crop status and provide earlier and more precise predictions of crop yields than official USDA forecasts released midseason. 
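For readers unfamiliar with CPC-style data, the sketch below shows one simple way to collapse the five categorical condition shares into a single numeric score; the category weights are illustrative assumptions, and the paper's probabilistic Crop Condition Index is derived differently.

```python
# Weekly CPC-style condition shares (percent of acreage in each category).
# The weights below are illustrative; the paper derives its own probabilistic index.
WEIGHTS = {"very poor": 0.0, "poor": 0.25, "fair": 0.5, "good": 0.75, "excellent": 1.0}

def condition_index(shares):
    """Collapse categorical condition shares into a single 0-1 score."""
    total = sum(shares.values())
    return sum(WEIGHTS[cat] * pct for cat, pct in shares.items()) / total

week = {"very poor": 2, "poor": 8, "fair": 25, "good": 50, "excellent": 15}
print(f"Crop condition index: {condition_index(week):.2f}")  # -> 0.67
```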