Precise monitoring of individual crop growth and health status is crucial for precision agriculture practices. However, traditional inspection methods are time-consuming, labor-intensive, prone to human error, and may not provide the comprehensive coverage required for the detailed analysis of crop variability across an entire field. This research addresses the need for efficient and high-resolution crop monitoring by leveraging Unmanned Aerial Vehicle (UAV) imagery and advanced computational techniques. The primary goal was to develop a methodology for the precise identification, extraction, and monitoring of individual corn crops throughout their growth cycle. This involved integrating UAV-derived data with image processing, computational geometry, and machine learning techniques. Bi-weekly UAV imagery was captured at altitudes of 40 m and 70 m from 30 April to 11 August, covering the entire growth cycle of the corn crop from planting to harvest. A time-series Canopy Height Model (CHM) was generated by analyzing the differences between the Digital Terrain Model (DTM) and the Digital Surface Model (DSM) derived from the UAV data. To ensure the accuracy of the elevation data, the DSM was validated against Ground Control Points (GCPs), adhering to standard practices in remote sensing data verification. Local spatial analysis and image processing techniques were employed to determine the local maximum height of each crop. Subsequently, a Voronoi data model was developed to delineate individual crop canopies, successfully identifying 13,000 out of 13,050 corn crops in the study area. To enhance accuracy in canopy size delineation, vegetation indices were incorporated into the Voronoi model segmentation, refining the initial canopy area estimates by eliminating interference from soil and shadows. 
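The geometric core of the pipeline described above (CHM = DSM − DTM, local-maximum detection per plant, Voronoi tessellation around detected apexes) can be sketched as follows. This is a minimal illustration, not the study's implementation: the array sizes, neighborhood window, and minimum-height threshold are assumed values.

```python
# Sketch of the per-plant pipeline: CHM differencing, local-maximum
# detection of crop apexes, then Voronoi cells to delineate canopies.
# Window size and height threshold are illustrative assumptions.
import numpy as np
from scipy.ndimage import maximum_filter
from scipy.spatial import Voronoi

rng = np.random.default_rng(0)
dsm = rng.random((100, 100)) * 0.3          # stand-in Digital Surface Model (m)
dtm = np.zeros((100, 100))                  # stand-in Digital Terrain Model (m)
chm = dsm - dtm                             # Canopy Height Model

# A pixel is treated as a crop apex if it equals the maximum of its
# neighborhood and exceeds a height threshold that rejects bare soil.
window = 5
local_max = (chm == maximum_filter(chm, size=window)) & (chm > 0.1)
rows, cols = np.nonzero(local_max)
seeds = np.column_stack([cols, rows])       # one (x, y) seed per detected plant

# Voronoi cells around the apexes approximate individual canopy extents.
if len(seeds) >= 4:                         # qhull needs several non-degenerate points
    vor = Voronoi(seeds)
    print(f"{len(seeds)} plants detected")
```

In the actual workflow, each Voronoi cell would then be refined with vegetation indices to exclude soil and shadow pixels from the canopy area estimate.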
The proposed methodology enables the precise estimation and monitoring of crop canopy size, height, biomass reduction, lodging, and stunted growth over time by incorporating advanced image processing techniques and integrating metrics for the quantitative assessment of fields. Additionally, machine learning models were employed to determine relationships between canopy size, crop height, and the normalized difference vegetation index (NDVI), with polynomial regression performing best among the models evaluated (R-squared of 11%). This work contributes to the scientific community by demonstrating the potential of integrating UAV technology, computational geometry, and machine learning for accurate and efficient crop monitoring at the individual plant level.
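The regression step relating canopy metrics to NDVI could be sketched as below. The synthetic data, coefficients, and degree-2 choice are assumptions for illustration only, not the study's dataset or configuration:

```python
# Minimal sketch of a polynomial regression between canopy area,
# crop height, and NDVI. All data here are simulated stand-ins.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(42)
canopy_area = rng.uniform(0.1, 0.6, 200)    # m^2, hypothetical
height = rng.uniform(0.5, 2.5, 200)         # m, hypothetical
ndvi = 0.3 + 0.1 * height + 0.2 * canopy_area + rng.normal(0, 0.05, 200)

X = np.column_stack([canopy_area, height])
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, ndvi)
r2 = model.score(X, ndvi)                   # coefficient of determination
```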
- PAR ID: 10530683
- Journal Name: Remote Sensing
- Volume: 16
- Issue: 14
- ISSN: 2072-4292
- Page Range / eLocation ID: 2679
- Sponsoring Org: National Science Foundation
More Like this
-
Over the last decade, the use of unmanned aerial vehicles (UAVs) for plant phenotyping and field crop monitoring has significantly evolved and expanded. These technologies have been particularly valuable for monitoring crop growth and health and for managing abiotic and biotic stresses such as drought, fertilization deficiencies, disease, and bioaggressors. This paper provides a comprehensive review of the progress in UAV-based plant phenotyping, with a focus on the current use and application of drone technology to gain information on plant growth, development, adaptation, and yield. We reviewed over 200 research articles and discuss the best tools and methodologies for different research purposes, the challenges that need to be overcome, and the major research gaps that remain. First, the review offers a critical focus on elucidating the distinct characteristics of UAV platforms, highlighting the diverse sensor technologies employed and shedding light on the nuances of UAV data acquisition and processing methodologies. Second, it presents a comprehensive analysis of the multiple applications of UAVs in field phenotyping, underscoring the transformative potential of integrating machine learning techniques for plant analysis. Third, it delves into the realm of machine learning applications for plant phenotyping, emphasizing its role in enhancing data analysis and interpretation. Furthermore, the paper extensively examines the open issues and research challenges within the domain, addressing the complexities and limitations faced during data acquisition, processing, and interpretation. Finally, it outlines the future trends and emerging technologies in the field of UAV-based plant phenotyping, paving the way for innovative advancements and methodologies.
-
In recent years, the utilization of machine learning algorithms and advancements in unmanned aerial vehicle (UAV) technology have caused significant shifts in remote sensing practices. In particular, the integration of machine learning with physical models and their application in UAV–satellite data fusion have emerged as two prominent approaches for the estimation of vegetation biochemistry. This study evaluates the performance of five machine learning regression algorithms (MLRAs) for the mapping of crop canopy chlorophyll at the Kellogg Biological Station (KBS) in Michigan, USA, across three scenarios: (1) application to Landsat 7, RapidEye, and PlanetScope satellite images; (2) application to UAV–satellite data fusion; and (3) integration with the PROSAIL radiative transfer model (hybrid methods PROSAIL + MLRAs). The results indicate that the majority of the five MLRAs utilized in UAV–satellite data fusion perform better than the five PROSAIL + MLRAs. The general trend suggests that the integration of satellite data with UAV-derived information, including the normalized difference red-edge index (NDRE), canopy height model, and leaf area index (LAI), significantly enhances the performance of MLRAs. The UAV–RapidEye dataset exhibits the highest coefficient of determination (R2) and the lowest root mean square errors (RMSE) when employing kernel ridge regression (KRR) and Gaussian process regression (GPR) (R2 = 0.89 and 0.89 and RMSE = 8.99 µg/cm2 and 9.65 µg/cm2, respectively). Similar performance is observed for the UAV–Landsat and UAV–PlanetScope datasets (R2 = 0.86 and 0.87 for KRR, respectively). For the hybrid models, the maximum performance is attained with the Landsat data using KRR and GPR (R2 = 0.77 and 0.51 and RMSE = 33.10 µg/cm2 and 42.91 µg/cm2, respectively), followed by R2 = 0.75 and RMSE = 39.78 µg/cm2 for the PlanetScope data upon integrating partial least squares regression (PLSR) into the hybrid model. 
Across all hybrid models, the RapidEye data yield the most stable performance, with the R2 ranging from 0.45 to 0.71 and RMSE ranging from 19.16 µg/cm2 to 33.07 µg/cm2. The study highlights the importance of synergizing UAV and satellite data, which enables the effective monitoring of canopy chlorophyll in small agricultural lands.
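The MLRA comparison above (kernel ridge regression vs. Gaussian process regression on fused predictors such as NDRE, canopy height, and LAI) can be sketched roughly as follows. The simulated data, kernel settings, and chlorophyll response are illustrative assumptions, not values from the study:

```python
# Hedged sketch comparing two MLRAs on synthetic stand-ins for
# fused UAV-satellite features. All values are simulated.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 150
ndre = rng.uniform(0.2, 0.8, n)             # normalized difference red-edge index
canopy_height = rng.uniform(0.5, 2.0, n)    # m
lai = rng.uniform(0.5, 5.0, n)              # leaf area index
# Hypothetical chlorophyll response (ug/cm^2) with noise.
chl = 10 + 40 * ndre + 3 * lai + rng.normal(0, 2, n)

X = np.column_stack([ndre, canopy_height, lai])
models = {
    "KRR": KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5),
    "GPR": GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                    alpha=1e-2, normalize_y=True),
}
# Cross-validated R^2 for each regressor, as in a model comparison.
scores = {name: cross_val_score(m, X, chl, cv=5, scoring="r2").mean()
          for name, m in models.items()}
```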
-
Understanding the interactions among agricultural processes, soil, and plants is necessary for optimizing crop yield and productivity. This study focuses on developing effective monitoring and analysis methodologies that estimate key soil and plant properties. These methodologies include data acquisition and processing approaches that use unmanned aerial vehicles (UAVs) and surface geophysical techniques. In particular, we applied these approaches to a soybean farm in Arkansas to characterize the soil–plant coupled spatial and temporal heterogeneity, as well as to identify key environmental factors that influence plant growth and yield. UAV-based multitemporal acquisition of high-resolution RGB (red–green–blue) imagery and direct measurements were used to monitor plant height and photosynthetic activity. We present an algorithm that efficiently exploits the high-resolution UAV images to estimate plant spatial abundance and plant vigor throughout the growing season. Such plant characterization is extremely important for the identification of anomalous areas, providing easily interpretable information that can be used to guide near-real-time farming decisions. Additionally, high-resolution multitemporal surface geophysical measurements of apparent soil electrical conductivity were used to estimate the spatial heterogeneity of soil texture. By integrating the multiscale multitype soil and plant datasets, we identified the spatiotemporal co-variance between soil properties and plant development and yield. Our novel approach for early season monitoring of plant spatial abundance identified areas of low productivity controlled by soil clay content, while temporal analysis of geophysical data showed the impact of soil moisture and irrigation practice (controlled by topography) on plant dynamics. Our study demonstrates the effective coupling of UAV data products with geophysical data to extract critical information for farm management.
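Estimating plant spatial abundance from RGB imagery, as described above, is often done with a greenness index. The sketch below uses the excess green index (ExG = 2g − r − b) and a threshold, both common choices in the literature but assumed here rather than taken from this study:

```python
# Minimal sketch: fraction of vegetation pixels in an RGB tile via
# the excess green index. Index choice and threshold are assumptions.
import numpy as np

def plant_fraction(rgb, threshold=0.05):
    """Fraction of pixels classified as vegetation via excess green."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=-1)
    total[total == 0] = 1.0                  # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2 * g - r - b                      # excess green index
    return float((exg > threshold).mean())

# Synthetic tile: left half green "plants", right half gray "soil".
tile = np.zeros((10, 10, 3))
tile[:, :5] = [0.2, 0.7, 0.1]               # vegetation-like pixels
tile[:, 5:] = [0.4, 0.4, 0.4]               # soil-like pixels
frac = plant_fraction(tile)                  # 0.5 for this synthetic tile
```

Mapping this fraction per grid cell over a field produces the kind of plant-abundance map that can flag anomalous, low-productivity areas early in the season.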
-
Timely and accurate monitoring has the potential to streamline crop management, harvest planning, and processing in the growing table beet industry of New York state. We used an unmanned aerial system (UAS) combined with a multispectral imager to monitor table beet (Beta vulgaris ssp. vulgaris) canopies in New York during the 2018 and 2019 growing seasons. We assessed the optimal pairing of a reflectance band or vegetation index with canopy area to predict table beet yield components of small sample plots using leave-one-out cross-validation. The most promising models were for table beet root count and mass using imagery taken during emergence and canopy closure, respectively. We created augmented plots, composed of random combinations of the study plots, to further exploit the importance of early canopy growth area. We achieved an R2 of 0.70 and a root mean squared error (RMSE) of 84 roots (~24%) for root count using 2018 emergence imagery. The same model resulted in an RMSE of 127 roots (~35%) when tested on the unseen 2019 data. Harvested root mass was best modeled with canopy-closing imagery, with an R2 of 0.89 and an RMSE of 6700 kg/ha using 2018 data. We applied the model to the 2019 full-field imagery and found an average yield of 41,000 kg/ha (~40,000 kg/ha average for upstate New York). This study demonstrates the potential for table beet yield models using a combination of radiometric and canopy structure data obtained at early growth stages. Additional imagery of these early growth stages is vital to develop a robust and generalized model of table beet root yield that can handle imagery captured at slightly different growth stages between seasons.
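The leave-one-out cross-validation used to evaluate these small-plot yield models can be sketched as follows. The two-feature design (a vegetation index paired with canopy area) follows the text, but the synthetic plot data, coefficients, and noise level are illustrative assumptions:

```python
# Sketch of leave-one-out cross-validation for a plot-level yield
# model. Plot data are simulated stand-ins, not the study's data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(7)
n_plots = 30
veg_index = rng.uniform(0.3, 0.9, n_plots)       # e.g., a reflectance-derived index
canopy_area = rng.uniform(0.2, 1.0, n_plots)     # fraction of plot covered
root_count = (200 + 250 * canopy_area + 100 * veg_index
              + rng.normal(0, 20, n_plots))      # hypothetical yield component

X = np.column_stack([veg_index, canopy_area])
# Each plot is predicted by a model trained on all the other plots.
pred = cross_val_predict(LinearRegression(), X, root_count, cv=LeaveOneOut())
rmse = float(np.sqrt(np.mean((pred - root_count) ** 2)))
```

LOOCV is a natural fit when only a few dozen plots are available, since every observation serves as a held-out test case exactly once.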
-
Phenology is a distinct marker of the impacts of climate change on ecosystems. Accordingly, monitoring the spatiotemporal patterns of vegetation phenology is important to understand the changing Earth system. A wide range of sensors have been used to monitor vegetation phenology, including digital cameras with different viewing geometries mounted on various types of platforms. Sensor perspective, view-angle, and resolution can potentially impact estimates of phenology. We compared three different methods of remotely sensing vegetation phenology—an unoccupied aerial vehicle (UAV)-based, downward-facing RGB camera, a below-canopy, upward-facing hemispherical camera with blue (B), green (G), and near-infrared (NIR) bands, and a tower-based RGB PhenoCam, positioned at an oblique angle to the canopy—to estimate spring phenological transition towards canopy closure in a mixed-species temperate forest in central Virginia, USA. Our study had two objectives: (1) to compare the above- and below-canopy inference of canopy greenness (using green chromatic coordinate and normalized difference vegetation index) and canopy structural attributes (leaf area and gap fraction) by matching below-canopy hemispherical photos with high spatial resolution (0.03 m) UAV imagery, to find the appropriate spatial coverage and resolution for comparison; (2) to compare how UAV, ground-based, and tower-based imagery performed in estimating the timing of the spring phenological transition. We found that a spatial buffer of 20 m radius for UAV imagery is most closely comparable to below-canopy imagery in this system. Sensors and platforms agree within +/− 5 days of when canopy greenness stabilizes from the spring phenophase into the growing season.
We show that pairing UAV imagery with tower-based observation platforms and plot-based observations for phenological studies (e.g., long-term monitoring, existing research networks, and permanent plots) has the potential to scale plot-based forest structural measures via UAV imagery, constrain uncertainty estimates around phenophases, and more robustly assess site heterogeneity.
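The green chromatic coordinate (GCC = G / (R + G + B)) used above to track canopy greenness can be sketched as follows, along with a simple rule for the transition date. The sigmoidal green-up series and the 50%-amplitude crossing rule are illustrative assumptions, not this study's fitting procedure:

```python
# Minimal sketch: GCC time series over a spring season and a simple
# threshold rule for the green-up transition date. Band values are
# synthetic; the 50%-amplitude rule is an assumed convention.
import numpy as np

def gcc(r, g, b):
    """Green chromatic coordinate per observation."""
    return g / (r + g + b)

days = np.arange(90, 181)                            # days of year 90-180
green = 80 + 40 / (1 + np.exp(-(days - 130) / 5))    # sigmoidal green-up
red = np.full_like(days, 90, dtype=float)
blue = np.full_like(days, 70, dtype=float)

series = gcc(red, green, blue)
# Transition date: first day GCC crosses 50% of its seasonal amplitude.
half = series.min() + 0.5 * (series.max() - series.min())
transition_day = int(days[np.argmax(series >= half)])
```

Comparing transition dates estimated this way from UAV, below-canopy, and tower-based series is what yields the +/− 5 day agreement reported above.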