Unmanned aerial vehicles (UAVs) equipped with multispectral sensors offer high spatial and temporal resolution imagery for monitoring crop stress at early stages of development. Analysis of UAV-derived data with advanced machine learning models could improve real-time management in agricultural systems, but guidance for this integration is currently limited. Here we compare two deep learning-based strategies for early warning detection of crop stress, using multitemporal imagery throughout the growing season to predict field-scale yield in irrigated rice in eastern Arkansas. Both deep learning strategies improved upon traditional statistical learning approaches, including linear regression and gradient-boosted decision trees. First, we explicitly accounted for variation across developmental stages using a 3D convolutional neural network (CNN) architecture that captures both the spatial and temporal dimensions of UAV images from multiple time points throughout one growing season. 3D-CNNs achieved low prediction error on the test set, with a Root Mean Squared Error (RMSE) of 8.8% of the mean yield. For the second strategy, a 2D-CNN, we considered only spatial relationships among pixels for image features acquired during a single flyover. 2D-CNNs trained on images from a single day were most accurate when images were taken during the booting stage or later, with RMSE ranging from 7.4% to 8.2% of the mean yield. A primary benefit of these CNN-based models (based on analyses of prediction maps and feature importance) is their spatial denoising effect, which corrects yield predictions for individual pixels based on the values of vegetation index and thermal features for nearby pixels. Our results highlight the promise of convolutional neural networks for UAV-based yield prediction in rice.
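To make the two modeling strategies concrete, the sketch below shows what a small 3D-CNN yield regressor of this kind could look like in Keras. The patch size, number of flight dates, band count, and layer widths are illustrative assumptions, not the architecture reported in the study; a 2D-CNN variant would simply drop the temporal axis and apply Conv2D layers to a single-date image.

```python
# Hypothetical 3D-CNN for patch-wise yield regression from multitemporal UAV imagery.
# Shapes are assumptions: 6 flight dates, 32x32-pixel patches, 5 spectral/thermal bands.
import tensorflow as tf
from tensorflow.keras import layers, models

N_DATES, PATCH, N_BANDS = 6, 32, 5

model = models.Sequential([
    layers.Input(shape=(N_DATES, PATCH, PATCH, N_BANDS)),
    layers.Conv3D(16, kernel_size=(3, 3, 3), padding="same", activation="relu"),
    layers.MaxPooling3D(pool_size=(1, 2, 2)),   # pool spatially, keep the time axis
    layers.Conv3D(32, kernel_size=(3, 3, 3), padding="same", activation="relu"),
    layers.GlobalAveragePooling3D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1),                            # predicted yield for the patch
])
model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.RootMeanSquaredError()])
```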
Early Detection of Wheat Yellow Rust Disease and Its Impact on Terminal Yield with Multi-Spectral UAV-Imagery
The food production system is more vulnerable to diseases than ever, and the threat is increasing in an era of climate change that creates more favorable conditions for emerging diseases. Fortunately, scientists and engineers are making great strides to introduce farming innovations to tackle the challenge. Unmanned aerial vehicle (UAV) remote sensing is among these innovations and is widely applied for crop health monitoring and phenotyping. This study demonstrated the versatility of aerial remote sensing in diagnosing yellow rust infection in spring wheat in a timely manner and determining an intervenable period to prevent yield loss. A small UAV equipped with an aerial multispectral sensor periodically flew over an experimental field in Chacabuco (−34.64; −60.46), Argentina, collecting remotely sensed images during the 2021 growing season. Plot-level images then underwent a thorough feature-engineering process in which disease-centric vegetation indices (VIs) were handcrafted from the spectral dimension and grey-level co-occurrence matrix (GLCM) texture features from the spatial dimension. A machine learning pipeline comprising a support vector machine (SVM), random forest (RF), and multilayer perceptron (MLP) was constructed to identify healthy, mildly infected, and severely infected plots in the field. A custom 3-dimensional convolutional neural network (3D-CNN), which learns features directly from the imagery, served as an alternative prediction method. The study found the red-edge (690–740 nm) and near-infrared (NIR; 740–1000 nm) bands to be vital for distinguishing healthy and severely infected wheat. The carotenoid reflectance index 2 (CRI2), soil-adjusted vegetation index 2 (SAVI2), and GLCM contrast texture at an optimal distance d = 5 and angular direction θ = 135° were the most correlated features. The 3D-CNN-based wheat disease monitoring achieved 60% detection accuracy as early as 40 days after sowing (DAS), when crops were tillering, increasing to 71% and 77% at the later booting and flowering stages (100–120 DAS), and reaching a peak accuracy of 79% for the spectral-spatio-temporal fused data model. The success of early disease diagnosis from low-cost multispectral UAVs not only sheds new light on crop breeding and pathology but also aids crop growers by identifying a prevention window that could potentially preserve 3–7% of the yield at the 95% confidence level.
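As an illustration of the kind of pipeline described above (not the authors' code), the sketch below computes a GLCM contrast texture at d = 5 and θ = 135° together with two simple spectral indices for a plot, and sets up SVM and random-forest classifiers for the healthy / mild / severe labels. The band scaling, index formulas, and hyperparameters are assumptions.

```python
# Hedged sketch: per-plot VI + GLCM-contrast features feeding SVM / RF classifiers.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # 'greycomatrix' in older scikit-image
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def plot_features(red, red_edge, nir):
    """red, red_edge, nir: 2-D reflectance arrays clipped to one plot."""
    ndre = np.mean((nir - red_edge) / (nir + red_edge + 1e-6))       # red-edge index
    ndvi = np.mean((nir - red) / (nir + red + 1e-6))
    gray = np.uint8(255 * (nir - nir.min()) / (np.ptp(nir) + 1e-6))  # quantize for the GLCM
    glcm = graycomatrix(gray, distances=[5], angles=[3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    contrast = graycoprops(glcm, "contrast")[0, 0]
    return [ndre, ndvi, contrast]

# X: one feature row per plot; y: 0 = healthy, 1 = mild infection, 2 = severe infection
svm_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
rf_clf = RandomForestClassifier(n_estimators=300, random_state=0)
```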
- Award ID(s): 2133407
- PAR ID: 10466254
- Date Published:
- Journal Name: Remote Sensing
- Volume: 15
- Issue: 13
- ISSN: 2072-4292
- Page Range / eLocation ID: 3301
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Accurate, precise, and timely estimation of crop yield is key to a grower's ability to proactively manage crop growth and predict harvest logistics. Such yield predictions typically are based on multi-parametric models and in-situ sampling. Here we investigate the extension of a greenhouse study to low-altitude unmanned aerial systems (UAS). Our principal objective was to investigate snap bean (Phaseolus vulgaris) crop yield using imaging spectroscopy (hyperspectral imaging) in the visible to near-infrared (VNIR; 400–1000 nm) region via UAS. We aimed to solve the problem of crop yield modelling by identifying spectral features explaining yield and evaluating the best time period for accurate, early yield prediction. We introduced a Python library, named Jostar, for spectral feature selection. Embedded in Jostar, we proposed a new ranking method for selected features that reaches an agreement between multiple optimization models. Moreover, we implemented a well-known denoising algorithm for the spectral data used in this study. This study benefited from two years of remotely sensed data, captured at multiple instances over the summers of 2019 and 2020, with 24 plots and 18 plots, respectively. Two harvest stage models, early and late harvest, were assessed at two different locations in upstate New York, USA. Six varieties of snap bean were quantified using two components of yield, pod weight and seed length. We used two different vegetation detection algorithms, the Red-Edge Normalized Difference Vegetation Index (RENDVI) and Spectral Angle Mapper (SAM), to subset the fields into vegetation vs. non-vegetation pixels. Partial least squares regression (PLSR) was used as the regression model. Among nine different optimization models embedded in Jostar, we selected the Genetic Algorithm (GA), Ant Colony Optimization (ACO), Simulated Annealing (SA), and Particle Swarm Optimization (PSO), and their resulting joint ranking. The findings show that pod weight can be explained with a high coefficient of determination (R2 = 0.78–0.93) and low root-mean-square error (RMSE = 940–1369 kg/ha) for two years of data. Seed length yield assessment resulted in higher accuracies (R2 = 0.83–0.98) and lower errors (RMSE = 4.245–6.018 mm). Among the optimization models used, ACO and SA outperformed the others, and the SAM vegetation detection approach showed improved results compared to the RENDVI approach when dense canopies were examined. Wavelengths at 450, 500, 520, 650, 700, and 760 nm were identified in almost all data sets and harvest stage models used. The period between 44 and 55 days after planting (DAP) was the optimal time period for yield assessment. Future work should involve transferring the learned concepts to a multispectral system for eventual operational use; further attention should also be paid to seed length as a ground truth data collection technique, since this yield indicator is far more rapid and straightforward. A minimal PLSR sketch for this kind of modelling is shown below.
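The sketch fits yield (e.g., pod weight) against a handful of selected wavelengths and cross-validates R2 and RMSE. The wavelengths are the ones the abstract reports as recurrent; the component count and data layout are assumptions, and the Jostar feature-selection routines (GA/ACO/SA/PSO ranking) are not reproduced here.

```python
# Hedged sketch of PLSR yield modelling on a few selected hyperspectral bands.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

SELECTED_NM = [450, 500, 520, 650, 700, 760]   # wavelengths reported as recurrent in the study

def fit_plsr(X_full, wavelengths, y, n_components=5):
    """X_full: (n_plots, n_bands) mean plot spectra; wavelengths: band centers in nm; y: yield."""
    wavelengths = np.asarray(wavelengths)
    cols = [int(np.argmin(np.abs(wavelengths - w))) for w in SELECTED_NM]
    X = X_full[:, cols]
    pls = PLSRegression(n_components=min(n_components, X.shape[1]))
    r2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()
    rmse = -cross_val_score(pls, X, y, cv=5, scoring="neg_root_mean_squared_error").mean()
    return pls.fit(X, y), r2, rmse
```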
One of the most important and widespread corn/maize virus diseases is maize dwarf mosaic (MDM), which can be induced by sugarcane mosaic virus (SCMV). This study explores a machine learning analysis of five-band multispectral imagery collected via an unmanned aerial system (UAS) during the 2021 and 2022 seasons for SCMV disease detection in corn fields. The three primary objectives are to (i) determine the spectral bands and vegetation indices that are most important or correlated with SCMV infection in corn, (ii) compare spectral signatures of mock-inoculated and SCMV-inoculated plants, and (iii) compare the performance of four machine learning algorithms, including ridge regression, support vector machine (SVM), random forest, and XGBoost, in predicting SCMV during early and late stages in corn. On average, SCMV-inoculated plants had higher reflectance values for the blue, green, red, and red-edge bands and lower reflectance for near-infrared as compared to mock-inoculated samples. Across both years, the XGBoost regression model performed best for predicting disease incidence percentage (R2 = 0.29, RMSE = 29.26), and SVM classification performed best for the binary prediction of SCMV-inoculated vs. mock-inoculated samples (72.9% accuracy). Generally, model performance appeared to increase as the season progressed into August and September. According to Shapley additive explanations (SHAP) analysis of the top-performing models, the simplified canopy chlorophyll content index (SCCCI) and saturation index (SI) were the vegetation indices that consistently had the strongest impacts on model behavior for SCMV disease regression and classification prediction. The findings of this study demonstrate the potential for the development of UAS image-based tools for farmers, aiming to facilitate the precise identification and mapping of SCMV infection in corn.
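The sketch below illustrates, under assumed feature names and hyperparameters rather than the study's exact configuration, how the reported XGBoost regression and SHAP-based feature ranking could be wired together: fit a regressor on per-plot band reflectance and indices such as SCCCI and SI, then use mean absolute SHAP values to see which features drive the predictions.

```python
# Hedged sketch: XGBoost regression of disease-incidence percentage plus SHAP feature ranking.
import numpy as np
import xgboost as xgb
import shap
from sklearn.model_selection import train_test_split

def rank_features(X, y):
    """X: DataFrame with columns like ['blue', 'green', 'red', 'red_edge', 'nir', 'SCCCI', 'SI'];
    y: disease incidence (%) per plot. Returns the fitted model and mean |SHAP| per feature."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = xgb.XGBRegressor(n_estimators=400, max_depth=4, learning_rate=0.05)
    model.fit(X_tr, y_tr)
    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(X_te)        # (n_samples, n_features)
    return model, np.abs(shap_values).mean(axis=0)
```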
High resolution mapping of coastal habitats is invaluable for resource inventory, change detection, and aquaculture applications. However, coastal areas, especially the interior of mangroves, are often difficult to access. An unmanned aerial vehicle (UAV) equipped with a multispectral sensor affords an opportunity to improve upon satellite imagery for coastal management because of its very high spatial resolution, multispectral capability, and capacity for real-time observations. Despite the recent and rapid development of UAV mapping applications, few articles have quantitatively compared how much UAV multispectral mapping methods improve upon more conventional remote sensing data such as satellite imagery. The objective of this paper is to quantitatively demonstrate the improvements of a multispectral UAV mapping technique, which produces higher resolution images, for advanced mapping and assessment of coastal land cover. We performed multispectral UAV mapping fieldwork trials over Indian River Lagoon along the central Atlantic coast of Florida. Ground control points (GCPs) were collected to generate a rigorously geo-referenced dataset of UAV imagery and to support comparison to geo-referenced satellite and aerial imagery. Multispectral satellite imagery (Sentinel-2) was also acquired to map land cover for the same region. NDVI and object-oriented classification methods were used to compare UAV and satellite mapping capabilities. Compared with aerial images acquired from the Florida Department of Environmental Protection, the UAV multispectral mapping method used in this study provided more detailed information on the physical conditions of the study area, improved land feature delineation, and a significantly better mapping product than coarser-resolution satellite imagery. The study demonstrates a replicable UAV multispectral mapping method useful for study sites that lack high quality data.
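As a small illustration of the NDVI-based comparison mentioned above, the sketch below differences UAV-derived and Sentinel-2-derived NDVI over a shared grid; co-registration and resampling (e.g., via the GCPs) are assumed to be handled elsewhere, and the function names are purely illustrative.

```python
# Hedged sketch: compare NDVI from co-registered UAV and Sentinel-2 red/NIR bands.
import numpy as np

def ndvi(red, nir):
    return (nir - red) / (nir + red + 1e-6)

def ndvi_difference(uav_red, uav_nir, s2_red, s2_nir):
    """All inputs assumed resampled to the same grid and extent."""
    return ndvi(uav_red, uav_nir) - ndvi(s2_red, s2_nir)
```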
This study uses a small unmanned aircraft system equipped with a multispectral sensor to assess various vegetation indices (VIs) for their potential to monitor iron deficiency chlorosis (IDC) in a grain sorghum (Sorghum bicolor L.) crop. IDC is a nutritional disorder that stunts a plant's growth and causes its leaves to yellow due to an iron deficit. The objective of this project is to find the best VI to detect and monitor IDC. A series of flights was completed over the course of the growing season and processed using Structure-from-Motion photogrammetry to create orthorectified, multispectral reflectance maps in the red, green, red-edge, and near-infrared wavelengths. Ground data collection methods were used to analyze stress, chlorophyll levels, and grain yield, correlating them to the multispectral imagery for ground control and precise crop examination. The reflectance maps and soil-removed reflectance maps were used to calculate 25 VIs, whose separability was then calculated using a two-class distance measure to determine which contained the largest separation between the pixels representing IDC and healthy vegetation. The field-acquired data were used to conclude which VIs achieved the best results for the dataset as a whole and at each level of IDC (low, moderate, and severe). It was concluded that the MERIS terrestrial chlorophyll index, normalized difference red-edge, and normalized green (NG) indices achieved the highest amount of separation between plants with IDC and healthy vegetation, with the NG index reaching the highest levels of separability for both soil-included and soil-removed VIs.
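To illustrate the separability screening described above, the sketch below computes two of the named indices (normalized difference red-edge and normalized green) and scores a candidate VI with a simple two-class distance measure between IDC and healthy pixels. The M-statistic and the index formulas used here are common choices but are assumptions; the study's exact definitions may differ.

```python
# Hedged sketch: candidate VIs and a simple two-class separability score.
import numpy as np

def ndre(nir, red_edge):
    """Normalized difference red-edge."""
    return (nir - red_edge) / (nir + red_edge + 1e-6)

def normalized_green(green, red, nir):
    """Normalized green: green reflectance over the sum of green, red, and NIR."""
    return green / (green + red + nir + 1e-6)

def separability(vi_idc, vi_healthy):
    """M-statistic: |mean1 - mean2| / (std1 + std2); larger means better class separation."""
    return abs(vi_idc.mean() - vi_healthy.mean()) / (vi_idc.std() + vi_healthy.std() + 1e-6)
```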