Title: Influence of soil heterogeneity on soybean plant development and crop yield evaluated using time-series of UAV and ground-based geophysical imagery
Abstract Understanding the interactions among agricultural processes, soil, and plants is necessary for optimizing crop yield and productivity. This study focuses on developing effective monitoring and analysis methodologies that estimate key soil and plant properties. These methodologies include data acquisition and processing approaches that use unmanned aerial vehicles (UAVs) and surface geophysical techniques. In particular, we applied these approaches to a soybean farm in Arkansas to characterize the soil–plant coupled spatial and temporal heterogeneity, as well as to identify key environmental factors that influence plant growth and yield. UAV-based multitemporal acquisition of high-resolution RGB (red–green–blue) imagery and direct measurements were used to monitor plant height and photosynthetic activity. We present an algorithm that efficiently exploits the high-resolution UAV images to estimate plant spatial abundance and plant vigor throughout the growing season. Such plant characterization is extremely important for the identification of anomalous areas, providing easily interpretable information that can be used to guide near-real-time farming decisions. Additionally, high-resolution multitemporal surface geophysical measurements of apparent soil electrical conductivity were used to estimate the spatial heterogeneity of soil texture. By integrating the multiscale multitype soil and plant datasets, we identified the spatiotemporal co-variance between soil properties and plant development and yield. Our novel approach for early season monitoring of plant spatial abundance identified areas of low productivity controlled by soil clay content, while temporal analysis of geophysical data showed the impact of soil moisture and irrigation practice (controlled by topography) on plant dynamics. Our study demonstrates the effective coupling of UAV data products with geophysical data to extract critical information for farm management.
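The abstract does not spell out the plant-abundance algorithm, but a common way to estimate canopy cover from UAV RGB imagery is excess-green (ExG) thresholding. The sketch below is a minimal illustration under that assumption; the function name, the 0.10 threshold, and the array shapes are placeholders rather than the authors' implementation.

    import numpy as np

    def plant_fractional_cover(rgb, exg_threshold=0.10):
        """Estimate plant fractional cover for one UAV RGB tile.

        rgb: float array of shape (H, W, 3), values scaled to [0, 1].
        Computes the excess green index ExG = 2g - r - b on chromatic
        coordinates, then thresholds it to separate canopy from soil.
        """
        total = rgb.sum(axis=2) + 1e-6              # avoid division by zero
        r, g, b = (rgb[..., i] / total for i in range(3))
        exg = 2.0 * g - r - b                       # excess green index
        canopy = exg > exg_threshold                # crude plant/soil split
        return float(canopy.mean())                 # fraction of canopy pixels

    # Toy example with a synthetic tile; real use would load orthomosaic tiles
    tile = np.random.rand(256, 256, 3)
    print(plant_fractional_cover(tile))

Per-plot cover fractions computed this way over successive flights give the kind of early-season abundance time series that can be used to flag low-productivity areas.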
Award ID(s): 1946391
NSF-PAR ID: 10321608
Journal Name: Scientific Reports
Volume: 11
Issue: 1
ISSN: 2045-2322
Sponsoring Org: National Science Foundation
More Like this
  1. Unmanned aerial vehicles (UAVs) equipped with multispectral sensors offer high spatial and temporal resolution imagery for monitoring crop stress at early stages of development. Analysis of UAV-derived data with advanced machine learning models could improve real-time management in agricultural systems, but guidance for this integration is currently limited. Here we compare two deep learning-based strategies for early warning detection of crop stress, using multitemporal imagery throughout the growing season to predict field-scale yield in irrigated rice in eastern Arkansas. Both deep learning strategies showed improvements upon traditional statistical learning approaches including linear regression and gradient boosted decision trees. First, we explicitly accounted for variation across developmental stages using a 3D convolutional neural network (CNN) architecture that captures both spatial and temporal dimensions of UAV images from multiple time points throughout one growing season. 3D-CNNs achieved low prediction error on the test set, with a Root Mean Squared Error (RMSE) of 8.8% of the mean yield. For the second strategy, a 2D-CNN, we considered only spatial relationships among pixels for image features acquired during a single flyover. 2D-CNNs trained on images from a single day were most accurate when images were taken during booting stage or later, with RMSE ranging from 7.4 to 8.2% of the mean yield. A primary benefit of convolutional autoencoder-like models (based on analyses of prediction maps and feature importance) is the spatial denoising effect that corrects yield predictions for individual pixels based on the values of vegetation index and thermal features for nearby pixels. Our results highlight the promise of convolutional autoencoders for UAV-based yield prediction in rice. 
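    As a rough illustration of the 3D-CNN strategy described above, the sketch below (assuming a PyTorch implementation) treats flight date as a third convolutional dimension over stacked vegetation-index and thermal rasters. The channel count, number of flight dates, patch size, and layer widths are illustrative assumptions, not the architecture used in that study.

        import torch
        import torch.nn as nn

        class Yield3DCNN(nn.Module):
            """Minimal 3D-CNN; input shape (batch, channels, time, height, width)."""
            def __init__(self, in_channels=5):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv3d(in_channels, 16, kernel_size=3, padding=1),
                    nn.ReLU(),
                    nn.MaxPool3d((1, 2, 2)),           # pool space, keep time
                    nn.Conv3d(16, 32, kernel_size=3, padding=1),
                    nn.ReLU(),
                    nn.AdaptiveAvgPool3d(1),           # collapse time and space
                )
                self.regressor = nn.Linear(32, 1)      # per-patch yield estimate

            def forward(self, x):
                z = self.features(x).flatten(1)
                return self.regressor(z).squeeze(1)

        # x: 8 field patches, 5 raster channels, 6 flight dates, 64x64 pixels
        x = torch.randn(8, 5, 6, 64, 64)
        model = Yield3DCNN()
        pred = model(x)                                 # predicted yield per patch
        loss = nn.functional.mse_loss(pred, torch.randn(8))

    A 2D-CNN variant for a single flyover would simply drop the time dimension and apply Conv2d layers to one date's rasters.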
  2. Flooding is one of the most destructive natural hazards threatening human life and property, especially in densely populated urban areas. Rapid and precise extraction of flooded areas is key to supporting emergency-response planning and providing damage assessments resolved in both space and time. Unmanned aerial vehicle (UAV) technology has recently been recognized as an efficient photogrammetric data acquisition platform that can quickly deliver high-resolution imagery because of its cost-effectiveness, its ability to fly at low altitudes, and its ability to enter hazardous areas. Different image classification methods, including the Support Vector Machine (SVM), have been used for flood extent mapping. In recent years, there has been significant improvement in remote sensing image classification using Convolutional Neural Networks (CNNs). CNNs have demonstrated excellent performance on various tasks, including image classification, feature extraction, and segmentation. CNNs can learn features automatically from large datasets through the organization of multiple layers of neurons and can implement nonlinear decision functions. This study investigates the potential of CNN approaches to extract flooded areas from UAV imagery. A VGG-based fully convolutional network (FCN-16s) was used in this research. The model was fine-tuned, and k-fold cross-validation was applied to estimate its performance on the new UAV imagery dataset. This approach allowed FCN-16s to be trained on a dataset containing only one hundred training samples and still achieve highly accurate classification. A confusion matrix was calculated to estimate the accuracy of the proposed method. The image segmentation results obtained from FCN-16s were compared with those obtained from FCN-8s, FCN-32s, and SVMs. Experimental results showed that the FCNs extracted flooded areas from UAV images more precisely than traditional classifiers such as SVMs. The classification accuracy achieved by FCN-16s, FCN-8s, FCN-32s, and SVM for the water class was 97.52%, 97.8%, 94.20%, and 89%, respectively.
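    The FCN-16s above is a VGG-backbone fully convolutional network fine-tuned for water/non-water segmentation. The sketch below stands in for that idea using torchvision's off-the-shelf FCN with a ResNet-50 backbone; this is an assumed substitute rather than the paper's model, and the tile size, batch size, and labels are synthetic placeholders.

        import torch
        import torch.nn as nn
        from torchvision.models.segmentation import fcn_resnet50

        # Binary segmentation: class 0 = non-water, class 1 = water
        model = fcn_resnet50(num_classes=2)

        images = torch.randn(4, 3, 256, 256)             # batch of UAV RGB tiles
        labels = torch.randint(0, 2, (4, 256, 256))      # per-pixel flood labels

        logits = model(images)["out"]                    # shape (4, 2, 256, 256)
        loss = nn.functional.cross_entropy(logits, labels)
        loss.backward()

        # Per-pixel prediction and a simple water-class recall check
        pred = logits.argmax(dim=1)
        water_recall = ((pred == 1) & (labels == 1)).sum() / (labels == 1).sum()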
  3. Abstract

    Vegetation phenology—the seasonal timing and duration of vegetative phases—is controlled by spatiotemporally variable contributions of climatic and environmental factors plus additional potential influence from human management. We used land surface phenology derived from the Advanced Very High Resolution Radiometer and climate data to examine variability in vegetation productivity and phenological dates from 1989 to 2014 in the U.S. Northwestern Plains, a region with notable spatial heterogeneity in climate, vegetation, and land use. We first analyzed interannual trends in six phenological measures as a baseline. We then demonstrated how including annual‐resolution predictors can provide more nuanced insights into measures of phenology between plant communities and across the ecoregion. Across the study area, higher annual precipitation increased both peak and season‐long productivity. In contrast, higher mean annual temperatures tended to increase peak productivity but for the majority of the study area decreased season‐long productivity. Annual precipitation and temperature had strong explanatory power for productivity‐related phenology measures but predicted date‐based measures poorly. We found that relationships between climate and phenology varied across the region and among plant communities and that factors such as recovery from disturbance and anthropogenic management also contributed in certain regions. In sum, phenological measures did not respond ubiquitously nor covary in their responses. Nonclimatic dynamics can decouple phenology from climate; therefore, analyses including only interannual trends should not assume climate alone drives patterns. For example, models of areas exhibiting greening or browning should account for climate, anthropogenic influence, and natural disturbances. Investigating multiple aspects of phenology to describe growing‐season dynamics provides a richer understanding of spatiotemporal patterns that can be used for predicting ecosystem responses to future climates and land‐use change. Such understanding allows for clearer interpretation of results for conservation, wildlife, and land management.

     
  4. Crop yield is related to household food security and community resilience, especially in smallholder agricultural systems. As such, it is crucial to accurately estimate within-season yield in order to provide critical information for farm management and decision making. Therefore, the primary objective of this paper is to assess the most appropriate method, indices, and growth stage for predicting the groundnut yield in smallholder agricultural systems in northern Malawi. We have estimated the yield of groundnut in two smallholder farms using the observed yield and vegetation indices (VIs), which were derived from multitemporal PlanetScope satellite data. Simple linear, multiple linear (MLR), and random forest (RF) regressions were applied for the prediction. The leave-one-out cross-validation method was used to validate the models. The results showed that (i) of the modelling approaches, the RF model using the five most important variables (RF5) was the best approach for predicting the groundnut yield, with a coefficient of determination (R2) of 0.96 and a root mean square error (RMSE) of 0.29 kg/ha, followed by the MLR model (R2 = 0.84, RMSE = 0.84 kg/ha); in addition, (ii) the best within-season stage to accurately predict groundnut yield is during the R5/beginning seed stage. The RF5 model was used to estimate the yield for four different farms. The estimated yields were compared with the total reported yields from the farms. The results revealed that the RF5 model generally accurately estimated the groundnut yields, with the margins of error ranging between 0.85% and 11%. The errors are within the post-harvest loss margins in Malawi. The results indicate that the observed yield and VIs, which were derived from open-source remote sensing data, can be applied to estimate yield in order to facilitate farming and food security planning. 
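    A minimal sketch of the random-forest plus leave-one-out cross-validation workflow described above, assuming scikit-learn; the five-feature design (standing in for the five most important vegetation indices), the synthetic yields, and all hyperparameters are illustrative, not the study's data or settings.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.metrics import r2_score, mean_squared_error

        # X: one row per plot, columns = five vegetation-index features
        # from a single acquisition date (placeholder values)
        rng = np.random.default_rng(0)
        X = rng.random((20, 5))
        y = 2.0 * X[:, 0] + rng.normal(0.0, 0.1, 20)     # synthetic yields

        rf = RandomForestRegressor(n_estimators=500, random_state=0)
        y_hat = cross_val_predict(rf, X, y, cv=LeaveOneOut())

        print("R2  :", round(r2_score(y, y_hat), 2))
        print("RMSE:", round(float(np.sqrt(mean_squared_error(y, y_hat))), 3))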
  5. Abstract

    Livestock agriculture accounts for ∼15% of global anthropogenic greenhouse gas (GHG) emissions. Recently, natural climate solutions (NCS) have been identified to mitigate farm‐scale GHG emissions. Nevertheless, their impacts are difficult to quantify because of farm spatial heterogeneity and the effort required to measure changes in carbon stocks. Remote sensing (RS) models are difficult to parameterize for heterogeneous agricultural landscapes. Eddy covariance (EC), in combination with novel techniques that quantitatively match source area variations, could help update such vegetation‐specific parameters while accounting for pronounced heterogeneity. We evaluate a plant physiological parameter, the maximum quantum yield (MQY), which is commonly used to calculate gross and net primary productivity in RS applications. RS models often rely on a spatially invariant MQY, which leads to inconsistencies between RS and EC models. We evaluate whether EC data improve RS models by updating crop‐specific MQYs to quantify agricultural GHG mitigation potentials. We assessed how farm harvest compared to annual sums of (a) RS models without improvements, (b) EC results, and (c) combined EC‐RS models. We then estimated emissions to calculate the annual GHG balance, including mitigation through plant carbon uptake. Our results indicate that the EC‐RS models significantly improved the prediction of crop yields. The EC model captures diurnal variations in carbon dynamics, in contrast to RS models, which are constrained by their input data. A net zero GHG balance indicated that perennial vegetation mitigated over 60% of emissions while comprising 40% of the landscape. We conclude that combining RS and EC can improve the quantification of NCS in agroecosystems.

     
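    In light-use-efficiency style RS models of the kind referred to above, productivity is commonly approximated as GPP ≈ MQY × fAPAR × PAR (with additional stress scalars), so updating a crop-specific MQY from EC data amounts to refitting one parameter against EC-derived GPP. A hedged sketch under that assumption, with synthetic numbers and no stress terms:

        import numpy as np

        def gpp_lue(par, fapar, mqy):
            """Light-use-efficiency model: GPP = MQY * fAPAR * PAR."""
            return mqy * fapar * par

        def fit_mqy(par, fapar, gpp_ec):
            """Least-squares estimate of a crop-specific MQY from EC-derived GPP."""
            apar = par * fapar                        # absorbed PAR
            return float(np.sum(apar * gpp_ec) / np.sum(apar ** 2))

        # Synthetic daily inputs (placeholders, not site data)
        par = np.linspace(5.0, 12.0, 30)              # incident PAR, MJ m-2 d-1
        fapar = np.full(30, 0.7)                      # fraction of PAR absorbed
        gpp_ec = gpp_lue(par, fapar, mqy=1.8)         # pretend EC data imply MQY of 1.8
        print(fit_mqy(par, fapar, gpp_ec))            # recovers ~1.8 g C per MJ APAR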