
Title: Influence of soil heterogeneity on soybean plant development and crop yield evaluated using time-series of UAV and ground-based geophysical imagery
Abstract: Understanding the interactions among agricultural processes, soil, and plants is necessary for optimizing crop yield and productivity. This study focuses on developing effective monitoring and analysis methodologies that estimate key soil and plant properties. These methodologies include data acquisition and processing approaches that use unmanned aerial vehicles (UAVs) and surface geophysical techniques. In particular, we applied these approaches to a soybean farm in Arkansas to characterize the soil–plant coupled spatial and temporal heterogeneity, as well as to identify key environmental factors that influence plant growth and yield. UAV-based multitemporal acquisition of high-resolution RGB (red–green–blue) imagery and direct measurements were used to monitor plant height and photosynthetic activity. We present an algorithm that efficiently exploits the high-resolution UAV images to estimate plant spatial abundance and plant vigor throughout the growing season. Such plant characterization is extremely important for the identification of anomalous areas, providing easily interpretable information that can be used to guide near-real-time farming decisions. Additionally, high-resolution multitemporal surface geophysical measurements of apparent soil electrical conductivity were used to estimate the spatial heterogeneity of soil texture. By integrating the multiscale multitype soil and plant datasets, we identified the spatiotemporal co-variance between soil properties and plant development and yield. Our novel approach for early season monitoring of plant spatial abundance identified areas of low productivity controlled by soil clay content, while temporal analysis of geophysical data showed the impact of soil moisture and irrigation practice (controlled by topography) on plant dynamics. Our study demonstrates the effective coupling of UAV data products with geophysical data to extract critical information for farm management.
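The abstract does not specify how plant spatial abundance is computed from the RGB imagery; one common approach, offered here only as an illustrative sketch (function names and the threshold value are assumptions, not the study's actual algorithm), is to threshold a greenness index such as Excess Green (ExG) and take the fraction of pixels classified as plant:

```python
def excess_green(r, g, b):
    # Excess Green index (ExG = 2G - R - B), a widely used vegetation
    # indicator for RGB imagery
    return 2 * g - r - b

def fractional_cover(pixels, threshold=20):
    # Fraction of pixels classified as vegetation by thresholding ExG;
    # the threshold of 20 (8-bit digital-number units) is purely illustrative
    plant = sum(1 for (r, g, b) in pixels if excess_green(r, g, b) > threshold)
    return plant / len(pixels)
```

Applied per image tile over a time series, such a fraction gives a simple plant-abundance proxy whose anomalously low values flag the low-productivity areas described above.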
Award ID(s):
Publication Date:
Journal Name: Scientific Reports
Sponsoring Org: National Science Foundation
More Like this
  1. Unmanned aerial vehicles (UAVs) equipped with multispectral sensors offer high spatial and temporal resolution imagery for monitoring crop stress at early stages of development. Analysis of UAV-derived data with advanced machine learning models could improve real-time management in agricultural systems, but guidance for this integration is currently limited. Here we compare two deep learning-based strategies for early warning detection of crop stress, using multitemporal imagery throughout the growing season to predict field-scale yield in irrigated rice in eastern Arkansas. Both deep learning strategies showed improvements over traditional statistical learning approaches including linear regression and gradient boosted decision trees. First, we explicitly accounted for variation across developmental stages using a 3D convolutional neural network (CNN) architecture that captures both spatial and temporal dimensions of UAV images from multiple time points throughout one growing season. 3D-CNNs achieved low prediction error on the test set, with a Root Mean Squared Error (RMSE) of 8.8% of the mean yield. For the second strategy, a 2D-CNN, we considered only spatial relationships among pixels for image features acquired during a single flyover. 2D-CNNs trained on images from a single day were most accurate when images were taken during booting stage or later, with RMSE ranging from 7.4 to 8.2% of the mean yield. A primary benefit of convolutional autoencoder-like models (based on analyses of prediction maps and feature importance) is the spatial denoising effect that corrects yield predictions for individual pixels based on the values of vegetation index and thermal features for nearby pixels. Our results highlight the promise of convolutional autoencoders for UAV-based yield prediction in rice.
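The yield-error metric quoted above, RMSE expressed as a percentage of the mean observed yield, is straightforward to compute; a minimal sketch (the function name is illustrative):

```python
import math

def rmse_percent_of_mean(observed, predicted):
    # Root mean squared error expressed as a percentage of the mean
    # observed value, the yield-error metric quoted in the abstract
    n = len(observed)
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    return 100.0 * rmse / (sum(observed) / n)
```

Normalizing by the mean yield makes errors comparable across fields and seasons with different absolute yield levels, which is why the abstracts report 7.4–8.8% rather than raw units.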
  2. Coastal salt marshes are biologically productive ecosystems that generate and sequester significant quantities of organic matter. Plant biomass varies spatially within a salt marsh, and it is tedious and often logistically impractical to quantify biomass from field measurements across an entire landscape. Satellite data are useful for estimating aboveground biomass; however, high-resolution data are needed to resolve the spatial details within a salt marsh. This study used 3-m resolution multispectral data provided by Planet to estimate aboveground biomass within two salt marshes, the North Inlet-Winyah Bay (North Inlet) National Estuarine Research Reserve and the Plum Island Ecosystems (PIE) Long-Term Ecological Research site. An Akaike information criterion analysis was performed to test the fidelity of several alternative models. A combination of the modified soil-adjusted vegetation index 2 (MSAVI2) and the visible difference vegetation index (VDVI) gave the best fit to the square root-normalized biomass data collected in the field at North Inlet (Willmott's index of agreement d = 0.74, RMSE = 223.38 g/m², AICw = 0.3848). An acceptable model was not found among all models tested for PIE data, possibly because the sample size at PIE was too small, samples were collected over a limited vertical range, in a different season, and from areas with variable canopy architecture. For North Inlet, a model-derived landscape-scale biomass map showed differences in biomass density among sites and years, and a robust relationship between elevation and biomass. The growth curve established in this study is particularly useful as an input for biogeomorphic models of marsh development. This study showed that, used in an appropriate model with calibration, Planet data are suitable for computing and mapping aboveground biomass at high resolution on a landscape scale, which is needed to better understand spatial and temporal trends in salt marsh primary production.
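The two vegetation indices combined in the best-fit model above have standard closed-form definitions; a minimal sketch computing both from per-pixel band reflectances (scalar inputs and function names are illustrative, and how the study scaled or composited the bands is not stated):

```python
import math

def msavi2(nir, red):
    # Modified Soil-Adjusted Vegetation Index 2 (standard closed form),
    # designed to reduce bare-soil influence on the vegetation signal
    return (2 * nir + 1 - math.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

def vdvi(red, green, blue):
    # Visible Difference Vegetation Index, computed from visible bands only
    return (2 * green - red - blue) / (2 * green + red + blue)
```

MSAVI2 uses the near-infrared band, while VDVI needs only the visible bands, so combining them draws on complementary parts of the Planet sensor's spectral range.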
  3. Flooding is one of the leading natural disaster threats to human life and property, especially in densely populated urban areas. Rapid and precise extraction of flooded areas is key to supporting emergency-response planning and providing damage assessment in both spatial and temporal measurements. Unmanned aerial vehicle (UAV) technology has recently been recognized as an efficient photogrammetry data acquisition platform that can quickly deliver high-resolution imagery because of its cost-effectiveness, ability to fly at lower altitudes, and ability to enter hazardous areas. Different image classification methods, including support vector machines (SVMs), have been used for flood extent mapping. In recent years, there has been significant improvement in remote sensing image classification using convolutional neural networks (CNNs). CNNs have demonstrated excellent performance on various tasks including image classification, feature extraction, and segmentation. CNNs can learn features automatically from large datasets through the organization of multiple layers of neurons and have the ability to implement nonlinear decision functions. This study investigates the potential of CNN approaches to extract flooded areas from UAV imagery. A VGG-based fully convolutional network (FCN-16s) was used in this research. The model was fine-tuned, and k-fold cross-validation was applied to estimate the performance of the model on the new UAV imagery dataset. This approach allowed FCN-16s to be trained on datasets that contained only one hundred training samples and resulted in a highly accurate classification. A confusion matrix was calculated to estimate the accuracy of the proposed method. The image segmentation results obtained from FCN-16s were compared with the results obtained from FCN-8s, FCN-32s, and SVMs. Experimental results showed that the FCNs could extract flooded areas precisely from UAV images compared to traditional classifiers such as SVMs. The classification accuracy achieved by FCN-16s, FCN-8s, FCN-32s, and SVM for the water class was 97.52%, 97.8%, 94.20%, and 89%, respectively.
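Per-class accuracies like those reported for the water class are typically read off the confusion matrix; a sketch, assuming (since the abstract does not say) that producer's accuracy (recall) is the metric meant:

```python
def producers_accuracy(confusion, class_idx):
    # Producer's accuracy (recall) for one class: correctly classified
    # pixels of that class divided by all reference pixels of the class.
    # confusion[i][j] = number of pixels of true class i assigned to class j
    row = confusion[class_idx]
    return row[class_idx] / sum(row)
```

User's accuracy (precision) would instead divide by the column sum, i.e. all pixels predicted as that class; which of the two a study reports should always be stated alongside the figure.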
  4. Abstract. Environmental science is increasingly reliant on remotely sensed observations of the Earth's surface and atmosphere. Observations from polar-orbiting satellites have long supported investigations on land cover change, ecosystem productivity, hydrology, climate, the impacts of disturbance, and more, and are critical for extrapolating (upscaling) ground-based measurements to larger areas. However, the limited temporal frequency at which polar-orbiting satellites observe the Earth limits our understanding of rapidly evolving ecosystem processes, especially in areas with frequent cloud cover. Geostationary satellites have observed the Earth's surface and atmosphere at high temporal frequency for decades, and their imagers now have spectral resolutions in the visible and near-infrared regions that are comparable to commonly used polar-orbiting sensors like the Moderate Resolution Imaging Spectroradiometer (MODIS), Visible Infrared Imaging Radiometer Suite (VIIRS), or Landsat. These advances extend applications of geostationary Earth observations from weather monitoring to multiple disciplines in ecology and environmental science. We review a number of existing applications that use data from geostationary platforms and present upcoming opportunities for observing key ecosystem properties using high-frequency observations from the Advanced Baseline Imagers (ABI) on the Geostationary Operational Environmental Satellites (GOES), which routinely observe the Western Hemisphere every 5–15 min. Many of the existing applications in environmental science from ABI are focused on estimating land surface temperature, solar radiation, evapotranspiration, and biomass burning emissions along with detecting rapid drought development and wildfire. Ongoing work in estimating vegetation properties and phenology from other geostationary platforms demonstrates the potential to expand ABI observations to estimate vegetation greenness, moisture, and productivity at a high temporal frequency across the Western Hemisphere. Finally, we present emerging opportunities to address the relatively coarse resolution of ABI observations through multisensor fusion to resolve landscape heterogeneity and to leverage observations from ABI to study the carbon cycle and ecosystem function at unprecedented temporal frequency.
  5. Phenology is a distinct marker of the impacts of climate change on ecosystems. Accordingly, monitoring the spatiotemporal patterns of vegetation phenology is important to understand the changing Earth system. A wide range of sensors have been used to monitor vegetation phenology, including digital cameras with different viewing geometries mounted on various types of platforms. Sensor perspective, view-angle, and resolution can potentially impact estimates of phenology. We compared three different methods of remotely sensing vegetation phenology—an unoccupied aerial vehicle (UAV)-based, downward-facing RGB camera, a below-canopy, upward-facing hemispherical camera with blue (B), green (G), and near-infrared (NIR) bands, and a tower-based RGB PhenoCam, positioned at an oblique angle to the canopy—to estimate spring phenological transition towards canopy closure in a mixed-species temperate forest in central Virginia, USA. Our study had two objectives: (1) to compare the above- and below-canopy inference of canopy greenness (using green chromatic coordinate and normalized difference vegetation index) and canopy structural attributes (leaf area and gap fraction) by matching below-canopy hemispherical photos with high spatial resolution (0.03 m) UAV imagery, to find the appropriate spatial coverage and resolution for comparison; (2) to compare how UAV, ground-based, and tower-based imagery performed in estimating the timing of the spring phenological transition. We found that a spatial buffer of 20 m radius for UAV imagery is most closely comparable to below-canopy imagery in this system. Sensors and platforms agree within +/− 5 days of when canopy greenness stabilizes from the spring phenophase into the growing season. We show that pairing UAV imagery with tower-based observation platforms and plot-based observations for phenological studies (e.g., long-term monitoring, existing research networks, and permanent plots) has the potential to scale plot-based forest structural measures via UAV imagery, constrain uncertainty estimates around phenophases, and more robustly assess site heterogeneity.
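The green chromatic coordinate (GCC) used above as a canopy-greenness measure is a standard PhenoCam metric with a simple definition, sketched here (the function name is illustrative):

```python
def gcc(r, g, b):
    # Green chromatic coordinate: the green band's share of total RGB
    # brightness; normalizing by overall brightness suppresses
    # illumination changes between image acquisitions
    return g / (r + g + b)
```

Because GCC is a ratio, it tracks relative greening through spring canopy closure even as overall scene brightness varies, which is what makes it comparable across the UAV, below-canopy, and tower-based cameras in the study.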