Unmanned aerial vehicles (UAVs) equipped with multispectral sensors offer high spatial and temporal resolution imagery for monitoring crop stress at early stages of development. Analysis of UAV-derived data with advanced machine learning models could improve real-time management in agricultural systems, but guidance for this integration is currently limited. Here we compare two deep learning-based strategies for early warning detection of crop stress, using multitemporal imagery throughout the growing season to predict field-scale yield in irrigated rice in eastern Arkansas. Both deep learning strategies improved upon traditional statistical learning approaches, including linear regression and gradient-boosted decision trees. First, we explicitly accounted for variation across developmental stages using a 3D convolutional neural network (CNN) architecture that captures both spatial and temporal dimensions of UAV images from multiple time points throughout one growing season. 3D-CNNs achieved low prediction error on the test set, with a Root Mean Squared Error (RMSE) of 8.8% of the mean yield. For the second strategy, a 2D-CNN, we considered only spatial relationships among pixels for image features acquired during a single flyover. 2D-CNNs trained on images from a single day were most accurate when images were taken during booting stage or later, with RMSE ranging from …
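The abstract above reports RMSE as a percentage of the mean yield. A minimal sketch of that relative-error metric is below; the yield values and predictions are hypothetical placeholders, not the study's data or model output.

```python
import numpy as np

def relative_rmse(y_true, y_pred):
    """RMSE expressed as a percentage of the mean observed value."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return 100.0 * rmse / y_true.mean()

# Hypothetical field-scale yields and model predictions (illustrative only)
obs = [180.0, 165.0, 172.0, 190.0]
pred = [175.0, 170.0, 168.0, 186.0]
print(round(relative_rmse(obs, pred), 2))
```

Normalizing by the mean yield, as done here, lets error be compared across fields or seasons with different absolute yield levels.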
Influence of soil heterogeneity on soybean plant development and crop yield evaluated using time-series of UAV and ground-based geophysical imagery
Abstract Understanding the interactions among agricultural processes, soil, and plants is necessary for optimizing crop yield and productivity. This study focuses on developing effective monitoring and analysis methodologies that estimate key soil and plant properties. These methodologies include data acquisition and processing approaches that use unmanned aerial vehicles (UAVs) and surface geophysical techniques. In particular, we applied these approaches to a soybean farm in Arkansas to characterize the soil–plant coupled spatial and temporal heterogeneity, as well as to identify key environmental factors that influence plant growth and yield. UAV-based multitemporal acquisition of high-resolution RGB (red–green–blue) imagery and direct measurements were used to monitor plant height and photosynthetic activity. We present an algorithm that efficiently exploits the high-resolution UAV images to estimate plant spatial abundance and plant vigor throughout the growing season. Such plant characterization is extremely important for the identification of anomalous areas, providing easily interpretable information that can be used to guide near-real-time farming decisions. Additionally, high-resolution multitemporal surface geophysical measurements of apparent soil electrical conductivity were used to estimate the spatial heterogeneity of soil texture. By integrating the multiscale, multitype soil and plant datasets, we identified the spatiotemporal co-variance between soil properties and plant development and yield. Our novel …
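The abstract's plant-abundance algorithm is not specified in this excerpt, but the general idea of classifying vegetation pixels in RGB imagery can be sketched with the standard excess-green (ExG) index. This is an illustrative stand-in, not the authors' method; the threshold value is an assumption.

```python
import numpy as np

def excess_green_fraction(rgb, threshold=0.1):
    """Fraction of pixels classified as plant using the excess-green (ExG) index.

    rgb: H x W x 3 array with reflectance-like values in [0, 1].
    ExG = 2g - r - b on chromatic (normalized) coordinates; pixels above
    the threshold are counted as vegetation.
    """
    total = rgb.sum(axis=2) + 1e-9          # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2 * g - r - b
    return float((exg > threshold).mean())

# Synthetic tile: left half green canopy, right half brownish bare soil
tile = np.zeros((10, 10, 3))
tile[:, :5] = [0.2, 0.6, 0.2]
tile[:, 5:] = [0.5, 0.4, 0.3]
print(excess_green_fraction(tile))
```

Mapping this fraction per grid cell over a field, flight by flight, yields the kind of spatial abundance time series the abstract describes.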
- Award ID(s): 1946391
- Publication Date:
- NSF-PAR ID: 10321608
- Journal Name: Scientific Reports
- Volume: 11
- Issue: 1
- ISSN: 2045-2322
- Sponsoring Org: National Science Foundation
More Like this
Coastal salt marshes are biologically productive ecosystems that generate and sequester significant quantities of organic matter. Plant biomass varies spatially within a salt marsh and it is tedious and often logistically impractical to quantify biomass from field measurements across an entire landscape. Satellite data are useful for estimating aboveground biomass; however, high-resolution data are needed to resolve the spatial details within a salt marsh. This study used 3-m resolution multispectral data provided by Planet to estimate aboveground biomass within two salt marshes, North Inlet-Winyah Bay (North Inlet) National Estuary Research Reserve, and Plum Island Ecosystems (PIE) Long-Term Ecological Research site. The Akaike information criterion analysis was performed to test the fidelity of several alternative models. A combination of the modified soil-adjusted vegetation index 2 (MSAVI2) and the visible difference vegetation index (VDVI) gave the best fit to the square root-normalized biomass data collected in the field at North Inlet (Willmott's index of agreement d = 0.74, RMSE = 223.38 g/m2, AICw = 0.3848). An acceptable model was not found among all models tested for PIE data, possibly because the sample size at PIE was too small, samples were collected over a limited vertical range, in a different season, and from …
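The two indices named in this abstract have standard closed-form definitions, which can be sketched as follows. The reflectance values in the usage example are arbitrary illustrations, not data from either marsh site.

```python
import numpy as np

def msavi2(nir, red):
    """Modified Soil-Adjusted Vegetation Index 2 (band values in [0, 1])."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (2 * nir + 1 - np.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

def vdvi(red, green, blue):
    """Visible-band Difference Vegetation Index, using only RGB bands."""
    red, green, blue = (np.asarray(a, float) for a in (red, green, blue))
    return (2 * green - red - blue) / (2 * green + red + blue)

# Arbitrary example reflectances for a vegetated pixel
print(round(float(msavi2(0.5, 0.1)), 4))
print(round(float(vdvi(0.1, 0.4, 0.1)), 4))
```

MSAVI2 suppresses the soil-background signal that inflates NDVI over sparse canopies, while VDVI needs no near-infrared band, which is why the two make a complementary pair of predictors.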
Flooding is one of the leading threats of natural disasters to human life and property, especially in densely populated urban areas. Rapid and precise extraction of the flooded areas is key to supporting emergency-response planning and providing damage assessment in both spatial and temporal measurements. Unmanned Aerial Vehicle (UAV) technology has recently been recognized as an efficient photogrammetry data acquisition platform to quickly deliver high-resolution imagery because of its cost-effectiveness, ability to fly at lower altitudes, and ability to enter a hazardous area. Different image classification methods, including SVM (Support Vector Machine), have been used for flood extent mapping. In recent years, there has been a significant improvement in remote sensing image classification using Convolutional Neural Networks (CNNs). CNNs have demonstrated excellent performance on various tasks including image classification, feature extraction, and segmentation. CNNs can learn features automatically from large datasets through the organization of multi-layers of neurons and have the ability to implement nonlinear decision functions. This study investigates the potential of CNN approaches to extract flooded areas from UAV imagery. A VGG-based fully convolutional network (FCN-16s) was used in this research. The model was fine-tuned and a k-fold cross-validation was applied to estimate the performance of the model …
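The k-fold cross-validation protocol mentioned at the end of this abstract can be sketched independently of the FCN-16s model itself. The splitter below is a minimal NumPy-only illustration; the fold count, tile count, and random seed are assumptions, and the actual training loop is omitted.

```python
import numpy as np

def kfold_indices(n_samples, k, seed=0):
    """Yield (train_idx, val_idx) index pairs for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)          # shuffle once, then partition
    folds = np.array_split(idx, k)
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, val

# Example: 10 image tiles, 5 folds -> each validation fold holds 2 tiles
for train, val in kfold_indices(10, 5):
    assert len(val) == 2 and len(train) == 8
```

Each fold serves once as the held-out validation set, so every tile contributes to both training and evaluation, which is what makes the resulting performance estimate less sensitive to any single train/test split.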
Abstract. Environmental science is increasingly reliant on remotely sensed observations of the Earth's surface and atmosphere. Observations from polar-orbiting satellites have long supported investigations on land cover change, ecosystem productivity, hydrology, climate, the impacts of disturbance, and more, and are critical for extrapolating (upscaling) ground-based measurements to larger areas. However, the limited temporal frequency at which polar-orbiting satellites observe the Earth limits our understanding of rapidly evolving ecosystem processes, especially in areas with frequent cloud cover. Geostationary satellites have observed the Earth's surface and atmosphere at high temporal frequency for decades, and their imagers now have spectral resolutions in the visible and near-infrared regions that are comparable to commonly used polar-orbiting sensors like the Moderate Resolution Imaging Spectroradiometer (MODIS), Visible Infrared Imaging Radiometer Suite (VIIRS), or Landsat. These advances extend applications of geostationary Earth observations from weather monitoring to multiple disciplines in ecology and environmental science. We review a number of existing applications that use data from geostationary platforms and present upcoming opportunities for observing key ecosystem properties using high-frequency observations from the Advanced Baseline Imagers (ABI) on the Geostationary Operational Environmental Satellites (GOES), which routinely observe the Western Hemisphere every 5–15 min. Many of the existing applications in environmental science from ABI are focused on estimating land surface temperature, solar …
Phenology is a distinct marker of the impacts of climate change on ecosystems. Accordingly, monitoring the spatiotemporal patterns of vegetation phenology is important to understand the changing Earth system. A wide range of sensors have been used to monitor vegetation phenology, including digital cameras with different viewing geometries mounted on various types of platforms. Sensor perspective, view angle, and resolution can potentially impact estimates of phenology. We compared three different methods of remotely sensing vegetation phenology: an unoccupied aerial vehicle (UAV)-based, downward-facing RGB camera; a below-canopy, upward-facing hemispherical camera with blue (B), green (G), and near-infrared (NIR) bands; and a tower-based RGB PhenoCam, positioned at an oblique angle to the canopy. These were used to estimate spring phenological transition towards canopy closure in a mixed-species temperate forest in central Virginia, USA. Our study had two objectives: (1) to compare the above- and below-canopy inference of canopy greenness (using green chromatic coordinate and normalized difference vegetation index) and canopy structural attributes (leaf area and gap fraction) by matching below-canopy hemispherical photos with high spatial resolution (0.03 m) UAV imagery, to find the appropriate spatial coverage and resolution for comparison; (2) to compare how UAV, ground-based, and tower-based imagery performed in estimating the timing of the spring phenological …
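The two greenness metrics this abstract compares, green chromatic coordinate (GCC) and NDVI, have simple standard definitions that can be sketched as below. The example band values are arbitrary illustrations, not measurements from the Virginia site.

```python
import numpy as np

def gcc(red, green, blue):
    """Green chromatic coordinate: G / (R + G + B), from RGB digital numbers."""
    red, green, blue = (np.asarray(a, float) for a in (red, green, blue))
    return green / (red + green + blue)

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - R) / (NIR + R)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

# Arbitrary example values for a leafy canopy pixel
print(round(float(gcc(0.2, 0.6, 0.2)), 4))
print(round(float(ndvi(0.5, 0.1)), 4))
```

Tracking either metric through a season and fitting a transition curve to the time series is the usual way to extract phenological transition dates such as the onset of canopy closure; GCC needs only an RGB camera, while NDVI requires a near-infrared band such as the hemispherical camera's.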