
Title: Constraints and Opportunities for Detecting Land Surface Phenology in Drylands
Land surface phenology (LSP) enables global-scale tracking of ecosystem processes, but its utility is limited in drylands due to low vegetation cover and resulting low annual amplitudes of vegetation indices (VIs). Due to the importance of drylands for biodiversity, food security, and the carbon cycle, it is necessary to understand the limitations in measuring dryland dynamics. Here, using simulated data and multitemporal unmanned aerial vehicle (UAV) imagery of a desert shrubland, we explore the feasibility of detecting LSP with respect to fractional vegetation cover, plant functional types, VI uncertainty, and two different detection algorithms. Using simulated data, we found that plants with distinct VI signals, such as deciduous shrubs, can require up to 60% fractional cover to consistently detect LSP. Evergreen plants, with lower seasonal VI amplitude, require considerably higher cover and can have undetectable phenology even with 100% vegetation cover. Our evaluation of two algorithms showed that neither performed best in all cases. Even with adequate cover, biases in phenological metrics can still exceed 20 days, and detection can never be fully accurate due to VI uncertainty from shadows, sensor view angle, and atmospheric interference. We showed how high-resolution UAV imagery enables LSP studies in drylands and highlighted important scale effects driven by within-canopy VI variation. With high-resolution imagery, the open canopies of drylands are beneficial as they allow for straightforward identification of individual plants, enabling the tracking of phenology at the individual level. Drylands thus have the potential to become an exemplary environment for future LSP research.
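The abstract's central simulation idea — that a mixed pixel's seasonal VI amplitude shrinks with fractional vegetation cover until phenology becomes undetectable — can be sketched with a simple linear mixing model. All curve parameters below (base VI, amplitude, start/end of season, transition rate, soil VI) are hypothetical illustrative values, not the paper's actual simulation settings:

```python
import numpy as np

def deciduous_vi(doy, base=0.15, amp=0.35, sos=120, eos=280, rate=0.08):
    """Double-logistic VI curve for a deciduous plant (illustrative parameters)."""
    green_up = 1.0 / (1.0 + np.exp(-rate * (doy - sos)))
    senescence = 1.0 / (1.0 + np.exp(-rate * (doy - eos)))
    return base + amp * (green_up - senescence)

def mixed_pixel_vi(doy, fcover, soil_vi=0.12):
    """Linear mixing of plant VI and a flat bare-soil VI by fractional cover."""
    return fcover * deciduous_vi(doy) + (1.0 - fcover) * soil_vi

doy = np.arange(1, 366)
for f in (0.2, 0.6, 1.0):
    vi = mixed_pixel_vi(doy, f)
    print(f"cover={f:.0%}: seasonal VI amplitude = {vi.max() - vi.min():.3f}")
```

Because the soil signal is aseasonal, the mixed-pixel amplitude scales linearly with cover, so a detection algorithm that needs a minimum amplitude above the VI noise floor implicitly needs a minimum fractional cover.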
Journal Name: Journal of Remote Sensing
Page Range or eLocation-ID: 1 to 15
Sponsoring Org: National Science Foundation
More Like this
  1. Phenology is a distinct marker of the impacts of climate change on ecosystems. Accordingly, monitoring the spatiotemporal patterns of vegetation phenology is important to understand the changing Earth system. A wide range of sensors have been used to monitor vegetation phenology, including digital cameras with different viewing geometries mounted on various types of platforms. Sensor perspective, view-angle, and resolution can potentially impact estimates of phenology. We compared three different methods of remotely sensing vegetation phenology—an unoccupied aerial vehicle (UAV)-based, downward-facing RGB camera, a below-canopy, upward-facing hemispherical camera with blue (B), green (G), and near-infrared (NIR) bands, and a tower-based RGB PhenoCam, positioned at an oblique angle to the canopy—to estimate spring phenological transition towards canopy closure in a mixed-species temperate forest in central Virginia, USA. Our study had two objectives: (1) to compare the above- and below-canopy inference of canopy greenness (using green chromatic coordinate and normalized difference vegetation index) and canopy structural attributes (leaf area and gap fraction) by matching below-canopy hemispherical photos with high spatial resolution (0.03 m) UAV imagery, to find the appropriate spatial coverage and resolution for comparison; (2) to compare how UAV, ground-based, and tower-based imagery performed in estimating the timing of the spring phenological transition. We found that a spatial buffer of 20 m radius for UAV imagery is most closely comparable to below-canopy imagery in this system. Sensors and platforms agree within ±5 days of when canopy greenness stabilizes from the spring phenophase into the growing season.
We show that pairing UAV imagery with tower-based observation platforms and plot-based observations for phenological studies (e.g., long-term monitoring, existing research networks, and permanent plots) has the potential to scale plot-based forest structural measures via UAV imagery, constrain uncertainty estimates around phenophases, and more robustly assess site heterogeneity.
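The two greenness measures named above — the green chromatic coordinate (GCC) and the normalized difference vegetation index (NDVI) — are standard band ratios. A minimal sketch of both, using hypothetical pixel values rather than any data from the study:

```python
import numpy as np

def gcc(r, g, b):
    """Green chromatic coordinate: G / (R + G + B)."""
    r, g, b = (np.asarray(x, dtype=float) for x in (r, g, b))
    total = r + g + b
    # Guard against division by zero on fully dark pixels.
    return np.divide(g, total, out=np.zeros_like(total), where=total > 0)

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    total = nir + red
    return np.divide(nir - red, total, out=np.zeros_like(total), where=total > 0)

# Hypothetical 8-bit digital numbers for a leafy canopy pixel vs. a soil pixel:
canopy_gcc = gcc(np.array([60.0]), np.array([120.0]), np.array([50.0]))
soil_gcc = gcc(np.array([120.0]), np.array([110.0]), np.array([90.0]))
print(canopy_gcc, soil_gcc)
```

Tracking the seasonal trajectory of these indices per pixel (or per camera footprint) is what allows the different platforms in the study to be compared on a common greenness scale.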
  2.

    Vegetation phenology is a key control on water, energy, and carbon fluxes in terrestrial ecosystems. Because vegetation canopies are heterogeneous, spatially explicit information related to seasonality in vegetation activity provides valuable information for studies that use eddy covariance measurements to study ecosystem function and land-atmosphere interactions. Here we present a land surface phenology (LSP) dataset derived at 3 m spatial resolution from PlanetScope imagery across a range of plant functional types and climates in North America. The dataset provides spatially explicit information related to the timing of phenophase changes such as the start, peak, and end of vegetation activity, along with vegetation index metrics and associated quality assurance flags for the growing seasons of 2017–2021 for 10 × 10 km windows centred over 104 eddy covariance towers at AmeriFlux and National Ecological Observatory Network (NEON) sites. These LSP data can be used to analyse processes controlling the seasonality of ecosystem-scale carbon, water, and energy fluxes, to evaluate predictions from land surface models, and to assess satellite-based LSP products.

  3. Visual terrain-relative navigation (VTRN) is a localization method based on registering a source image taken from a robotic vehicle against a georeferenced target image. With high-resolution imagery databases of Earth and other planets now available, VTRN offers accurate, drift-free navigation for air and space robots even in the absence of external positioning signals. Despite its potential for high accuracy, however, VTRN remains extremely fragile to common and predictable seasonal effects, such as lighting, vegetation changes, and snow cover. Engineered registration algorithms are mature and have provable geometric advantages but cannot accommodate the content changes caused by seasonal effects and have poor matching skill. Approaches based on deep learning can accommodate image content changes but produce opaque position estimates that either lack an interpretable uncertainty or require tedious human annotation. In this work, we address these issues with targeted use of deep learning within an image transform architecture, which converts seasonal imagery to a stable, invariant domain that can be used by conventional algorithms without modification. Our transform preserves the geometric structure and uncertainty estimates of legacy approaches and demonstrates superior performance under extreme seasonal changes while also being easy to train and highly generalizable. We show that classical registration methods perform exceptionally well for robotic visual navigation when stabilized with the proposed architecture and are able to consistently anticipate reliable imagery. Gross mismatches were nearly eliminated in challenging and realistic visual navigation tasks that also included topographic and perspective effects.

  4. Unmanned aerial vehicles (UAVs) equipped with multispectral sensors offer high spatial and temporal resolution imagery for monitoring crop stress at early stages of development. Analysis of UAV-derived data with advanced machine learning models could improve real-time management in agricultural systems, but guidance for this integration is currently limited. Here we compare two deep learning-based strategies for early warning detection of crop stress, using multitemporal imagery throughout the growing season to predict field-scale yield in irrigated rice in eastern Arkansas. Both deep learning strategies showed improvements upon traditional statistical learning approaches including linear regression and gradient boosted decision trees. First, we explicitly accounted for variation across developmental stages using a 3D convolutional neural network (CNN) architecture that captures both spatial and temporal dimensions of UAV images from multiple time points throughout one growing season. 3D-CNNs achieved low prediction error on the test set, with a Root Mean Squared Error (RMSE) of 8.8% of the mean yield. For the second strategy, a 2D-CNN, we considered only spatial relationships among pixels for image features acquired during a single flyover. 2D-CNNs trained on images from a single day were most accurate when images were taken during booting stage or later, with RMSE ranging from 7.4 to 8.2% of the mean yield. A primary benefit of convolutional autoencoder-like models (based on analyses of prediction maps and feature importance) is the spatial denoising effect that corrects yield predictions for individual pixels based on the values of vegetation index and thermal features for nearby pixels. Our results highlight the promise of convolutional autoencoders for UAV-based yield prediction in rice.
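The practical difference between the two strategies above comes down to how the multitemporal UAV stack is arranged before convolution: a 3D-CNN convolves over a time (depth) axis as well as space, while a 2D-CNN sees one flyover at a time. A shape-only sketch with numpy — all dimensions are hypothetical, not the study's actual patch sizes or band counts, and the axis order follows the common (N, C, D, H, W) convention used by, e.g., PyTorch's Conv3d:

```python
import numpy as np

# Hypothetical dimensions: T flyovers across the season, C spectral bands,
# and an H x W image patch for one field plot.
T, C, H, W = 6, 5, 64, 64

# One season of multispectral UAV patches, stacked time-first.
season = np.random.rand(T, C, H, W).astype(np.float32)

# 3D-CNN input: move channels first, keep time as a depth axis, add a batch
# axis -> (N, C, T, H, W). Convolutions then mix spatial AND temporal context.
x3d = season.transpose(1, 0, 2, 3)[None]

# 2D-CNN input: a single flyover (here the 4th), spatial context only
# -> (N, C, H, W).
x2d = season[3][None]

print(x3d.shape, x2d.shape)
```

Framed this way, the 3D-CNN's extra depth axis is what lets it weight developmental stages against each other, at the cost of needing the full season of imagery before it can predict.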
  5. With rapid innovations in drone, camera, and 3D photogrammetry, drone-based remote sensing can accurately and efficiently provide ultra-high resolution imagery and digital surface models (DSMs) at a landscape scale. Several studies have used drone-based remote sensing to quantitatively assess the impacts of wind erosion on vegetation communities and landforms in drylands. In this study, we first summarize five difficulties in conducting wind erosion research with field-collected data: insufficient samples, spatial displacement with auxiliary datasets, missing volumetric information, a unidirectional view, and spatially inexplicit input. We then suggest five corresponding applications of drone-based remote sensing products: to provide a reliable and valid sample set, to mitigate the spatial offset, to monitor soil elevation change, to evaluate the directional property of land cover, and to make spatially explicit input for ecological models. In sum, drone-based remote sensing has become a useful method for wind erosion research in drylands and can address the issues that arise when relying solely on field-collected data. For wind erosion research in drylands, we suggest that drone-based remote sensing products be used as a complement to field measurements.