- Award ID(s):
- 2025166
- Publication Date:
- NSF-PAR ID:
- 10354498
- Journal Name:
- Journal of Remote Sensing
- Volume:
- 2021
- Page Range or eLocation-ID:
- 1 to 15
- ISSN:
- 2694-1589
- Sponsoring Org:
- National Science Foundation
More Like this
-
Phenology is a distinct marker of the impacts of climate change on ecosystems. Accordingly, monitoring the spatiotemporal patterns of vegetation phenology is important to understand the changing Earth system. A wide range of sensors have been used to monitor vegetation phenology, including digital cameras with different viewing geometries mounted on various types of platforms. Sensor perspective, view-angle, and resolution can potentially impact estimates of phenology. We compared three different methods of remotely sensing vegetation phenology—an unoccupied aerial vehicle (UAV)-based, downward-facing RGB camera, a below-canopy, upward-facing hemispherical camera with blue (B), green (G), and near-infrared (NIR) bands, and a tower-based RGB PhenoCam, positioned at an oblique angle to the canopy—to estimate spring phenological transition towards canopy closure in a mixed-species temperate forest in central Virginia, USA. Our study had two objectives: (1) to compare the above- and below-canopy inference of canopy greenness (using green chromatic coordinate and normalized difference vegetation index) and canopy structural attributes (leaf area and gap fraction) by matching below-canopy hemispherical photos with high spatial resolution (0.03 m) UAV imagery, to find the appropriate spatial coverage and resolution for comparison; (2) to compare how UAV, ground-based, and tower-based imagery performed in estimating the timing of the spring phenological …
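The two greenness measures named in this abstract have standard definitions: the green chromatic coordinate is the green digital number divided by the sum of the red, green, and blue digital numbers, and NDVI is the normalized difference of the near-infrared and red bands. The following is a minimal NumPy sketch of both; the array names and values are illustrative, not data from the study, and the hemispherical camera's exact band combination may differ.

```python
import numpy as np

def green_chromatic_coordinate(red, green, blue):
    """Green chromatic coordinate: G / (R + G + B), computed per pixel."""
    total = red + green + blue
    total = np.where(total == 0, np.nan, total)   # avoid division by zero
    return green / total

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - R) / (NIR + R)."""
    denom = nir + red
    denom = np.where(denom == 0, np.nan, denom)
    return (nir - red) / denom

# Illustrative 2 x 2 reflectance arrays (not data from the study)
red   = np.array([[0.10, 0.12], [0.09, 0.11]])
green = np.array([[0.20, 0.22], [0.18, 0.21]])
blue  = np.array([[0.08, 0.09], [0.07, 0.08]])
nir   = np.array([[0.45, 0.50], [0.40, 0.48]])

print("mean GCC :", float(np.nanmean(green_chromatic_coordinate(red, green, blue))))
print("mean NDVI:", float(np.nanmean(ndvi(nir, red))))
```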
-
Vegetation phenology is a key control on water, energy, and carbon fluxes in terrestrial ecosystems. Because vegetation canopies are heterogeneous, spatially explicit information related to seasonality in vegetation activity provides valuable information for studies that use eddy covariance measurements to study ecosystem function and land-atmosphere interactions. Here we present a land surface phenology (LSP) dataset derived at 3 m spatial resolution from PlanetScope imagery across a range of plant functional types and climates in North America. The dataset provides spatially explicit information related to the timing of phenophase changes such as the start, peak, and end of vegetation activity, along with vegetation index metrics and associated quality assurance flags for the growing seasons of 2017–2021 for 10 × 10 km windows centred over 104 eddy covariance towers at AmeriFlux and National Ecological Observatory Network (NEON) sites. These LSP data can be used to analyse processes controlling the seasonality of ecosystem-scale carbon, water, and energy fluxes, to evaluate predictions from land surface models, and to assess satellite-based LSP products.
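Phenophase timing metrics of this kind are commonly derived by thresholding a smoothed vegetation-index time series at a fraction of its seasonal amplitude. The sketch below illustrates that general idea on a synthetic logistic green-up curve; the 50% threshold, the curve shape, and the function name are assumptions for illustration, not the dataset's published algorithm.

```python
import numpy as np

def start_of_season(doy, vi, threshold=0.5):
    """Day of year at which the vegetation index first rises above `threshold`
    of its seasonal amplitude (linear interpolation between observations)."""
    doy, vi = np.asarray(doy, float), np.asarray(vi, float)
    level = vi.min() + threshold * (vi.max() - vi.min())
    above = np.nonzero(vi >= level)[0]
    if above.size == 0:
        return np.nan
    i = above[0]
    if i == 0:
        return doy[0]
    frac = (level - vi[i - 1]) / (vi[i] - vi[i - 1])   # interpolate the crossing point
    return doy[i - 1] + frac * (doy[i] - doy[i - 1])

# Illustrative smoothed green-up curve for one pixel (logistic shape, 5-day sampling)
doy = np.arange(60, 181, 5)
vi = 0.15 + 0.45 / (1 + np.exp(-(doy - 120) / 8))
print("estimated start of season (DOY):", round(start_of_season(doy, vi), 1))
```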
-
Visual terrain-relative navigation (VTRN) is a localization method based on registering a source image taken from a robotic vehicle against a georeferenced target image. With high-resolution imagery databases of Earth and other planets now available, VTRN offers accurate, drift-free navigation for air and space robots even in the absence of external positioning signals. Despite its potential for high accuracy, however, VTRN remains extremely fragile to common and predictable seasonal effects, such as lighting, vegetation changes, and snow cover. Engineered registration algorithms are mature and have provable geometric advantages but cannot accommodate the content changes caused by seasonal effects and have poor matching skill. Approaches based on deep learning can accommodate image content changes but produce opaque position estimates that either lack an interpretable uncertainty or require tedious human annotation. In this work, we address these issues with targeted use of deep learning within an image transform architecture, which converts seasonal imagery to a stable, invariant domain that can be used by conventional algorithms without modification. Our transform preserves the geometric structure and uncertainty estimates of legacy approaches and demonstrates superior performance under extreme seasonal changes while also being easy to train and highly generalizable. We show that classical registration methods …
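For context, a conventional "engineered" registration step of the kind this abstract contrasts with deep learning can be as simple as phase correlation between a source patch and a georeferenced target patch. The sketch below uses scikit-image's phase_cross_correlation on synthetic imagery with a known offset; it illustrates classical translation-only registration, not the authors' learned image transform.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

rng = np.random.default_rng(0)

# Illustrative georeferenced target patch and a source patch offset by a known amount
target = rng.random((256, 256))
applied_offset = (12.0, -7.0)                         # (rows, cols)
source = nd_shift(target, applied_offset, order=1)    # simulate the vehicle's view

# Classical registration by phase correlation; the returned shift is the correction
# needed to register `source` onto `target` (roughly the negative of the offset above).
shift_est, error, _ = phase_cross_correlation(target, source, upsample_factor=10)
print("applied offset      :", applied_offset)
print("estimated correction:", tuple(np.round(shift_est, 2)))
```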
-
Unmanned aerial vehicles (UAVs) equipped with multispectral sensors offer high spatial and temporal resolution imagery for monitoring crop stress at early stages of development. Analysis of UAV-derived data with advanced machine learning models could improve real-time management in agricultural systems, but guidance for this integration is currently limited. Here we compare two deep learning-based strategies for early warning detection of crop stress, using multitemporal imagery throughout the growing season to predict field-scale yield in irrigated rice in eastern Arkansas. Both deep learning strategies showed improvements upon traditional statistical learning approaches including linear regression and gradient boosted decision trees. First, we explicitly accounted for variation across developmental stages using a 3D convolutional neural network (CNN) architecture that captures both spatial and temporal dimensions of UAV images from multiple time points throughout one growing season. 3D-CNNs achieved low prediction error on the test set, with a Root Mean Squared Error (RMSE) of 8.8% of the mean yield. For the second strategy, a 2D-CNN, we considered only spatial relationships among pixels for image features acquired during a single flyover. 2D-CNNs trained on images from a single day were most accurate when images were taken during booting stage or later, with RMSE ranging from …
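A 3D-CNN of the kind described treats the stack of flight dates as an additional convolutional dimension, so the kernels mix spatial and temporal context before a regression head predicts yield. The PyTorch sketch below shows that structure; the band count, number of time steps, patch size, and layer widths are placeholders, not the architecture used in the study.

```python
import torch
import torch.nn as nn

class Yield3DCNN(nn.Module):
    """Minimal 3D-CNN that convolves jointly over time and space to regress yield.
    Input shape: (batch, bands, timesteps, height, width)."""
    def __init__(self, bands=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(bands, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d((1, 2, 2)),          # pool only the spatial dimensions
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),          # collapse time, height, and width
        )
        self.head = nn.Linear(32, 1)          # single yield value per sample

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Hypothetical batch: 8 samples, 5 spectral bands, 6 flight dates, 64 x 64 pixel patches
x = torch.randn(8, 5, 6, 64, 64)
model = Yield3DCNN(bands=5)
print(model(x).shape)   # torch.Size([8, 1])
```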
-
With rapid innovations in drone, camera, and 3D photogrammetry technologies, drone-based remote sensing can accurately and efficiently provide ultra-high resolution imagery and digital surface models (DSMs) at a landscape scale. Several studies have used drone-based remote sensing to quantitatively assess the impacts of wind erosion on vegetation communities and landforms in drylands. In this study, five difficulties in conducting wind erosion research with field-collected data are first summarized: insufficient samples, spatial displacement with auxiliary datasets, missing volumetric information, a unidirectional view, and spatially inexplicit input. Then, five possible applications of drone-based remote sensing products are suggested: to provide a reliable and valid sample set, to mitigate the spatial offset, to monitor soil elevation change, to evaluate the directional property of land cover, and to provide spatially explicit input for ecological models. In sum, drone-based remote sensing has become a useful method for studying wind erosion in drylands and can address the issues that arise when relying solely on field-collected data. For wind erosion research in drylands, we suggest that drone-based remote sensing products be used as a complement to field measurements.
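One of the suggested applications, monitoring soil elevation change, amounts to differencing co-registered DSMs from repeat drone surveys and converting the thresholded differences into eroded and deposited volumes. Below is a minimal NumPy sketch under those assumptions; the grids, cell size, and detection threshold are illustrative values, and a real workflow would read georeferenced rasters and account for co-registration error.

```python
import numpy as np

# Illustrative co-registered DSM grids (metres) from two drone surveys of the same plot
dsm_before = np.array([[101.20, 101.18, 101.15],
                       [101.22, 101.19, 101.16],
                       [101.25, 101.21, 101.17]])
dsm_after  = np.array([[101.17, 101.16, 101.14],
                       [101.20, 101.15, 101.13],
                       [101.24, 101.20, 101.16]])

cell_area = 0.05 * 0.05            # 5 cm grid spacing -> cell area in m^2
dod = dsm_after - dsm_before       # DSM of difference: negative = surface lowering (erosion)

detection_threshold = 0.02         # ignore changes below the survey's vertical uncertainty (m)
significant = np.abs(dod) > detection_threshold

eroded_volume    = -dod[significant & (dod < 0)].sum() * cell_area
deposited_volume =  dod[significant & (dod > 0)].sum() * cell_area
print(f"eroded volume   : {eroded_volume:.6f} m^3")
print(f"deposited volume: {deposited_volume:.6f} m^3")
```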