Title: Reconstructing Digital Terrain Models from ArcticDEM and WorldView-2 Imagery in Livengood, Alaska
ArcticDEM provides the public with an unprecedented opportunity to access very high-spatial-resolution digital elevation models (DEMs) covering the pan-Arctic land surface. Because it is generated from stereo pairs of optical satellite imagery, ArcticDEM represents a mixture of a digital surface model (DSM) over non-ground areas and a digital terrain model (DTM) over bare ground. Reconstructing a DTM from ArcticDEM is thus needed in studies requiring bare-ground elevation, such as modeling hydrological processes, tracking surface-change dynamics, and estimating vegetation canopy height and associated forest attributes. Here we propose an automated approach for estimating a DTM from ArcticDEM in two steps: (1) identifying ground pixels in WorldView-2 imagery using a Gaussian mixture model (GMM) with local refinement by morphological operations, and (2) generating a continuous DTM surface from ArcticDEM elevations at the ground locations using spatial interpolation (ordinary kriging (OK) and natural neighbor (NN)). We evaluated our method at three forested study sites with different canopy cover and topographic conditions in Livengood, Alaska, where airborne lidar data are available for validation. Our results demonstrate that (1) the proposed ground-identification method effectively identifies ground pixels, with much lower root mean square errors (RMSEs) relative to the reference data (<0.35 m) than comparative state-of-the-art approaches; (2) NN performs more robustly than OK in DTM interpolation; and (3) the DTMs generated by NN interpolation with GMM-based ground masks reduce the RMSEs of ArcticDEM to 0.648 m, 1.677 m, and 0.521 m for Site-1, Site-2, and Site-3, respectively. This study provides a viable means of deriving high-resolution DTMs from ArcticDEM that will be of great value to studies of Arctic ecosystems, forest change dynamics, and Earth surface processes.
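As a rough illustration of the two-step workflow above, the sketch below clusters WorldView-2 pixels with a Gaussian mixture model, refines the ground mask with a morphological opening, and interpolates ArcticDEM elevations at the ground pixels. It is a minimal sketch, not the authors' implementation: the input arrays (wv2, arcticdem), the rule for choosing the ground cluster (lowest median ArcticDEM elevation), and the use of SciPy's linear griddata as a stand-in for natural-neighbor or kriging interpolation are all assumptions.

    # Minimal sketch of the two-step DTM reconstruction (assumed inputs:
    # wv2 is a co-registered WorldView-2 reflectance stack of shape
    # (bands, rows, cols); arcticdem is the matching ArcticDEM tile).
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from scipy.ndimage import binary_opening
    from scipy.interpolate import griddata

    def reconstruct_dtm(wv2, arcticdem, n_clusters=4):
        bands, rows, cols = wv2.shape
        X = wv2.reshape(bands, -1).T  # one sample per pixel

        # Step 1: cluster pixels with a GMM and treat the cluster with the
        # lowest median ArcticDEM elevation as candidate ground (an assumed
        # heuristic), then refine the mask with a morphological opening.
        labels = GaussianMixture(n_components=n_clusters, random_state=0).fit_predict(X)
        labels = labels.reshape(rows, cols)
        medians = [np.nanmedian(arcticdem[labels == k]) for k in range(n_clusters)]
        ground = labels == int(np.nanargmin(medians))
        ground = binary_opening(ground, structure=np.ones((3, 3), dtype=bool))

        # Step 2: interpolate ArcticDEM elevations at ground pixels to a
        # continuous surface (linear griddata stands in for natural neighbor).
        gr, gc = np.nonzero(ground)
        grid_r, grid_c = np.mgrid[0:rows, 0:cols]
        return griddata((gr, gc), arcticdem[gr, gc], (grid_r, grid_c), method="linear")

In the paper itself the interpolation is done with ordinary kriging and natural neighbor; the linear interpolation above is only a convenient SciPy substitute for illustration.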
Award ID(s):
1724786
PAR ID:
10421190
Author(s) / Creator(s):
Date Published:
Journal Name:
Remote Sensing
Volume:
15
Issue:
8
ISSN:
2072-4292
Page Range / eLocation ID:
2061
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Topographical changes are of fundamental interest to a wide range of Arctic science disciplines faced with the need to anticipate, monitor, and respond to the effects of climate change, including geohazard management, glaciology, hydrology, permafrost, and ecology. This study demonstrates several geomorphological, cryospheric, and biophysical applications of ArcticDEM – a large collection of publicly available, time-dependent digital elevation models (DEMs) of the Arctic. Our study illustrates ArcticDEM's applicability across different disciplines and five orders of magnitude of elevation derivatives, including measuring volcanic lava flows, ice cauldrons, post-failure landslides, retrogressive thaw slumps, snowdrifts, and tundra vegetation heights. We quantified surface elevation changes in different geological settings and conditions using the time series of ArcticDEM. Following the 2014–2015 Bárðarbunga eruption in Iceland, ArcticDEM analysis mapped the lava flow field and revealed the post-eruptive ice flows and ice cauldron dynamics. The total dense-rock equivalent (DRE) volume of lava flows is estimated to be (1431 ± 2) million m³. Then, we present the aftermath of a landslide in Kinnikinnick, Alaska, yielding a total landslide volume of (400 ± 8) × 10³ m³ and a total area of 0.025 km². ArcticDEM is further proven useful for studying retrogressive thaw slumps (RTS). The ArcticDEM-mapped RTS profile is validated by ICESat-2 and drone photogrammetry, resulting in a standard deviation of 0.5 m. Volume estimates for lake-side and hillslope RTSs range between 40,000 ± 9000 m³ and 1,160,000 ± 85,000 m³, highlighting applicability across a range of RTS magnitudes. A case study for mapping tundra snow demonstrates ArcticDEM's potential for identifying high-accumulation, late-lying snow areas. The approach proves effective in quantifying relative snow accumulation rather than absolute values (standard deviation of 0.25 m, bias of 0.41 m, and a correlation coefficient of 0.69 with snow depth estimated by unmanned aerial systems photogrammetry). Furthermore, ArcticDEM proves feasible for estimating tundra vegetation heights, with a standard deviation of 0.3 m (no bias) and a correlation of up to 0.8 compared to light detection and ranging (LiDAR) data. The demonstrated capabilities of ArcticDEM will pave the way for the broad and pan-Arctic use of this new data source for many disciplines, especially when combined with other imagery products. The wide range of signals embedded in ArcticDEM underscores the potential challenges in deciphering signals in regions affected by various geological processes and environmental influences. (A minimal DEM-differencing sketch for this kind of elevation-change analysis appears after this list.)
  2. Phenology is a distinct marker of the impacts of climate change on ecosystems. Accordingly, monitoring the spatiotemporal patterns of vegetation phenology is important to understand the changing Earth system. A wide range of sensors have been used to monitor vegetation phenology, including digital cameras with different viewing geometries mounted on various types of platforms. Sensor perspective, view angle, and resolution can potentially impact estimates of phenology. We compared three different methods of remotely sensing vegetation phenology—an unoccupied aerial vehicle (UAV)-based, downward-facing RGB camera; a below-canopy, upward-facing hemispherical camera with blue (B), green (G), and near-infrared (NIR) bands; and a tower-based RGB PhenoCam positioned at an oblique angle to the canopy—to estimate the spring phenological transition towards canopy closure in a mixed-species temperate forest in central Virginia, USA. Our study had two objectives: (1) to compare the above- and below-canopy inference of canopy greenness (using green chromatic coordinate and normalized difference vegetation index) and canopy structural attributes (leaf area and gap fraction) by matching below-canopy hemispherical photos with high spatial resolution (0.03 m) UAV imagery, to find the appropriate spatial coverage and resolution for comparison; and (2) to compare how UAV, ground-based, and tower-based imagery performed in estimating the timing of the spring phenological transition. We found that a spatial buffer of 20 m radius for UAV imagery is most closely comparable to below-canopy imagery in this system. Sensors and platforms agree within ±5 days of when canopy greenness stabilizes from the spring phenophase into the growing season. We show that pairing UAV imagery with tower-based observation platforms and plot-based observations for phenological studies (e.g., long-term monitoring, existing research networks, and permanent plots) has the potential to scale plot-based forest structural measures via UAV imagery, constrain uncertainty estimates around phenophases, and more robustly assess site heterogeneity.
  3. This dataset contains orthomosaics, digital surface models (DSMs), and multispectral image composites for nine Arctic Beaver Observation Network (ABON) sites surveyed in August 2024. The data were collected to support research on the impacts of beaver engineering on tundra hydrology, vegetation, and permafrost dynamics across Arctic Alaska. Drone-based imagery was acquired using a DJI Mavic 3 Multispectral quadcopter equipped with a DJI D-RTK 2 Mobile Base Station for real-time kinematic (RTK) positioning. At each site, flight missions were conducted at 120 meters (m) above ground level with 80% along-track and 70% across-track overlap, using a nadir-oriented camera (90°) and the hover-and-capture-at-point mode. The resulting products include: (1) red-green-blue (RGB) orthomosaics with a ground sampling distance of 5 centimeters (cm), (2) digital surface models (DSMs) with 10 cm spatial resolution, and (3) multispectral composites (green, red, red edge, near-infrared bands) at 10 cm resolution. Radiometric calibration was performed using images of a MicaSense calibrated reflectance panel. A Leica Viva differential global positioning system (GPS) provided ground control for the missions, and the data were post-processed to WGS84 UTM Zone 3 North. All images were processed in Pix4D Mapper (v. 4.10.0). Elevation information derived over waterbodies is noisy and does not represent the surface elevation of the feature. These high-resolution datasets provide baseline observations of beaver pond morphology and vegetation dynamics, enabling long-term monitoring of ecosystem changes driven by beaver activity in Arctic tundra landscapes.
  4. The Arctic Beaver Observation Network (A-BON): Tracking a new disturbance regime project tracks beaver engineering across circumarctic treeline and tundra environments over the last half-century by mapping beaver ponds in remote sensing imagery. Drones are being used to collect baseline data and track beaver dam building and pond evolution over time. This dataset consists of an orthomosaic and digital surface model (DSM) derived from drone surveys on 07 August 2023 at the Kotzebue B East site on the Baldwin Peninsula, Alaska. A total of 374 digital images were acquired from a DJI Phantom 4 Real-Time Kinematic (DJI P4RTK) quadcopter with a DJI D-RTK 2 Mobile Base Station. The mapped area was around 63 hectares (ha). The drone system was flown at 120 meters (m) above ground level (agl) and flight speeds varied from 8 to 9 meters per second (m/s). The orientation of the camera was set to 90 degrees (i.e., looking straight down). The along-track overlap and across-track overlap of the mission were set at 80% and 70%, respectively. All images were processed in the software Pix4D Mapper (v. 4.9.0) using the standard 3D Maps workflow and the accurate geolocation and orientation calibration method to produce the orthophoto mosaic and digital surface model at spatial resolutions of 5 and 10 centimeters (cm), respectively. Elevation information derived over waterbodies is noisy and does not represent the surface elevation of the feature. A Leica Viva differential global positioning system (GPS) provided ground control for the mission, and the data were post-processed to WGS84 UTM Zone 3 North with ellipsoid heights in meters.
  5. Precise monitoring of individual crop growth and health status is crucial for precision agriculture practices. However, traditional inspection methods are time-consuming, labor-intensive, prone to human error, and may not provide the comprehensive coverage required for the detailed analysis of crop variability across an entire field. This research addresses the need for efficient and high-resolution crop monitoring by leveraging unmanned aerial vehicle (UAV) imagery and advanced computational techniques. The primary goal was to develop a methodology for the precise identification, extraction, and monitoring of individual corn crops throughout their growth cycle. This involved integrating UAV-derived data with image processing, computational geometry, and machine learning techniques. Bi-weekly UAV imagery was captured at altitudes of 40 m and 70 m from 30 April to 11 August, covering the entire growth cycle of the corn crop from planting to harvest. A time-series canopy height model (CHM) was generated from the difference between the digital surface model (DSM) and the digital terrain model (DTM) derived from the UAV data (a minimal sketch of this CHM differencing step appears after this list). To ensure the accuracy of the elevation data, the DSM was validated against ground control points (GCPs), adhering to standard practices in remote sensing data verification. Local spatial analysis and image processing techniques were employed to determine the local maximum height of each crop. Subsequently, a Voronoi data model was developed to delineate individual crop canopies, successfully identifying 13,000 out of 13,050 corn crops in the study area. To enhance accuracy in canopy size delineation, vegetation indices were incorporated into the Voronoi model segmentation, refining the initial canopy area estimates by eliminating interference from soil and shadows. The proposed methodology enables the precise estimation and monitoring of crop canopy size, height, biomass reduction, lodging, and stunted growth over time by incorporating advanced image processing techniques and integrating metrics for quantitative assessment of fields. Additionally, machine learning models were employed to determine relationships between canopy size, crop height, and the normalized difference vegetation index, with polynomial regression recording an R-squared of 11%, compared with the other models evaluated. This work contributes to the scientific community by demonstrating the potential of integrating UAV technology, computational geometry, and machine learning for accurate and efficient crop monitoring at the individual plant level.
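The canopy height model (CHM) and local-maximum crop detection described in item 5 reduce to a DSM-minus-DTM difference followed by a windowed peak search. The sketch below is a minimal illustration under assumed gridded inputs; the window size and height threshold are placeholders rather than the authors' parameters, and the Voronoi canopy delineation step is omitted.

    # Minimal sketch: canopy height model (CHM = DSM - DTM) and local-maximum
    # detection of individual plant tops (assumed inputs: gridded DSM/DTM in meters).
    import numpy as np
    from scipy.ndimage import maximum_filter

    def plant_tops(dsm, dtm, window=5, min_height=0.3):
        chm = dsm - dtm                                    # canopy height model
        is_peak = maximum_filter(chm, size=window) == chm  # local maxima within a window
        return np.argwhere(is_peak & (chm > min_height))   # (row, col) of detected tops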
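Several of the ArcticDEM applications in item 1 above (lava flows, landslides, thaw slumps, snowdrifts) rest on differencing two DEM epochs and integrating the elevation change over the grid, as in the minimal sketch below. The noise threshold and the simple summation are assumptions for illustration, not the authors' workflow.

    # Minimal sketch: net volume change (m^3) between two co-registered DEM epochs.
    import numpy as np

    def volume_change(dem_t0, dem_t1, cell_size=2.0, min_abs_dh=0.5):
        """dem_t0, dem_t1: 2-D elevation arrays (m) on the same grid;
        cell_size: grid spacing in meters (ArcticDEM strips are posted at 2 m);
        min_abs_dh: ignore |dh| below this value to suppress DEM noise (assumed)."""
        dh = dem_t1 - dem_t0
        dh = np.where(np.abs(dh) >= min_abs_dh, dh, 0.0)
        return float(np.nansum(dh)) * cell_size ** 2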