

Title: Multisensor Fusion of Remotely Sensed Vegetation Indices Using Space-Time Dynamic Linear Models
Abstract

High spatiotemporal resolution maps of surface vegetation from remote sensing data are desirable for vegetation and disturbance monitoring. However, due to the current limitations of imaging spectrometers, remote sensing datasets of vegetation with high temporal frequency of measurements have lower spatial resolution, and vice versa. In this research, we propose a space-time dynamic linear model to fuse high temporal frequency data (MODIS) with high spatial resolution data (Landsat) to create high spatiotemporal resolution data products of a vegetation greenness index. The model incorporates the spatial misalignment of the data and models dependence within and across land cover types with a latent multivariate Matérn process. To handle the large size of the data, we introduce a fast estimation procedure and a moving window Kalman smoother to produce a daily, 30-m resolution data product with associated uncertainty.
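To make the state-space idea concrete, the following is a minimal sketch, not the authors' model: a single-pixel random-walk state-space model in which a daily, noisy coarse-resolution series (MODIS-like) and sparse, less noisy fine-resolution observations (Landsat-like) update the same latent greenness state through a Kalman filter and a Rauch-Tung-Striebel smoother. The spatial misalignment handling, the latent multivariate Matérn process across land cover types, and the moving-window smoother described in the abstract are all omitted; the series and variances below are synthetic.

```python
# Toy illustration only: univariate random-walk state-space model with a
# Kalman filter (forward) and RTS smoother (backward), fusing a daily
# coarse-resolution series with sparse fine-resolution observations at one
# pixel.  Not the paper's space-time dynamic linear model.
import numpy as np

rng = np.random.default_rng(0)

# --- simulate one year of daily "true" greenness and two sensors ----------
T = 365
truth = 0.5 + 0.3 * np.sin(2 * np.pi * np.arange(T) / 365)      # latent NDVI
modis = truth + rng.normal(0, 0.05, T)                           # daily, noisy
landsat = np.full(T, np.nan)
landsat_days = np.arange(0, T, 16)                               # ~16-day revisit
landsat[landsat_days] = truth[landsat_days] + rng.normal(0, 0.02, len(landsat_days))

# --- Kalman filter for x_t = x_{t-1} + w_t,  y_t = x_t + v_t ---------------
q = 1e-4                                  # process variance, assumed known here
r_modis, r_landsat = 0.05**2, 0.02**2     # sensor error variances, assumed known

m = np.zeros(T); P = np.zeros(T)          # filtered mean / variance
m_pred = np.zeros(T); P_pred = np.zeros(T)
x, v = 0.5, 1.0                           # vague initial state
for t in range(T):
    x_p, v_p = x, v + q                   # one-step prediction (random walk)
    m_pred[t], P_pred[t] = x_p, v_p
    for y, r in ((modis[t], r_modis), (landsat[t], r_landsat)):
        if not np.isnan(y):               # sequential update, one sensor at a time
            k = v_p / (v_p + r)
            x_p, v_p = x_p + k * (y - x_p), (1 - k) * v_p
    x, v = x_p, v_p
    m[t], P[t] = x, v

# --- Rauch-Tung-Striebel smoother (backward pass) --------------------------
ms, Ps = m.copy(), P.copy()
for t in range(T - 2, -1, -1):
    g = P[t] / P_pred[t + 1]
    ms[t] = m[t] + g * (ms[t + 1] - m_pred[t + 1])
    Ps[t] = P[t] + g**2 * (Ps[t + 1] - P_pred[t + 1])

print("smoothed RMSE:  ", np.sqrt(np.mean((ms - truth) ** 2)))
print("MODIS-only RMSE:", np.sqrt(np.mean((modis - truth) ** 2)))
```

Even in this toy setting, the smoothed posterior variance shrinks on Landsat dates and grows between them, which is the kind of pixel-level uncertainty the abstract describes attaching to the fused daily, 30-m product.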

 
NSF-PAR ID: 10396403
Author(s) / Creator(s): ; ;
Publisher / Repository: Oxford University Press
Date Published:
Journal Name: Journal of the Royal Statistical Society Series C: Applied Statistics
Volume: 70
Issue: 3
ISSN: 0035-9254
Page Range / eLocation ID: p. 793-812
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

    Due to climate change and rapid urbanization, the Urban Heat Island (UHI) effect, characterized by significantly higher temperatures in metropolitan areas than in surrounding areas, has caused negative impacts on urban communities. Temporal granularity is often limited in UHI studies based on satellite remote sensing data, which typically cover a particular urban area only once every several days. This low temporal frequency has restricted the development of models for predicting UHI. To resolve this limitation, this study developed a cyber-based geographic information science and systems (cyberGIS) framework encompassing multiple machine learning models for predicting UHI with high-frequency urban sensor network data combined with remote sensing data, focused on Chicago, Illinois, from 2018 to 2020. Enabled by rapid advances in urban sensor network technologies and high-performance computing, this framework is designed to predict UHI in Chicago with fine spatiotemporal granularity based on environmental data collected with the Array of Things (AoT) urban sensor network and Landsat-8 remote sensing imagery. Our computational experiments revealed that a random forest regression (RFR) model outperforms other models, with a prediction accuracy of 0.45 degrees Celsius in 2020 and 0.8 degrees Celsius in 2018 and 2019, using mean absolute error as the evaluation metric. Humidity, distance to the geographic center, and PM2.5 concentration are identified as important factors contributing to model performance. Furthermore, we estimate UHI in Chicago with 10-min temporal frequency and 1-km spatial resolution on the hottest day in 2018. It is demonstrated that the RFR model can accurately predict UHI at fine spatiotemporal scales with high-frequency urban sensor network data integrated with satellite remote sensing data.

     
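    A minimal sketch of the kind of random forest regression workflow described above, using scikit-learn on synthetic data; the covariate names (humidity, distance to the city center, PM2.5, NDVI) mirror factors listed in the abstract, but the data, features, and hyperparameters are illustrative assumptions, not the study's AoT/Landsat-8 pipeline.

```python
# Hedged sketch: predict a UHI temperature anomaly from sensor-network and
# satellite-derived covariates with a random forest, scored by MAE.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000
df = pd.DataFrame({
    "humidity": rng.uniform(20, 95, n),            # percent relative humidity
    "dist_to_center_km": rng.uniform(0, 30, n),    # distance to city center
    "pm25": rng.gamma(2.0, 6.0, n),                # PM2.5 concentration
    "ndvi": rng.uniform(-0.1, 0.8, n),             # satellite-derived greenness
})
# Synthetic target: cooler far from the center and under more vegetation.
df["uhi_c"] = (2.5 - 0.06 * df.dist_to_center_km - 1.5 * df.ndvi
               + 0.01 * df.pm25 + rng.normal(0, 0.4, n))

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="uhi_c"), df["uhi_c"], test_size=0.25, random_state=0)

rf = RandomForestRegressor(n_estimators=300, random_state=0, n_jobs=-1)
rf.fit(X_train, y_train)
print("MAE (deg C):", mean_absolute_error(y_test, rf.predict(X_test)))
print(dict(zip(X_train.columns, rf.feature_importances_.round(3))))
```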
  2. The movement of animals is strongly influenced by external factors in their surrounding environment such as weather, habitat types, and human land use. With advances in positioning and sensor technologies, it is now possible to capture animal locations at high spatial and temporal granularities. Likewise, modern space-based remote sensing technology provides increasing access to large volumes of environmental data, some of which change on an hourly basis. Environmental data are heterogeneous in source and format, and are usually obtained at different scales and granularities than movement data. Indeed, there remain scientific and technical challenges in developing linkages between the growing collections of animal movement data and the large repositories of heterogeneous remote sensing observations, as well as in the development of new statistical and computational methods for the analysis of movement in its environmental context. These challenges include retrieval, indexing, efficient storage, data integration, and analytic techniques. We have developed a new system, the Environmental-Data Automated Track Annotation (Env-DATA) system, that automates annotation of movement trajectories with remote sensing environmental information, including high-resolution topography, weather from global and regional reanalysis datasets, climatology, human geography, ocean currents and productivity, land use, vegetation and land surface variables, precipitation, fire, and other global datasets. The system automates the acquisition of data from open web resources of remote sensing and weather data and provides several interpolation methods from the native grid resolution and structure to a global regular grid linked with the movement tracks in space and time. Env-DATA provides an easy-to-use platform for end users that eliminates the technical difficulties of the annotation process, including data acquisition, transformation and integration, resampling, interpolation, and interpretation. The new Env-DATA system enhances Movebank (www.movebank.org), an open portal of animal tracking data. The aim is to facilitate new understanding and predictive capabilities of spatiotemporal patterns of animal movement in response to dynamic and changing environments from local to global scales. The system is already in use by scientists worldwide and by several conservation managers, such as the consortium of federal and private institutions that manages the endangered California Condor population.
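    A hedged sketch of the core annotation step that a system like Env-DATA automates: interpolating a gridded environmental variable onto animal track fixes in space and time. The synthetic grid, the toy track, and the choice of trilinear interpolation via SciPy are illustrative assumptions; the actual service supports many data products and several interpolation options not shown here.

```python
# Hedged sketch: trilinear interpolation of a gridded environmental field
# (time x lat x lon) onto GPS track fixes, the essence of track annotation.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

rng = np.random.default_rng(1)

# Synthetic environmental grid: 10 daily time steps on a 0.25-degree grid.
times = np.arange(10.0)                              # days since some epoch
lats = np.arange(35.0, 45.0, 0.25)
lons = np.arange(-110.0, -100.0, 0.25)
field = (15 + 10 * np.sin(2 * np.pi * times / 10)[:, None, None]
         + rng.normal(0, 1, (times.size, lats.size, lons.size)))

interp = RegularGridInterpolator((times, lats, lons), field,
                                 bounds_error=False, fill_value=np.nan)

# A toy GPS track: (time, lat, lon) fixes at sub-daily resolution.
track = np.column_stack([
    np.linspace(0.5, 9.5, 25),                       # timestamps
    np.linspace(36.0, 43.0, 25),                     # latitudes
    np.linspace(-109.0, -101.0, 25),                 # longitudes
])
annotated = interp(track)                            # one value per fix
for (t, la, lo), v in zip(track[:3], annotated[:3]):
    print(f"t={t:4.1f}  lat={la:6.2f}  lon={lo:8.2f}  value={v:5.2f}")
```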
  3. Abstract

    Understanding spatial and temporal variation in plant traits is needed to accurately predict how communities and ecosystems will respond to global change. The National Ecological Observatory Network's (NEON's) Airborne Observation Platform (AOP) provides hyperspectral images and associated data products at numerous field sites at 1 m spatial resolution, potentially allowing high-resolution trait mapping. We tested the accuracy of readily available data products of NEON's AOP, such as Leaf Area Index (LAI), Total Biomass, Ecosystem Structure (canopy height model [CHM]), and Canopy Nitrogen, by comparing them to spatially extensive field measurements from a mesic tallgrass prairie. AOP data products exhibited generally weak or no relationships with corresponding field measurements. The strongest relationships were between AOP LAI and ground-measured LAI (r = 0.32) and between AOP Total Biomass and ground-measured biomass (r = 0.23). We also examined how well the full reflectance spectra (380-2,500 nm), as opposed to derived products, could predict vegetation traits using partial least-squares regression (PLSR) models. Among the eight traits examined, only nitrogen had a validation R² above 0.25. Across all vegetation traits, validation R² ranged from 0.08 to 0.29 and the root mean square error of prediction (RMSEP) ranged from 14% to 64%. Our results suggest that currently available AOP-derived data products should not be used without extensive ground-based validation. Relationships using the full reflectance spectra may be more promising, although careful consideration of field and AOP data mismatches in space and/or time, biases in field-based measurements or AOP algorithms, and model uncertainty is needed. Finally, grassland sites may be especially challenging for airborne spectroscopy because of their high species diversity within a small area, mixed plant functional types, and heterogeneous mosaics of disturbance and resource availability. Remote sensing observations are one of the most promising approaches to understanding ecological patterns across space and time, but the opportunity to engage a diverse community of NEON data users will depend on establishing rigorous links with in-situ field measurements across a diversity of sites.

     
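    A minimal sketch of a PLSR trait model of the type evaluated above: predict a foliar trait from full-range reflectance spectra and report validation R² and %RMSEP. The simulated spectra, the band count, and the number of latent components are assumptions, not NEON AOP specifics.

```python
# Hedged sketch: partial least-squares regression from reflectance spectra to
# a foliar trait, with a held-out validation split.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(3)
n_plots, n_bands = 200, 426                     # ~426 bands spanning 380-2500 nm
spectra = rng.uniform(0.02, 0.6, (n_plots, n_bands))
# Synthetic trait loosely tied to a few "absorption" bands plus noise.
nitrogen = (2.0 + 3.0 * spectra[:, 120] - 2.0 * spectra[:, 300]
            + rng.normal(0, 0.3, n_plots))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, nitrogen,
                                          test_size=0.3, random_state=0)
pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

rmsep = np.sqrt(mean_squared_error(y_te, pred))
print(f"validation R2 = {r2_score(y_te, pred):.2f}")
print(f"RMSEP = {rmsep:.2f} ({100 * rmsep / y_te.mean():.0f}% of mean)")
```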
  4. Abstract

    Understanding and attributing changes to water quality is essential to the study and management of coastal ecosystems and the ecological functions they sustain (e.g., primary productivity, predation, and submerged aquatic vegetation growth). However, describing patterns of water clarity, a key aspect of water quality, over meaningful scales in space and time is challenged by high spatial and temporal variability due to natural and anthropogenic processes. Regionally tuned satellite algorithms can provide a more complete understanding of coastal water clarity changes and their drivers. In this study, we used open-access satellite data and low-cost in situ methods to improve estimates of water clarity in an optically complex coastal water body. Specifically, we created a remote sensing water clarity product by compiling Landsat-8 and Sentinel-2 reflectance data with long-term Secchi depth measurements at 12 sites over 8 years in a shallow, turbid coastal lagoon system in Virginia, USA. Our satellite-based model explained ∼33% of the variation in in situ water clarity. Our approach increases the spatiotemporal coverage of in situ water clarity data and improves estimates from bio-optical algorithms that overpredicted water clarity, which could lead to a better understanding of water clarity changes and their drivers and to better predictions of how water quality will change in the future.

     
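    A hedged sketch of a regionally tuned empirical clarity model in the spirit of the abstract above: regress in situ Secchi depth on a satellite reflectance band ratio and report explained variance. The blue/red log-ratio form and the synthetic site-date matchups are illustrative assumptions, not the study's published algorithm.

```python
# Hedged sketch: log-linear regression of Secchi depth on a reflectance ratio.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(7)
n = 300                                           # synthetic site-date matchups
rrs_blue = rng.uniform(0.002, 0.02, n)            # remote sensing reflectance
rrs_red = rng.uniform(0.002, 0.02, n)
ratio = np.log(rrs_blue / rrs_red)
secchi = np.exp(0.2 + 0.6 * ratio + rng.normal(0, 0.35, n))   # Secchi depth (m)

X = ratio.reshape(-1, 1)
model = LinearRegression().fit(X, np.log(secchi))
print(f"R2 of log-Secchi fit: {r2_score(np.log(secchi), model.predict(X)):.2f}")
print(f"slope = {model.coef_[0]:.2f}, intercept = {model.intercept_:.2f}")
```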
  5. Dense time-series remote sensing data with detailed spatial information are highly desired for monitoring dynamic earth systems. Due to sensor tradeoffs, most remote sensing systems cannot provide images with both high spatial and high temporal resolution. Spatiotemporal image fusion models provide a feasible solution for generating such satellite imagery, yet existing fusion methods are limited in predicting rapid and/or transient phenological changes. Additionally, spatiotemporal fusion research lacks a systematic approach to assessing how varying levels of temporal phenological change affect fusion results. The objective of this study is to develop an innovative hybrid deep learning model that can effectively and robustly fuse satellite imagery of various spatial and temporal resolutions. The proposed model integrates two types of network models: a super-resolution convolutional neural network (SRCNN) and long short-term memory (LSTM). SRCNN enhances the coarse images by restoring degraded spatial details, while LSTM learns and extracts temporal change patterns from the time-series images. To systematically assess the effects of varying levels of phenological change, we identify image phenological transition dates and design three scenarios representing rapid, moderate, and minimal phenological changes. The hybrid deep learning model, alongside three benchmark fusion models, is assessed under these scenarios. Results indicate that the hybrid deep learning model yields significantly better results when rapid or moderate phenological changes are present. It holds great potential for generating high-quality time-series datasets of both high spatial and temporal resolution, which can further benefit studies of terrestrial system dynamics. The innovative approach to understanding the effects of phenological change will help us better comprehend the strengths and weaknesses of current and future fusion models.
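    A minimal sketch of one way to combine the two network types named above, not the authors' architecture: an SRCNN-style network enhances each upsampled coarse frame, and an LSTM runs over the per-pixel time series of enhanced values to predict the next fine-resolution image. Treating pixels as independent sequences is a simplifying assumption made here for brevity.

```python
# Hedged PyTorch sketch of an SRCNN + LSTM hybrid for spatiotemporal fusion.
import torch
import torch.nn as nn

class SRCNN(nn.Module):
    # Classic 3-layer SRCNN (9-1-5 kernels): maps a bicubically upsampled
    # coarse image to a spatially enhanced image of the same size.
    def __init__(self, channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 64, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=1), nn.ReLU(),
            nn.Conv2d(32, channels, kernel_size=5, padding=2),
        )

    def forward(self, x):
        return self.net(x)

class HybridFusion(nn.Module):
    # SRCNN enhances each frame, then an LSTM runs over the per-pixel time
    # series of enhanced values to predict the next fine-resolution frame.
    def __init__(self, channels=1, hidden=32):
        super().__init__()
        self.srcnn = SRCNN(channels)
        self.lstm = nn.LSTM(input_size=channels, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, channels)

    def forward(self, coarse_seq):                 # (B, T, C, H, W), already upsampled
        b, t, c, h, w = coarse_seq.shape
        enhanced = self.srcnn(coarse_seq.reshape(b * t, c, h, w)).reshape(b, t, c, h, w)
        # Treat every pixel as an independent sequence of length T.
        pix = enhanced.permute(0, 3, 4, 1, 2).reshape(b * h * w, t, c)
        out, _ = self.lstm(pix)
        pred = self.head(out[:, -1, :])            # prediction for the next date
        return pred.reshape(b, h, w, c).permute(0, 3, 1, 2)

# Smoke test on random data standing in for a time series of upsampled images.
model = HybridFusion()
seq = torch.rand(2, 5, 1, 32, 32)                  # batch of 2, 5 dates, 32x32 pixels
print(model(seq).shape)                            # -> torch.Size([2, 1, 32, 32])
```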