

Title: Tsunami Wavefield Reconstruction and Forecasting Using the Ensemble Kalman Filter
Abstract

Offshore sensor networks like DONET and S‐NET, which provide real‐time estimates of wave height through measurements of pressure changes along the seafloor, are revolutionizing local tsunami early warning. Data assimilation techniques, in particular optimal interpolation (OI), provide real‐time wavefield reconstructions and forecasts. Here we explore an alternative assimilation method, the ensemble Kalman filter (EnKF), and compare it to OI. The methods are tested on a scenario tsunami in the Cascadia subduction zone, obtained from a 2‐D coupled dynamic earthquake and tsunami simulation. Data assimilation uses a 1‐D linear long‐wave model. We find that EnKF achieves more accurate and stable forecasts than OI, both at the coast and across the entire domain, especially for large station spacing. Although EnKF is more computationally expensive than OI, with continuing developments in high‐performance computing it is a promising candidate for real‐time local tsunami early warning.
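
A minimal sketch of the two update rules compared here, assuming a 1‐D wave‐height state observed at a handful of stations: OI applies a fixed, Gaussian‐shaped background covariance, while the (stochastic) EnKF estimates a flow‐dependent covariance from the forecast ensemble. The grid size, station layout, decorrelation length, and noise levels below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' code): OI with a fixed background
# covariance versus a stochastic EnKF with an ensemble-estimated covariance.
import numpy as np

rng = np.random.default_rng(0)
n, m, Ne = 200, 8, 50                      # grid points, stations, members
x = np.linspace(0.0, 200e3, n)             # 200 km 1-D domain (assumed)

# Observation operator: sample wave height at m seafloor stations.
idx = np.linspace(10, n - 11, m).astype(int)
H = np.zeros((m, n)); H[np.arange(m), idx] = 1.0
R = 0.01**2 * np.eye(m)                    # assumed 1 cm observation noise

# OI: static, Gaussian-shaped background covariance with length scale L.
L = 20e3
B = 0.1**2 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / L) ** 2)
B += 1e-10 * np.eye(n)                     # tiny nugget for stable sampling

def oi_update(xf, y):
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xf + K @ (y - H @ xf)

def enkf_update(Xf, y):
    A = Xf - Xf.mean(1, keepdims=True)     # ensemble anomalies (n x Ne)
    Pf = A @ A.T / (Ne - 1)                # flow-dependent covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
    Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, Ne).T
    return Xf + K @ (Y - H @ Xf)           # perturbed-observation update

# Toy test: a Gaussian wave pulse as truth, observed with noise.
truth = 0.5 * np.exp(-0.5 * ((x - 80e3) / 10e3) ** 2)
y = H @ truth + rng.normal(0.0, 0.01, m)
xa = oi_update(np.zeros(n), y)
Xa = enkf_update(rng.multivariate_normal(np.zeros(n), B, Ne).T, y)
print(f"OI   RMSE: {np.sqrt(np.mean((xa - truth) ** 2)):.4f} m")
print(f"EnKF RMSE: {np.sqrt(np.mean((Xa.mean(1) - truth) ** 2)):.4f} m")
```

In this toy setup the two updates differ only in where the background covariance comes from; the ensemble estimate is what allows the EnKF to adapt to the evolving wavefield, the regime the abstract highlights for large station spacing.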

 
NSF-PAR ID: 10457242
Publisher / Repository: DOI prefix 10.1029
Journal Name: Geophysical Research Letters
Volume: 46
Issue: 2
ISSN: 0094-8276
Page Range / eLocation ID: p. 853-860
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

    The Prediction of Rainfall Extremes Campaign In the Pacific (PRECIP) aims to improve our understanding of extreme rainfall processes in the East Asian summer monsoon. A convection-permitting ensemble-based data assimilation and forecast system (the PSU WRF-EnKF system) was run in real time in the summers of 2020–21 in advance of the 2022 field campaign, assimilating all-sky infrared (IR) radiances from the geostationary Himawari-8 and GOES-16 satellites and providing 48-h ensemble forecasts every day for weather briefings and discussions. This is the first time that all-sky IR data assimilation has been performed in a real-time forecast system at a convection-permitting resolution for several seasons. Compared with retrospective forecasts that exclude all-sky IR radiances, rainfall predictions are statistically significantly improved out to at least 4–6 h for the real-time forecasts, which is comparable to the time scale of improvements gained from assimilating observations from dense ground-based Doppler weather radars. The assimilation of all-sky IR radiances also reduced the forecast errors of large-scale environments and helped to maintain a more reasonable ensemble spread compared with the counterpart experiments that did not assimilate all-sky IR radiances. The results indicate strong potential for improving routine short-term quantitative precipitation forecasts using these high-spatiotemporal-resolution satellite observations in the future.

    Significance Statement

    During the summers of 2020–21, the PSU WRF-EnKF data assimilation and forecast system was run in real time in advance of the 2022 Prediction of Rainfall Extremes Campaign In the Pacific (PRECIP), assimilating all-sky (clear-sky and cloudy) infrared radiances from geostationary satellites into a numerical weather prediction model and providing ensemble forecasts. This study presents a first-of-its-kind systematic evaluation of the impacts of assimilating all-sky infrared radiances on short-term quantitative precipitation forecasts using multiyear, multiregion, real-time ensemble forecasts. Results suggest that rainfall forecasts are improved out to at least 4–6 h with the assimilation of all-sky infrared radiances, comparable to the influence of assimilating radar observations, with additional benefits in forecasting large-scale environments and representing atmospheric uncertainties.
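
    What makes ensemble assimilation of radiances tractable is that the Kalman gain can be built from ensemble covariances between the state and the simulated observations, so no tangent-linear version of the radiative-transfer operator is required. The sketch below illustrates this with a made-up nonlinear h(); a real system evaluates an operator such as CRTM, and every dimension here is an illustrative assumption.

```python
# Sketch under stated assumptions: an ensemble update with a nonlinear
# observation operator, formed purely from ensemble statistics. The quadratic
# h() is a stand-in for a radiative-transfer model, not a real one.
import numpy as np

rng = np.random.default_rng(1)
n, Ne = 5, 100                            # tiny state and ensemble (illustrative)

def h(col):
    # Hypothetical 'brightness temperature' computed from the model state.
    return np.array([col[0] ** 2 + 0.5 * col[2]])

Xf = rng.normal(1.0, 0.3, (n, Ne))        # forecast ensemble
Hx = np.concatenate([h(Xf[:, k])[:, None] for k in range(Ne)], axis=1)

A = Xf - Xf.mean(1, keepdims=True)        # state anomalies
D = Hx - Hx.mean(1, keepdims=True)        # simulated-observation anomalies
r = 0.2 ** 2                              # assumed observation-error variance
K = (A @ D.T / (Ne - 1)) @ np.linalg.inv(D @ D.T / (Ne - 1) + r * np.eye(1))

y = np.array([1.4])                       # one (made-up) radiance observation
Y = y[:, None] + rng.normal(0.0, 0.2, (1, Ne))   # perturbed observations
Xa = Xf + K @ (Y - Hx)                    # analysis ensemble
print("prior mean :", Xf.mean(1).round(3))
print("posterior  :", Xa.mean(1).round(3))
```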
  2. Abstract

    Although infrequent, large (Mw 7.5+) earthquakes can be extremely damaging and occur on subduction and intraplate faults worldwide. Earthquake early warning (EEW) systems aim to provide advance warning before strong shaking and tsunami onsets. These systems estimate earthquake magnitude from early metrics of the waveforms, relying on empirical scaling relationships derived from abundant past events. However, both the rarity and complexity of great events make them challenging to characterize, and EEW algorithms often underpredict magnitude and the resulting hazards. Here, we propose a model, M‐LARGE, that leverages deep learning to characterize crustal deformation patterns of large earthquakes for a specific region in real time. We demonstrate the algorithm in the Chilean subduction zone by training it with more than six million different simulated rupture scenarios recorded on the Chilean Global Navigation Satellite System (GNSS) network. M‐LARGE performs reliable magnitude estimation on the testing data set with an accuracy of 99%. Furthermore, the model successfully predicts the magnitude of five real Chilean earthquakes that occurred in the last 11 years. These events were damaging, large enough to be recorded by modern high-rate GNSS instruments, and provide valuable ground truth. M‐LARGE tracks the evolution of the source process and can make faster and more accurate magnitude estimates, significantly outperforming other similar EEW algorithms. This is the first demonstration of our approach. Future work toward generalization will include the addition of more training and testing data, interfacing with existing EEW methods, and applying the method to different tectonic settings to explore performance in those regions.
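
    The published M‐LARGE architecture is not reproduced here; as a hedged illustration of the general approach (a recurrent network regressing moment magnitude from multistation high-rate GNSS time series), the PyTorch sketch below uses placeholder shapes and layer sizes. The names n_stations, n_steps, and hidden are all assumptions for illustration.

```python
# Illustrative sketch only: a small recurrent regressor mapping multistation
# GNSS displacement time series to magnitude. Layer sizes and input shape are
# placeholders, not the published M-LARGE architecture.
import torch
import torch.nn as nn

n_stations, n_components, n_steps = 25, 3, 120   # hypothetical geometry

class MagnitudeNet(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        # Each time step carries 3-component displacement from every station.
        self.rnn = nn.LSTM(n_stations * n_components, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)         # regress a single Mw value

    def forward(self, x):                        # x: (batch, n_steps, features)
        out, _ = self.rnn(x)
        return self.head(out[:, -1, :]).squeeze(-1)  # estimate at latest step

model = MagnitudeNet()
batch = torch.randn(8, n_steps, n_stations * n_components)  # fake waveforms
print(model(batch).shape)   # torch.Size([8]) -- one Mw estimate per scenario
```

    Training such a model on simulated rupture scenarios, as the abstract describes, would pair each synthetic waveform tensor with its scenario magnitude and minimize a regression loss; applying it at successive time steps is one way to track the evolving source estimate.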

     
  3. Abstract

    For data assimilation to provide faithful state estimates for dynamical models, specifications of observation uncertainty need to be as accurate as possible. Innovation-based methods, such as the Desroziers diagnostics, are commonly used to estimate observation uncertainty, but such methods can depend greatly on the prescribed background uncertainty. For ensemble data assimilation, this uncertainty comes from statistics calculated from ensemble forecasts, which require inflation and localization to address undersampling. In this work, we use an ensemble Kalman filter (EnKF) with a low-dimensional Lorenz model to investigate the interplay between the Desroziers method and inflation. Two inflation techniques are used for this purpose: 1) a rigorously tuned fixed multiplicative scheme and 2) an adaptive state-space scheme. We document how inaccuracies in observation uncertainty affect errors in EnKF posteriors and study the combined impacts of misspecified initial observation uncertainty, sampling error, and model error on Desroziers estimates. We find that whether observation uncertainty is over- or underestimated greatly affects the stability of data assimilation and the accuracy of Desroziers estimates, and that preference should be given to initial overestimates. Inline Desroziers estimates tend to remove the dependence of the ensemble spread–skill relationship on the initially prescribed observation error. In addition, we find that the inclusion of model error introduces spurious correlations in observation uncertainty estimates. Further, we note that the adaptive inflation scheme is less robust than fixed inflation at mitigating multiple sources of error. Last, sampling error strongly exacerbates existing sources of error and greatly degrades EnKF estimates, which translates into biased Desroziers estimates of observation error covariance.
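
    For a linear analysis with background innovation d_b = y - H x_b and analysis residual d_a = y - H x_a, the Desroziers diagnostic states that a consistent system satisfies E[d_a d_b^T] ~ R, the observation-error covariance. The toy script below (not the paper's Lorenz experiments) verifies this for a scalar update and shows how a misspecified background variance biases the estimate; all variances and sample counts are illustrative.

```python
# Toy verification of the Desroziers relation E[d_a * d_b] ~ R for a scalar
# analysis with H = 1; all numbers are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(2)
sigma_b2, sigma_o2, N = 1.0, 0.25, 100_000   # background/obs variances, samples

truth = rng.normal(0.0, 1.0, N)
xb = truth + rng.normal(0.0, np.sqrt(sigma_b2), N)   # background states
y = truth + rng.normal(0.0, np.sqrt(sigma_o2), N)    # observations

K = sigma_b2 / (sigma_b2 + sigma_o2)     # correct scalar Kalman gain
xa = xb + K * (y - xb)                   # analyses

d_b = y - xb                             # background innovations
d_a = y - xa                             # analysis residuals
print("Desroziers R estimate:", np.mean(d_a * d_b).round(4))   # ~0.25

# A misspecified background variance biases the gain and hence the estimate,
# the sensitivity to prescribed uncertainty highlighted in the abstract.
K_bad = 0.5 * sigma_b2 / (0.5 * sigma_b2 + sigma_o2)
d_a_bad = y - (xb + K_bad * (y - xb))
print("With biased gain     :", np.mean(d_a_bad * d_b).round(4))   # ~0.42
```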

    Significance Statement

    To generate accurate predictions of various components of the Earth system, numerical models require an accurate specification of the state variables at the current time. This step probabilistically weighs our current state estimate against information provided by environmental measurements of the true state. Various strategies exist for estimating uncertainty in observations within this framework, but they are sensitive to a host of assumptions, which are investigated in this study.

     
  4. Abstract

    Solving the shallow water equations efficiently is critical to the study of natural hazards induced by tsunamis and storm surges, since it provides more response time in an early warning system and allows more runs to be done for probabilistic assessments, where thousands of runs may be required. Adaptive mesh refinement speeds up the process by greatly reducing computational demands, while accelerating the code with a graphics processing unit (GPU) does so through faster hardware. Combining both, we present an efficient CUDA implementation of GeoClaw, an open-source Godunov-type high-resolution finite volume numerical scheme on adaptive grids for the shallow water system with varying topography. The use of adaptive mesh refinement and spherical coordinates allows modeling of transoceanic tsunamis. Numerical experiments on the 2011 Japan tsunami and a local tsunami triggered by a hypothetical Mw 7.3 earthquake on the Seattle Fault illustrate the correctness and efficiency of the code, which implements a simplified dimensionally split version of the algorithms. Both numerical simulations are conducted on subregions of a sphere with adaptive grids that adequately resolve the propagating waves. The implementation is shown to be accurate and faster than the original when using central processing units (CPUs) alone. The GPU implementation, when running on a single GPU, is observed to be 3.6 to 6.4 times faster than the original model running in parallel on a 16-core CPU. Three metrics are proposed to evaluate the relative performance of the model, and they show efficient usage of hardware resources.
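
    GeoClaw's wave-propagation Riemann solvers, mesh refinement, and CUDA port are well beyond a few lines, but the underlying finite-volume update is compact. Below is a minimal 1-D sketch using a simple Rusanov (local Lax-Friedrichs) flux on a flat bottom; it is a stand-in for, not a reduction of, the paper's dimensionally split scheme, and the channel geometry and initial condition are made up.

```python
# Minimal sketch of a finite-volume step for the 1-D shallow water equations
# on a flat bottom, with a Rusanov flux instead of GeoClaw's wave-propagation
# Riemann solvers; no AMR, GPU acceleration, or source terms.
import numpy as np

g = 9.81

def flux(h, hu):
    u = hu / h
    return np.stack([hu, hu * u + 0.5 * g * h * h])

def step(h, hu, dx, dt):
    """One explicit update; q = (h, hu) are cell averages."""
    q = np.stack([h, hu])
    f = flux(h, hu)
    c = np.abs(hu / h) + np.sqrt(g * h)            # local wave speed
    a = np.maximum(c[:-1], c[1:])                  # interface speeds
    # Rusanov (local Lax-Friedrichs) interface flux
    F = 0.5 * (f[:, :-1] + f[:, 1:]) - 0.5 * a * (q[:, 1:] - q[:, :-1])
    qn = q.copy()                                  # boundary cells held fixed
    qn[:, 1:-1] -= dt / dx * (F[:, 1:] - F[:, :-1])
    return qn[0], qn[1]

# Dam-break-style initial condition on a 1-km channel (illustrative).
x = np.linspace(0.0, 1000.0, 401); dx = x[1] - x[0]
h = np.where(x < 500.0, 2.0, 1.0); hu = np.zeros_like(h)
dt = 0.4 * dx / (np.abs(hu / h).max() + np.sqrt(g * h.max()))  # CFL 0.4
for _ in range(200):
    h, hu = step(h, hu, dx, dt)
print("max depth after 200 steps:", h.max().round(3), "m")
```

    Schemes of this family update each cell from interface fluxes, which is what makes them both amenable to adaptive refinement and naturally parallel across cells on a GPU.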

     
  5. Abstract

    Models of bathymetry derived from satellite radar altimetry are essential for modeling many marine processes. They are affected by uncertainties that require quantification. We propose an uncertainty model that assumes errors are caused by the lack of high-wavenumber content within the altimetry data, and we apply the model to a tsunami hazard assessment. We build a bathymetry uncertainty model for northern Chile. Statistical properties of the altimetry-predicted bathymetry error are obtained using multibeam data. We find that a von Kármán correlation function and a Laplacian marginal distribution can be used to define an uncertainty model based on a random field. We also propose a method for generating synthetic bathymetry samples conditional on shipboard measurements. The method is further extended to account for interpolation uncertainties when bathymetry data resolution is finer than 10 km. We illustrate the usefulness of the method by quantifying the bathymetry-induced uncertainty of a tsunami hazard estimate. We demonstrate that tsunami leading-wave predictions at middle/near-field tide gauges and buoys are insensitive to bathymetry uncertainties in Chile. This result implies that tsunami early warning approaches can take full advantage of altimetry-predicted bathymetry in numerical simulations. Finally, we evaluate the feasibility of modeling uncertainties in regions without multibeam data by assessing the bathymetry error statistics of 15 globally distributed regions. We find that a general von Kármán correlation and a Laplacian marginal distribution can serve as a first-order approximation. The standard deviation of the uncertainty random field model varies regionally and is estimated from a proposed scaling law.
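
    As a sketch of this kind of random-field model, the snippet below draws a 1-D field with a von Kármán power spectrum and then imposes Laplacian marginals through a Gaussian copula transform. The spectral form, correlation length, Hurst exponent, and amplitude are placeholder assumptions, not the paper's fitted values, and the real model is conditioned on shipboard data, which this sketch omits.

```python
# Sketch: 1-D random field with an assumed von Karman spectrum and Laplacian
# marginals via a Gaussian copula; all parameters are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, dx = 2048, 500.0                 # samples, grid spacing in metres (assumed)
a, hurst, sigma = 20e3, 0.5, 50.0   # corr. length, Hurst number, std (assumed)

k = np.fft.rfftfreq(n, d=dx) * 2 * np.pi
psd = (1.0 + (k * a) ** 2) ** -(hurst + 0.5)   # assumed 1-D von Karman form

# Gaussian field with the target spectrum, synthesized in the wavenumber domain.
coef = rng.normal(size=k.size) + 1j * rng.normal(size=k.size)
field = np.fft.irfft(np.sqrt(psd) * coef, n)
field /= field.std()                           # unit-variance Gaussian field

# Copula transform: keep the correlation structure (approximately) while giving
# the marginals the heavy-tailed Laplacian shape reported for bathymetry error.
u = stats.norm.cdf(field)
error = stats.laplace.ppf(u, scale=sigma / np.sqrt(2))  # Laplace std = sigma
print("std ~", error.std().round(1), "m; excess kurtosis:",
      stats.kurtosis(error).round(2))
```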

     