

Title: Breaking Down the Computational Barriers to Real‐Time Urban Flood Forecasting
Abstract

Flooding impacts are on the rise globally, and concentrated in urban areas. Currently, there are no operational systems to forecast flooding at spatial resolutions that can facilitate emergency preparedness and response actions mitigating flood impacts. We present a framework for real‐time flood modeling and uncertainty quantification that combines the physics of fluid motion with advances in probabilistic methods. The framework overcomes the prohibitive computational demands of high‐fidelity modeling in real‐time by using a probabilistic learning method relying on surrogate models that are trained prior to a flood event. This shifts the overwhelming burden of computation to the trivial problem of data storage, and enables forecasting of both flood hazard and its uncertainty at scales that are vital for time‐critical decision‐making before and during extreme events. The framework has the potential to improve flood prediction and analysis and can be extended to other hazard assessments requiring intense high‐fidelity computations in real‐time.
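The offline-training / real-time-query split at the heart of this framework can be sketched in miniature. Everything below is illustrative rather than the authors' implementation: a hypothetical one-input "high-fidelity model", a polynomial surrogate fit to stored runs, and a synthetic ensemble of uncertain rainfall forecasts propagated through the surrogate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Offline stage (before any flood event): run the expensive model over a
# design of forcings and store the results. This toy "high-fidelity model"
# maps rainfall intensity (mm/h) to peak flood depth and stands in for a
# hydrodynamic solver that would take hours per run.
def high_fidelity_model(rain):
    return 0.8 * rain + 0.05 * rain**2

train_rain = np.linspace(0.0, 50.0, 25)        # design points
train_depth = high_fidelity_model(train_rain)  # stored simulation archive

# Cheap surrogate: a least-squares polynomial fit to the stored runs.
surrogate = np.poly1d(np.polyfit(train_rain, train_depth, deg=2))

# Real-time stage: propagate an ensemble of uncertain rainfall forecasts
# through the surrogate to get a probabilistic flood-depth forecast.
forecast_rain = rng.normal(30.0, 5.0, size=1000)
depths = surrogate(forecast_rain)
print(f"median depth {np.median(depths):.2f}, "
      f"90% interval [{np.percentile(depths, 5):.2f}, "
      f"{np.percentile(depths, 95):.2f}]")
```

A real application would train the surrogate over many storm parameters and a spatial flood model, but the division of labor is the same: heavy simulation offline, cheap probabilistic queries at forecast time.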

 
NSF-PAR ID:
10366666
Publisher / Repository:
DOI PREFIX: 10.1029
Date Published:
Journal Name:
Geophysical Research Letters
Volume:
48
Issue:
20
ISSN:
0094-8276
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Many existing models that predict landslide hazards utilize ground-based sources of precipitation data. In locations where ground-based precipitation observations are limited (i.e., a vast majority of the globe), or for landslide hazard models that assess regional or global domains, satellite multisensor precipitation products offer a promising near-real-time alternative to ground-based data. NASA’s global Landslide Hazard Assessment for Situational Awareness (LHASA) model uses the Integrated Multisatellite Retrievals for Global Precipitation Measurement (IMERG) product to issue hazard “nowcasts” in near–real time for areas that are currently at risk for landsliding. Satellite-based precipitation estimates, however, can contain considerable systematic bias and random error, especially over mountainous terrain and during extreme rainfall events. This study combines a precipitation error modeling framework with a probabilistic adaptation of LHASA. Compared with the routine version of LHASA, this probabilistic version correctly predicts more of the observed landslides in the study region with fewer false alarms by high hazard nowcasts. This study demonstrates that improvements in landslide hazard prediction can be achieved regardless of whether the IMERG error model is trained using abundant ground-based precipitation observations or using far fewer and more scattered observations, suggesting that the approach is viable in data-limited regions. Results emphasize the importance of accounting for both random error and systematic satellite precipitation bias. The approach provides an example of how environmental prediction models can incorporate satellite precipitation uncertainty. Other applications such as flood and drought monitoring and forecasting could likely benefit from consideration of precipitation uncertainty.
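    As a hedged illustration of the probabilistic step described above, the sketch below perturbs a single satellite rainfall estimate with an assumed multiplicative error model (the bias and spread values are invented, not IMERG's fitted error parameters) and converts the resulting ensemble into a probability of exceeding a landslide-triggering rainfall threshold.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical satellite rainfall estimate for one grid cell (mm); the
# bias and spread below are invented, not IMERG's fitted error parameters.
satellite_rain = 80.0
bias = 1.1    # assumed systematic underestimation by the satellite
sigma = 0.3   # assumed random-error spread in log space

# Ensemble of plausible "true" rainfall values under the error model.
ensemble = satellite_rain * bias * rng.lognormal(mean=0.0, sigma=sigma,
                                                 size=5000)

# Probabilistic nowcast: fraction of members exceeding an illustrative
# landslide-triggering rainfall threshold.
threshold = 100.0
p_hazard = float(np.mean(ensemble > threshold))
print(f"probability of exceeding the trigger: {p_hazard:.2f}")
```

    A deterministic nowcast would compare 80 mm against the 100 mm trigger and issue no warning; accounting for error yields a nonzero hazard probability instead of a hard miss.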
  2. Abstract

    Exposure to sea-level rise (SLR) and flooding will make some areas uninhabitable, and the increased demand for housing in safer areas may cause displacement through economic pressures. Anticipating such direct and indirect impacts of SLR is important for equitable adaptation policies. Here we build upon recent advances in flood exposure modeling and social vulnerability assessment to demonstrate a framework for estimating the direct and indirect impacts of SLR on mobility. Using two spatially distributed indicators of vulnerability and exposure, four specific modes of climate mobility are characterized: (1) minimally exposed to SLR (Stable), (2) directly exposed to SLR with capacity to relocate (Migrating), (3) indirectly exposed to SLR through economic pressures (Displaced), and (4) directly exposed to SLR without capacity to relocate (Trapped). We explore these dynamics within Miami-Dade County, USA, a metropolitan region with substantial social inequality and SLR exposure. Social vulnerability is estimated by cluster analysis using 13 social indicators at the census tract scale. Exposure is estimated under increasing SLR using a 1.5 m resolution compound flood hazard model accounting for inundation from high tides and rising groundwater and flooding from extreme precipitation and storm surge. Social vulnerability and exposure are intersected at the scale of residential buildings where exposed population is estimated by dasymetric methods. Under 1 m SLR, 56% of residents in areas of low flood hazard may experience displacement, whereas 26% of the population risks being trapped (19%) in or migrating (7%) from areas of high flood hazard, and concerns of depopulation and fiscal stress increase within at least 9 municipalities where 50% or more of their total population is exposed to flooding. As SLR increases from 1 to 2 m, the dominant flood driver shifts from precipitation to inundation, with population exposed to inundation rising from 2.8% to 54.7%. Understanding shifting geographies of flood risks and the potential for different modes of climate mobility can enable adaptation planning across household-to-regional scales.
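    The four mobility modes defined above amount to a two-indicator decision rule. A minimal sketch, assuming each location is reduced to binary exposure and capacity-to-relocate flags (the actual framework uses continuous vulnerability clusters and building-scale exposure):

```python
# Reduce each location to three flags; the study instead intersects
# cluster-based vulnerability with building-scale flood exposure.
def mobility_mode(exposed_to_slr: bool, indirect_pressure: bool,
                  can_relocate: bool) -> str:
    """Classify a location into one of the four climate-mobility modes."""
    if exposed_to_slr:
        return "Migrating" if can_relocate else "Trapped"
    if indirect_pressure:
        return "Displaced"
    return "Stable"

print(mobility_mode(True, False, True))    # directly exposed, can relocate
print(mobility_mode(False, True, False))   # only economic pressure
```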

     
  3. Storm surge flooding caused by tropical cyclones is a devastating threat to coastal regions, and this threat is growing due to sea-level rise (SLR). Therefore, accurate and rapid projection of the storm surge hazard is critical for coastal communities. This study focuses on developing a new framework that can rapidly predict storm surges under SLR scenarios for any synthetic storm of interest and assign it an occurrence probability. The framework leverages the Joint Probability Method with Response Surfaces (JPM-RS) for probabilistic hazard characterization, a storm surge machine learning model, and a SLR model. The JPM probabilities are based on historical tropical cyclone track observations. The storm surge machine learning model was trained based on high-fidelity storm surge simulations provided by the U.S. Army Corps of Engineers (USACE). The SLR was considered by adding the product of the normalized nonlinearity, arising from surge-SLR interaction, and the sea-level change from 1992 to the target year, where nonlinearities are based on high-fidelity storm surge simulations and subsequent analysis by USACE. In this study, this framework was applied to the Chesapeake Bay region of the U.S. and used to estimate the SLR-adjusted probabilistic tropical cyclone flood hazard in two areas: One is an urban Virginia site, and the other is a rural Maryland site. This new framework has the potential to aid in reducing future coastal storm risks in coastal communities by providing robust and rapid hazard assessment that accounts for future sea-level rise.
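    The SLR adjustment described above is additive: the surrogate's surge prediction plus the normalized nonlinearity times the sea-level change from 1992 to the target year. A sketch with an invented surrogate and an illustrative nonlinearity factor (real values come from USACE high-fidelity simulations):

```python
import numpy as np

# Invented surrogate: surge from storm intensity and track distance.
# The study's model is instead trained on USACE high-fidelity runs.
def surge_ml_model(intensity, track_distance_km):
    return 3.0 * np.exp(-track_distance_km / 50.0) * (intensity / 100.0)

def slr_adjusted_surge(intensity, track_distance_km, slr_change_m,
                       nonlinearity=1.05):
    # adjusted surge = ML surge + normalized nonlinearity * sea-level
    # change from 1992 to the target year (factor here is illustrative)
    return surge_ml_model(intensity, track_distance_km) \
        + nonlinearity * slr_change_m

print(f"{slr_adjusted_surge(120.0, 25.0, slr_change_m=0.5):.2f} m")
```

    A nonlinearity factor above 1 encodes that surge and SLR amplify each other; a factor of exactly 1 would reduce the adjustment to simple addition of the sea-level change.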
  4. Abstract

    Accurate delineation of compound flood hazard requires joint simulation of rainfall‐runoff and storm surges within high‐resolution flood models, which may be computationally expensive. There is a need for supplementing physical models with efficient, probabilistic methodologies for compound flood hazard assessment that can be applied under a range of climate and environment conditions. Here we propose an extension to the joint probability optimal sampling method (JPM‐OS), which has been widely used for storm surge assessment, and apply it for rainfall‐surge compound hazard assessment under climate change at the catchment‐scale. We utilize thousands of synthetic tropical cyclones (TCs) and physics‐based models to characterize storm surge and rainfall hazards at the coast. Then we implement a Bayesian quadrature optimization approach (JPM‐OS‐BQ) to select a small number (∼100) of storms, which are simulated within a high‐resolution flood model to characterize the compound flood hazard. We show that the limited JPM‐OS‐BQ simulations can capture historical flood return levels within 0.25 m compared to a high‐fidelity Monte Carlo approach. We find that the combined impact of 2100 sea‐level rise (SLR) and TC climatology changes on flood hazard change in the Cape Fear Estuary, NC will increase the 100‐year flood extent by 27% and increase inundation volume by 62%. Moreover, we show that probabilistic incorporation of SLR in the JPM‐OS‐BQ framework leads to different 100‐year flood maps compared to using a single mean SLR projection. Our framework can be applied to catchments across the United States Atlantic and Gulf coasts under a variety of climate and environment scenarios.
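    Underlying both the Monte Carlo benchmark and the JPM‐OS‐BQ variant is the joint-probability hazard sum: each synthetic storm carries an annual occurrence rate, and the exceedance rate of a given flood level is the rate-weighted count of storms exceeding it. The sketch below computes that sum over a made-up storm set with a stand-in flood response; the Bayesian-quadrature storm selection itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Joint probability method in miniature: hazard is an annual-rate-weighted
# sum over synthetic storms. Storm parameters, rates, and the flood
# response are all invented for illustration.
n_storms = 2000
intensity = rng.gamma(shape=2.0, scale=20.0, size=n_storms)  # e.g. winds
rate = np.full(n_storms, 0.05 / n_storms)  # annual rate per storm

def flood_response(x):
    # stand-in for the high-resolution flood model (peak level, m)
    return 0.04 * x

levels = flood_response(intensity)

def annual_exceedance(z):
    # lambda(level > z): summed rates of storms exceeding level z
    return float(np.sum(rate[levels > z]))

# 100-year return level: smallest z whose exceedance rate falls below 1/100.
z_grid = np.linspace(0.0, levels.max(), 500)
rates = np.array([annual_exceedance(z) for z in z_grid])
z100 = z_grid[np.argmax(rates < 1.0 / 100.0)]
print(f"approximate 100-year flood level: {z100:.2f} m")
```

    The JPM‐OS‐BQ step replaces the full storm set in this sum with roughly 100 quadrature-weighted storms, which is what makes the high-resolution flood simulations affordable.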

  5. Abstract

    A novel modeling framework that simultaneously improves accuracy, predictability, and computational efficiency is presented. It embraces the benefits of three modeling techniques integrated together for the first time: surrogate modeling, parameter inference, and data assimilation. The use of polynomial chaos expansion (PCE) surrogates significantly decreases computational time. Parameter inference allows for faster model convergence, reduced uncertainty, and superior accuracy of simulated results. Ensemble Kalman filters assimilate observations to correct errors that arise during forecasting. To examine the applicability and effectiveness of the integrated framework, we developed 18 approaches according to how surrogate models are constructed, what type of parameter distributions are used as model inputs, and whether model parameters are updated during the data assimilation procedure. We conclude that (1) PCE must be built over various forcing and flow conditions, and in contrast to previous studies, it does not need to be rebuilt at each time step; (2) model parameter specification that relies on constrained, posterior information of parameters (so‐called Selected specification) can significantly improve forecasting performance and reduce uncertainty bounds compared to Random specification using prior information of parameters; and (3) no substantial differences in results exist between single and dual ensemble Kalman filters, but the latter better simulates flood peaks. The use of PCE effectively compensates for the computational load added by the parameter inference and data assimilation (up to ~80 times faster). Therefore, the presented approach contributes to a shift in the modeling paradigm, arguing that complex, high‐fidelity hydrologic and hydraulic models should be increasingly adopted for real‐time and ensemble flood forecasting.
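    The assimilation step can be illustrated with a scalar, perturbed-observation ensemble Kalman filter update (all numbers illustrative; in the paper the forecast ensemble would come from the PCE surrogate rather than a random draw):

```python
import numpy as np

rng = np.random.default_rng(1)

# Scalar perturbed-observation EnKF update with illustrative numbers:
# a forecast ensemble of river stage is corrected toward a gauge reading.
n_ens = 200
forecast = rng.normal(4.0, 0.6, size=n_ens)  # forecast ensemble (m)
obs = 4.8                                    # observed stage (m)
obs_err_sd = 0.1

# Kalman gain from the ensemble forecast variance and observation error.
p_f = np.var(forecast, ddof=1)
gain = p_f / (p_f + obs_err_sd**2)

# Each member assimilates a perturbed copy of the observation, which
# preserves the correct analysis spread across the ensemble.
perturbed_obs = obs + rng.normal(0.0, obs_err_sd, size=n_ens)
analysis = forecast + gain * (perturbed_obs - forecast)

print(f"forecast mean {forecast.mean():.2f} m -> "
      f"analysis mean {analysis.mean():.2f} m")
```

    The update both pulls the ensemble mean toward the observation and shrinks its spread, which is the error-correction behavior the framework relies on between forecast cycles.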
