Abstract Compound flooding, characterized by the co-occurrence of multiple flood mechanisms, is a major threat to coastlines across the globe. Tropical cyclones (TCs) are responsible for many compound floods due to their storm surge and intense rainfall. Previous efforts to quantify compound flood hazard have typically adopted statistical approaches that may be unable to fully capture the spatio-temporal dynamics between rainfall-runoff and storm surge, which ultimately impact total water levels. In contrast, we propose a physics-driven approach that utilizes a large set of realistic TC events and a simplified physics-based rainfall model and simulates each event within a hydrodynamic model framework. We apply our approach to investigate TC flooding in the Cape Fear River, NC. We find that TC approach angle, forward speed, and intensity are relevant for compound flood potential, but rainfall rate and the time lag between the centroid of rainfall and the peak storm tide are the strongest predictors of compounding magnitude. Neglecting rainfall underestimates 100-year flood depths across 28% of the floodplain, and taking the maximum of each hazard modeled separately still underestimates depths across 16% of the floodplain. We find that the main stem of the river is surge-dominated and that upstream portions of small streams and pluvial areas are rainfall-dominated, whereas midstream portions of streams are compounding zones; areas close to the coastline are surge-dominated at lower return periods but become compounding zones at high return periods (100 years). Our method links joint rainfall-surge occurrence to actual flood impacts and demonstrates how compound flooding is distributed across coastal catchments.
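As a small illustration of the lag predictor highlighted above, the sketch below computes the offset between the rainfall centroid and the peak storm tide for a single event from its time series. This is not the authors' code; the function name, array names, hourly time step, and synthetic example series are assumptions.

```python
# Minimal sketch (not the paper's code): lag between the rainfall centroid
# and the peak storm tide for one synthetic TC event. The hourly time step
# and the example series below are illustrative assumptions.
import numpy as np

def rainfall_surge_lag(rain_mm_per_hr, storm_tide_m, dt_hr=1.0):
    """Return the lag (hours) between the rainfall centroid and peak storm tide.

    Positive values mean the rainfall is centered after the storm-tide peak.
    """
    t = np.arange(len(rain_mm_per_hr)) * dt_hr
    # Time centroid of the rainfall hyetograph (rainfall-weighted mean time).
    rain_centroid = np.sum(t * rain_mm_per_hr) / np.sum(rain_mm_per_hr)
    # Time of the peak storm tide.
    t_peak_tide = t[np.argmax(storm_tide_m)]
    return rain_centroid - t_peak_tide

# Example with synthetic series: rainfall centered about 6 h after the tide peak.
hours = np.arange(48)
rain = 20.0 * np.exp(-0.5 * ((hours - 30) / 4.0) ** 2)        # mm/hr
tide = 1.0 + 2.5 * np.exp(-0.5 * ((hours - 24) / 3.0) ** 2)   # m
print(f"lag = {rainfall_surge_lag(rain, tide):+.1f} h")
```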
This content will become publicly available on June 25, 2026
A cluster-based temporal attention approach for predicting cyclone-induced compound flood dynamics
Deep learning (DL) models have been used for rapid assessments of environmental phenomena, such as mapping compound flood hazards from cyclones. However, predicting compound flood dynamics (e.g., flood extent and inundation depth over time) is often done with physically-based models because they capture physical drivers, nonlinear interactions, and hysteresis in system behavior. Here, we show that a customized DL model can efficiently learn the spatiotemporal dependencies of multiple flood events in Galveston, TX. The proposed model combines the spatial feature extraction of a CNN, the temporal regression of an LSTM, and a novel cluster-based temporal attention approach to assimilate multimodal inputs, thereby accurately replicating the compound flood dynamics of physically-based models. The DL model achieves satisfactory flood timing (±1 h), a critical success index above 60%, RMSE below 0.10 m, and a nearly perfect error bias of 1. These results demonstrate the model's potential to assist flood preparation and response efforts in vulnerable coastal regions.
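For readers unfamiliar with this family of architectures, the following is a minimal, generic CNN-LSTM sketch with ordinary temporal attention in PyTorch. It is not the paper's cluster-based temporal attention model; the layer sizes, input channels, grid dimensions, and single-output-map design are all assumptions.

```python
# Generic CNN-LSTM-with-temporal-attention sketch, only to illustrate the kind
# of architecture the abstract describes. NOT the paper's cluster-based model;
# all shapes and layer sizes are assumptions.
import torch
import torch.nn as nn

class CNNLSTMAttention(nn.Module):
    def __init__(self, in_channels=3, cnn_feat=32, hidden=64, grid=(64, 64)):
        super().__init__()
        # CNN: per-time-step spatial feature extraction from gridded inputs
        # (e.g., rainfall, surge boundary forcing, elevation as channels).
        self.cnn = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, cnn_feat, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.lstm = nn.LSTM(cnn_feat * 16, hidden, batch_first=True)
        # Temporal attention: score each time step's hidden state.
        self.attn = nn.Linear(hidden, 1)
        # Decode the attended context into a flattened inundation-depth map.
        self.head = nn.Linear(hidden, grid[0] * grid[1])
        self.grid = grid

    def forward(self, x):
        # x: (batch, time, channels, H, W)
        b, t = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1)).flatten(1).view(b, t, -1)
        h, _ = self.lstm(feats)                   # (b, t, hidden)
        w = torch.softmax(self.attn(h), dim=1)    # attention weights over time
        context = (w * h).sum(dim=1)              # (b, hidden)
        return self.head(context).view(b, *self.grid)  # depth map

# Smoke test with random data: 2 events, 12 time steps each.
model = CNNLSTMAttention()
x = torch.randn(2, 12, 3, 64, 64)
print(model(x).shape)   # torch.Size([2, 64, 64])
```

A model aimed at full flood dynamics would decode a depth map at every time step rather than the single attended map shown here; the sketch only indicates how the CNN, LSTM, and attention pieces fit together.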
- Award ID(s): 2223893
- PAR ID: 10640506
- Publisher / Repository: Elsevier
- Date Published:
- Journal Name: Environmental Modelling & Software
- Volume: 191
- ISSN: 1873-6726
- Page Range / eLocation ID: 106499
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Abstract Accurate delineation of compound flood hazard requires joint simulation of rainfall-runoff and storm surges within high-resolution flood models, which may be computationally expensive. There is a need for supplementing physical models with efficient, probabilistic methodologies for compound flood hazard assessment that can be applied under a range of climate and environmental conditions. Here we propose an extension to the joint probability optimal sampling method (JPM-OS), which has been widely used for storm surge assessment, and apply it for rainfall-surge compound hazard assessment under climate change at the catchment scale. We utilize thousands of synthetic tropical cyclones (TCs) and physics-based models to characterize storm surge and rainfall hazards at the coast. Then we implement a Bayesian quadrature optimization approach (JPM-OS-BQ) to select a small number (∼100) of storms, which are simulated within a high-resolution flood model to characterize the compound flood hazard. We show that the limited JPM-OS-BQ simulations can capture historical flood return levels within 0.25 m compared to a high-fidelity Monte Carlo approach. We find that in the Cape Fear Estuary, NC, the combined impact of 2100 sea-level rise (SLR) and changes in TC climatology will increase the 100-year flood extent by 27% and increase inundation volume by 62%. Moreover, we show that probabilistic incorporation of SLR in the JPM-OS-BQ framework leads to different 100-year flood maps compared to using a single mean SLR projection. Our framework can be applied to catchments across the United States Atlantic and Gulf coasts under a variety of climate and environmental scenarios.
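The return-level step of a joint probability analysis can be illustrated with a short, self-contained sketch. This is not the authors' JPM-OS-BQ code; the synthetic storm set, the uniform per-event rates, and the Gumbel-distributed peaks below are purely illustrative assumptions.

```python
# Minimal sketch (assumptions, not the authors' JPM-OS-BQ code): estimating
# flood return levels from synthetic TC events, each carrying an annualized
# occurrence rate, as in joint probability method (JPM) analyses.
import numpy as np

def return_levels(peak_water_levels_m, event_rates_per_yr, return_periods_yr):
    """Interpolate water levels at the requested return periods.

    The annual exceedance rate of level z is the summed rate of all events
    whose simulated peak exceeds z; the return period is its reciprocal.
    """
    order = np.argsort(peak_water_levels_m)[::-1]            # highest first
    levels = np.asarray(peak_water_levels_m)[order]
    exceed_rate = np.cumsum(np.asarray(event_rates_per_yr)[order])
    target_rates = 1.0 / np.asarray(return_periods_yr)
    # Interpolate level as a function of exceedance rate (both monotonic).
    return np.interp(target_rates, exceed_rate, levels)

# Example: 5,000 synthetic storms, each representing ~1/500 yr of TC activity.
rng = np.random.default_rng(1)
peaks = rng.gumbel(loc=1.5, scale=0.6, size=5000)            # peak levels (m)
rates = np.full(5000, 1.0 / 500.0)                           # events per year
print(return_levels(peaks, rates, [10, 50, 100]))
```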
Abstract. Systematic biases and coarse resolutions are major limitations of current precipitation datasets. Many deep learning (DL)-based studies have been conducted for precipitation bias correction and downscaling. However, it is still challenging for current approaches to handle the complex features of hourly precipitation, resulting in an inability to reproduce small-scale features such as extreme events. This study developed a customized DL model by incorporating customized loss functions, multitask learning, and physically relevant covariates to bias correct and downscale hourly precipitation data. We designed six scenarios to systematically evaluate the added value of weighted loss functions, multitask learning, and atmospheric covariates compared to regular DL and statistical approaches. The models were trained and tested using the Modern-Era Retrospective analysis for Research and Applications version 2 (MERRA2) reanalysis and Stage IV radar observations over the northern coastal region of the Gulf of Mexico on an hourly time scale. We found that all scenarios with weighted loss functions performed notably better than the scenarios with conventional loss functions and a quantile mapping-based approach at hourly, daily, and monthly time scales, as well as for extremes. Multitask learning showed improved performance in capturing fine features of extreme events, and accounting for atmospheric covariates substantially improved model performance at hourly and aggregated time scales, although the improvement was not as large as that from the weighted loss functions. We show that the customized DL model can better downscale and bias correct hourly precipitation datasets and provide improved precipitation estimates at fine spatial and temporal resolutions where regular DL and statistical methods experience challenges.
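To make the weighted-loss idea concrete, here is a minimal hedged sketch of an intensity-weighted mean squared error in PyTorch. The specific weighting form and the alpha parameter are assumptions for illustration, not the loss used in the study.

```python
# Hedged sketch of an intensity-weighted loss for precipitation downscaling:
# pixels with heavier observed rain get larger weights. The linear weighting
# and alpha value below are assumptions, not the study's exact loss.
import torch

def weighted_mse(pred, obs, alpha=0.1):
    """MSE where each pixel's error is weighted by 1 + alpha * observed rain."""
    weights = 1.0 + alpha * obs.clamp(min=0.0)   # heavier rain -> larger weight
    return (weights * (pred - obs) ** 2).mean()

# Example on random fields shaped (batch, H, W), units mm/hr.
obs = torch.relu(torch.randn(8, 64, 64)) * 5.0
pred = obs + torch.randn_like(obs) * 0.5
print(weighted_mse(pred, obs).item())
```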
Abstract Flood nowcasting refers to near-future prediction of flood status as an extreme weather event unfolds, in order to enhance situational awareness. The objective of this study was to adopt and test a novel structured deep-learning model for urban flood nowcasting that integrates physics-based and human-sensed features. We present a new computational modeling framework that combines an attention-based spatial-temporal graph convolution network (ASTGCN) model with different streams of data collected in real time, preprocessed, and fed into the model to capture the spatial and temporal information and dependencies that improve flood nowcasting. The novelty of the framework is threefold: first, the model considers spatial and temporal dependencies in inundation propagation through its spatial and temporal graph convolutional modules; second, it captures the influence on flood nowcasting of heterogeneous temporal data streams that can signal flooding status, including physics-based features (e.g., rainfall intensity and water elevation) and human-sensed data (e.g., residents' flood reports and fluctuations in human activity); third, its attention mechanism directs the model's focus to the most influential features, which vary dynamically. We demonstrate the framework using Harris County, Texas, as the study area and 2017 Hurricane Harvey as the flood event. Three categories of features are used for nowcasting the extent of flood inundation in different census tracts: (i) static features that capture the spatial characteristics of locations and influence their flood-status similarity, (ii) physics-based dynamic features that capture changes in hydrodynamic variables, and (iii) heterogeneous human-sensed dynamic features that capture aspects of residents' activities that can provide information regarding flood status. Results indicate that the ASTGCN model provides superior performance for nowcasting urban flood inundation at the census-tract level, with precision of 0.808 and recall of 0.891, outperforming other state-of-the-art models. Moreover, ASTGCN performance improves when heterogeneous dynamic features are added to a model that relies solely on physics-based features, which demonstrates the promise of using heterogeneous human-sensed data for flood nowcasting. Given these model comparisons, the proposed framework merits further investigation as more data from historical events become available, with the goal of developing a predictive tool that provides community responders with enhanced predictions of flood inundation during urban floods.
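The sketch below shows a heavily simplified spatial-temporal graph block with temporal attention, only to indicate the flavor of architecture the abstract describes. It is not the ASTGCN implementation used in the study; the adjacency normalization, layer sizes, and GRU-based temporal module are assumptions.

```python
# Simplified, generic spatial-temporal graph block (not the study's ASTGCN).
# Census tracts are graph nodes; the row-normalized adjacency, feature sizes,
# and single-block design are assumptions.
import torch
import torch.nn as nn

class SimpleSTBlock(nn.Module):
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.spatial = nn.Linear(n_features, hidden)   # shared node transform
        self.temporal = nn.GRU(hidden, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)               # temporal attention
        self.out = nn.Linear(hidden, 1)                # flood probability

    def forward(self, x, adj):
        # x: (batch, time, nodes, features); adj: (nodes, nodes), row-normalized.
        h = torch.relu(adj @ self.spatial(x))          # neighborhood mixing
        b, t, n, d = h.shape
        h = h.permute(0, 2, 1, 3).reshape(b * n, t, d) # per-node sequences
        h, _ = self.temporal(h)
        w = torch.softmax(self.attn(h), dim=1)         # attend over time
        ctx = (w * h).sum(dim=1).view(b, n, -1)
        return torch.sigmoid(self.out(ctx)).squeeze(-1)  # (batch, nodes)

# Smoke test: 2 samples, 6 time steps, 10 tracts, 4 features each.
adj = torch.ones(10, 10) / 10.0
x = torch.randn(2, 6, 10, 4)
print(SimpleSTBlock(4)(x, adj).shape)    # torch.Size([2, 10])
```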
Abstract. In coastal regions, compound flooding can arise from a combination of different drivers such as storm surges, high tides, excess river discharge, and rainfall. Compound flood potential is often assessed by quantifying the dependence and joint probabilities of the flood drivers using multivariate models. However, most of these studies assume that all extreme events originate from a single population. This assumption may not be valid for regions where flooding can arise from different generation processes, e.g., tropical cyclones (TCs) and extratropical cyclones (ETCs). Here we present a flexible copula-based statistical framework to assess compound flood potential from multiple flood drivers while explicitly accounting for different storm types. The proposed framework is applied to Gloucester City, New Jersey, and St. Petersburg, Florida, as case studies. Our results highlight the importance of characterizing the contributions from TCs and non-TCs separately to avoid potential underestimation of compound flood potential. In both study regions, TCs modulate the tails of the joint distributions (events with higher return periods), while non-TC events have a strong effect on events with low to moderate joint return periods. We show that relying solely on TCs may be inadequate when estimating compound flood risk in coastal catchments that are also exposed to other storm types. We also assess the impact of non-classified storms that are linked to neither TCs nor ETCs in the region (such as locally generated convective rainfall events and remotely forced storm surges). The presented study utilizes historical data and analyzes two populations, but the framework is flexible and can be extended to account for additional storm types (e.g., storms with certain tracks or other characteristics) or used with model output data, including hindcasts or future projections.
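As a rough illustration of combining storm populations in a copula-based joint-exceedance calculation, the sketch below mixes TC and non-TC contributions by their annual event rates under a Gaussian copula. The copula family, rates, correlations, and thresholds are assumptions for illustration and do not reproduce the study's framework.

```python
# Hedged sketch of a two-population joint-exceedance calculation: a Gaussian
# copula per storm type (TC vs. non-TC), combined by annual event rates.
# All numbers and the Gaussian copula choice are illustrative assumptions.
import numpy as np
from scipy import stats

def joint_exceedance_prob(rho, u_surge, u_rain):
    """P(U1 > u_surge, U2 > u_rain) under a Gaussian copula with correlation rho."""
    z = stats.norm.ppf([u_surge, u_rain])
    cov = [[1.0, rho], [rho, 1.0]]
    # Survival probability via inclusion-exclusion on the bivariate normal CDF.
    cdf = stats.multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf(z)
    return 1.0 - u_surge - u_rain + cdf

# Annual rate that BOTH drivers exceed their marginal 90th percentiles,
# combining TC and non-TC populations with different dependence strengths.
rate_tc, rate_ntc = 0.4, 5.0          # events per year (assumed)
rho_tc, rho_ntc = 0.7, 0.2            # stronger surge-rain dependence for TCs
u = 0.90
annual_rate = (rate_tc * joint_exceedance_prob(rho_tc, u, u)
               + rate_ntc * joint_exceedance_prob(rho_ntc, u, u))
print(f"joint exceedance: ~once every {1.0 / annual_rate:.1f} years")
```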