Abstract: In the aftermath of Hurricane Ike in 2008 in the United States, the “Ike Dike” was proposed as a coastal barrier system, featuring floodgates, to protect the Houston‐Galveston area (HGA) from future storm surges. Given its substantial costs, the feasibility and effectiveness of the Ike Dike have been subjects of investigation. In this study, we evaluated these aspects under both present and future climate conditions by simulating storm surges using a set of models. Delft3D Flexible Mesh Suite was utilized to simulate hydrodynamic and wave motions driven by hurricanes, with wind and pressure fields spatialized by the Holland model. The models were validated against data from Hurricane Ike and were used to simulate synthetic hurricane tracks downscaled from several general circulation models and based on different sea level rise projections, both with and without the Ike Dike. Flood maps for each simulation were generated, and probabilistic flood depths for specific annual exceedance probabilities were predicted using annual maxima flood maps. Building damage curves were applied to residential properties in the HGA to calculate flood damage for each exceedance probability, resulting in estimates of expected annual damage as a measure of quantified flood risk. Our findings indicate that the Ike Dike significantly mitigates storm surge risk in the HGA, demonstrating its feasibility and effectiveness. We also found that the flood risk estimates are sensitive to hurricane intensity, the choice of damage curve, and the properties included in the analysis, suggesting that careful consideration is needed in future studies.
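The expected-annual-damage (EAD) step described in the abstract can be sketched as follows. This is a minimal illustration of the standard damage-integration idea, not the study's actual workflow; the AEP and damage values are hypothetical placeholders.

```python
# Hedged sketch: expected annual damage (EAD) as the area under the
# damage-vs-annual-exceedance-probability (AEP) curve. All numbers below
# are illustrative, not results from the study.
def expected_annual_damage(aep, damage):
    """Trapezoidal integration of damage over exceedance probability."""
    pts = sorted(zip(aep, damage))           # ascending AEP
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pts, pts[1:]):
        ead += 0.5 * (d0 + d1) * (p1 - p0)
    return ead

# Hypothetical damages for the 10%, 2%, 1%, and 0.2% AEP events
ead = expected_annual_damage([0.10, 0.02, 0.01, 0.002],
                             [1e6, 2e7, 5e7, 2e8])
```

In practice the curve would be built from the probabilistic flood-depth maps and building damage curves described above, with one damage total per exceedance probability.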
Bayesian Multimodal Models for Risk Analyses of Low-Probability High-Consequence Events
This paper reviews Bayesian model updating methodologies for quantifying uncertainty in multimodal models that estimate failure probabilities in rare hazard events. Specifically, a two-stage Bayesian regression model is proposed to fuse an analytical capacity model with experimentally observed capacity data and predict the failure probability of residential building roof systems under severe wind loading. The ultimate goal is to construct fragility models that account for uncertainty due to model inadequacy (epistemic uncertainty) and lack of experimental data (aleatory uncertainty) when estimating failure (exceedance) probabilities and the number of damaged buildings in building portfolios. The proposed approach is illustrated on a case study of a sample residential building portfolio under scenario hurricanes, comparing exceedance probabilities and aggregate expected losses to identify the most cost-effective wind mitigation options.
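The paper's two-stage regression is more involved, but the core Bayesian-updating idea, fusing a model-based prior with observed pass/fail test outcomes, can be illustrated with a simple conjugate update. All numbers here are hypothetical.

```python
# Minimal illustration of Bayesian updating of a failure probability
# (a conjugate Beta-Binomial toy model, NOT the paper's two-stage
# regression). A Beta prior encodes the analytical capacity model's
# prediction; observed roof-test outcomes update it.
def update_failure_probability(alpha0, beta0, n_failed, n_survived):
    """Posterior Beta(alpha, beta) after observing test outcomes."""
    alpha = alpha0 + n_failed
    beta = beta0 + n_survived
    mean = alpha / (alpha + beta)   # posterior mean failure probability
    return alpha, beta, mean

# Hypothetical prior centred at 0.2, worth ~10 pseudo-observations,
# updated with 3 failures out of 20 hypothetical wind-load tests
a, b, p_fail = update_failure_probability(2.0, 8.0, n_failed=3, n_survived=17)
```

The posterior spread (not just the mean) is what feeds fragility curves that separate sparse-data uncertainty from inherent load variability.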
- Award ID(s):
- 2101091
- PAR ID:
- 10532501
- Editor(s):
- Gaw, N; Pardalos, PM; Gahrooei, MR
- Publisher / Repository:
- Springer Optimization and Its Applications from Springer
- Date Published:
- Journal Name:
- Multimodal and Tensor Data Analytics for Industrial Systems Improvement. Springer Optimization and Its Applications
- Edition / Version:
- Springer Optimization and Its Applications
- Volume:
- 211
- ISBN:
- 978-3031530913
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Assessing the uncertainty associated with projections of climate change impacts on hydrological processes can be challenging due to multiple sources of uncertainty within and between climate and hydrological models. Here we compare the effects of parameter uncertainty in a hydrological model with the inter-model spread of climate projections on projections of urban streamflow under climate change. Hourly outputs from four climate models under the RCP8.5 scenario were used as inputs to a distributed hydrologic model (SWMM) calibrated with a Bayesian approach that summarizes uncertainty intervals for both model parameters and streamflow predictions. Continuous simulation of 100 years of streamflow generated 90% prediction intervals for selected exceedance probabilities and flood frequencies. Prediction intervals from single climate models were compared with the inter-climate-model spread resulting from a single calibration of the SWMM model. Future flows with exceedance probabilities of 0.5%-50%, and 2-year floods, increase for all climate projections and all 21st-century periods in the modeled Ohio (USA) watershed. Floods with return periods of ≥ 5 years increase relative to the historical period from mid-century (2046–2070) for most climate projections and parameter sets. Across the four climate models, the 90th-percentile increases in flows and floods range from 17-108% and 11-63%, respectively. Using multiple calibration parameter sets and climate projections helped capture the most likely hydrologic outcomes as well as upper and lower bounds on future predictions. For this watershed, hydrological-model parameter uncertainty was large relative to the inter-climate-model spread for near-term moderate to high flows and for many flood frequencies. The uncertainty quantification and comparison approach developed here may be helpful for decision-making and the design of engineering infrastructure in urban watersheds.
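The interval construction described above can be sketched as follows: given an ensemble of simulated annual-maximum flows (one series per calibrated parameter set or climate model), compute the empirical T-year flood for each member and report a 5th-95th percentile band. The synthetic Gumbel ensemble is illustrative only, not the study's SWMM output.

```python
import numpy as np

# Hedged sketch of ensemble-based prediction intervals for flood quantiles.
# The ensemble here is synthetic (Gumbel annual maxima), purely to make the
# example self-contained.
rng = np.random.default_rng(42)
n_members, n_years = 50, 100
annual_maxima = rng.gumbel(loc=100.0, scale=20.0, size=(n_members, n_years))

def t_year_flood(series, T):
    """Empirical T-year flood via the (1 - 1/T) quantile of annual maxima."""
    return np.quantile(series, 1.0 - 1.0 / T)

q2 = np.array([t_year_flood(m, 2) for m in annual_maxima])  # 2-year flood
lower, upper = np.percentile(q2, [5, 95])                   # 90% interval
```

Comparing the width of this band (parameter uncertainty) against the spread of single-calibration estimates across climate models gives the comparison made in the study.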
-
To support wind turbine reliability analysis when field data are scarce, aeroelastic simulators can generate stochastic wind turbine loads under prescribed turbulent wind conditions. However, simulating an extreme load associated with a small load exceedance probability is computationally prohibitive, and extreme load estimates from the crude Monte Carlo method carry very large uncertainty. We develop adaptive algorithms based on importance sampling theory to reduce the estimation uncertainty.
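The variance-reduction idea can be shown on a toy tail-probability problem (this is a textbook illustration of importance sampling, not the authors' adaptive algorithm): estimate P(X > t) for a standard normal X, where crude Monte Carlo with an affordable sample size sees almost no exceedances.

```python
import math
import random

# Sketch: crude Monte Carlo vs importance sampling for a small exceedance
# probability P(X > t), X ~ N(0, 1). Illustrative only.
def crude_mc(t, n, rng):
    hits = sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) > t)
    return hits / n

def importance_sampling(t, n, rng):
    # Proposal N(t, 1) centred on the failure region;
    # likelihood ratio N(0,1)/N(t,1) at x is exp(t^2/2 - t*x)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(t, 1.0)
        if x > t:
            total += math.exp(0.5 * t * t - t * x)
    return total / n

rng = random.Random(0)
p_mc = crude_mc(4.0, 20000, rng)           # usually 0 or a handful of hits
p_is = importance_sampling(4.0, 20000, rng)  # true P(X > 4) ≈ 3.17e-5
```

With the same budget, the importance-sampling estimate has a relative error of a few percent, while the crude estimate is dominated by whether any sample happened to exceed the threshold.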
-
Summary: Structural failure time models are causal models for estimating the effect of time-varying treatments on a survival outcome. G-estimation and artificial censoring have been proposed for estimating the model parameters in the presence of time-dependent confounding and administrative censoring. However, most existing methods require manually pre-processing the data into regularly spaced observations, which may invalidate the subsequent causal analysis. Moreover, computation and inference are challenging due to the nonsmoothness of artificial censoring. We propose a class of continuous-time structural failure time models that respects the continuous-time nature of the underlying data processes. Under a martingale condition of no unmeasured confounding, we show that the model parameters are identifiable from a potentially infinite number of estimating equations. Using semiparametric efficiency theory, we derive the first semiparametric doubly robust estimators, which are consistent if the model for the treatment process or the failure time model, but not necessarily both, is correctly specified. Moreover, we propose using inverse probability of censoring weighting to deal with dependent censoring. In contrast to artificial censoring, our weighting strategy does not introduce nonsmoothness in estimation and ensures that resampling methods can be used for inference.
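The inverse-probability-of-censoring-weighting (IPCW) idea mentioned above can be illustrated in a toy setting (this is the generic IPCW mechanism, not the paper's doubly robust estimator): with a known censoring distribution, each observed failure time is up-weighted by the inverse probability of remaining uncensored, which undoes the bias censoring would otherwise introduce.

```python
import math
import random

# Toy IPCW illustration with hypothetical exponential failure and censoring
# rates. Weighting each observed failure time t by 1 / P(C > t) recovers
# E[T] despite censoring.
rng = random.Random(1)
lam_t, lam_c = 1.0, 0.2        # failure rate, censoring rate (hypothetical)
n = 50000
total = 0.0
for _ in range(n):
    t = rng.expovariate(lam_t)  # true failure time, E[T] = 1 / lam_t = 1.0
    c = rng.expovariate(lam_c)  # censoring time
    if t <= c:                  # failure observed before censoring
        total += t / math.exp(-lam_c * t)   # weight = 1 / S_C(t)
ipcw_mean = total / n           # IPCW estimate of E[T]
```

In the paper's setting the censoring distribution is estimated rather than known, but the smoothness of the weights, in contrast to artificial censoring, is what makes resampling-based inference valid.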
-
Reliable probability estimation is of crucial importance in many real-world applications where there is inherent (aleatoric) uncertainty. Probability-estimation models are trained on observed outcomes (e.g. whether it has rained or not, or whether a patient has died or not), because the ground-truth probabilities of the events of interest are typically unknown. The problem is therefore analogous to binary classification, with the difference that the objective is to estimate probabilities rather than to predict the specific outcome. This work investigates probability estimation from high-dimensional data using deep neural networks. Several methods exist to improve the probabilities generated by these models, but they mostly focus on model (epistemic) uncertainty. For problems with inherent uncertainty, it is challenging to evaluate performance without access to ground-truth probabilities. To address this, we build a synthetic dataset to study and compare different computable metrics. We evaluate existing methods on the synthetic data as well as on three real-world probability estimation tasks, all of which involve inherent uncertainty: precipitation forecasting from radar images, predicting cancer patient survival from histopathology images, and predicting car crashes from dashcam videos. We also give a theoretical analysis of a model for high-dimensional probability estimation which reproduces several of the phenomena observed in our experiments. Finally, we propose a new method for probability estimation using neural networks, which modifies the training process to promote output probabilities that are consistent with empirical probabilities computed from the data. The method outperforms existing approaches on most metrics on the simulated as well as the real-world data.
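The consistency-with-empirical-frequencies idea can be made concrete with a simple binned calibration check (a standard expected-calibration-error computation, not the paper's proposed training method). Synthetic data with known ground-truth probabilities is used only to keep the example self-contained.

```python
import random

# Sketch: expected calibration error (ECE) compares average predicted
# probability to the empirical outcome frequency within probability bins.
# Synthetic, perfectly calibrated predictions are used for illustration.
rng = random.Random(0)
n, n_bins = 20000, 10
preds, outcomes = [], []
for _ in range(n):
    p = rng.random()                        # ground-truth probability
    preds.append(p)                         # a perfectly calibrated predictor
    outcomes.append(1 if rng.random() < p else 0)

def expected_calibration_error(preds, outcomes, n_bins):
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(preds, outcomes):
        bins[min(int(p * n_bins), n_bins - 1)].append((p, y))
    ece = 0.0
    for b in bins:
        if b:
            avg_p = sum(p for p, _ in b) / len(b)
            freq = sum(y for _, y in b) / len(b)
            ece += (len(b) / len(preds)) * abs(avg_p - freq)
    return ece

ece = expected_calibration_error(preds, outcomes, n_bins)
```

For a calibrated predictor the ECE shrinks toward zero as the sample grows; miscalibrated models show bins where predicted probability and observed frequency diverge.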