

Title: Dynamic Probabilistic Hazard Mapping in the Long Valley Volcanic Region, CA: Integrating Vent Opening Maps and Statistical Surrogates of Physical Models of Pyroclastic Density Currents
Abstract

Ideally, probabilistic hazard assessments combine available knowledge about the physical mechanisms of the hazard, data on past hazards, and any precursor information. Systematically assessing the probability of rare yet catastrophic hazards adds a layer of difficulty due to limited observational data. Via computer models, one can exercise potentially dangerous scenarios that may not have happened in the past but are probabilistically consistent with the aleatoric nature of previous volcanic behavior in the record. Traditional Monte Carlo-based methods for calculating such hazard probabilities suffer from two issues: they are computationally expensive, and they are static. In light of new information (newly available data, signs of unrest, or a new probabilistic analysis describing uncertainty about scenarios), the Monte Carlo calculation would need to be redone under the same computational constraints. Here we present an alternative approach utilizing statistical emulators that provides an efficient way to overcome the computational bottleneck of typical Monte Carlo approaches. Moreover, this approach is independent of any particular aleatoric scenario model, yet can be applied rapidly to any such model, making it dynamic. We present and apply this emulator-based approach to create multiple probabilistic hazard maps for inundation by pyroclastic density currents in the Long Valley Volcanic Region. Further, we illustrate how this approach enables an exploration of the impact of epistemic uncertainties on these probabilistic hazard forecasts. In particular, we focus on the uncertainty of vent opening models and on how that uncertainty, both aleatoric and epistemic, impacts the resulting probabilistic hazard maps of pyroclastic density current inundation.
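To make the emulator idea concrete, here is a minimal sketch (not the paper's implementation) of how a statistical surrogate decouples the expensive simulation from the scenario sampling. The `simulate_flow` stand-in, the two input parameters, the scenario distributions, and the 1 m depth threshold are all assumptions for illustration; any Gaussian-process library would serve in place of scikit-learn.

```python
# Minimal sketch of an emulator-based hazard calculation (illustrative only).
# `simulate_flow` stands in for an expensive PDC simulator; the inputs,
# kernel, and 1 m depth threshold are assumptions for this example.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def simulate_flow(x):
    """Toy stand-in for one expensive simulator run: flow depth (m) at a
    fixed map location given x = (log10 volume, vent distance in km)."""
    log_vol, dist = x
    return max(0.0, 2.0 * log_vol - 0.8 * dist + rng.normal(0.0, 0.05))

# 1) Design: a modest number of expensive simulator runs.
X_train = np.column_stack([rng.uniform(5.0, 8.0, 40),    # log10 volume
                           rng.uniform(1.0, 15.0, 40)])  # vent distance (km)
y_train = np.array([simulate_flow(x) for x in X_train])

# 2) Fit the statistical emulator (a Gaussian-process surrogate) once.
gp = GaussianProcessRegressor(ConstantKernel() * RBF([1.0, 5.0]),
                              normalize_y=True).fit(X_train, y_train)

# 3) Any aleatoric scenario model is now cheap to evaluate: draw scenarios,
#    query the surrogate, and estimate the inundation probability.
scenarios = np.column_stack([rng.normal(6.5, 0.5, 100_000),   # volume model
                             rng.gamma(4.0, 2.0, 100_000)])   # vent-distance model
depths = gp.predict(scenarios)
print("P(flow depth > 1 m) ~", (depths > 1.0).mean())
```

Only step 3 depends on the scenario model, so updating the vent-opening or volume distribution in light of unrest reuses the fitted surrogate rather than rerunning the simulator; this is what makes the resulting hazard maps dynamic.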

Award ID(s):
1821338
NSF-PAR ID:
10451590
Publisher / Repository:
DOI PREFIX: 10.1029
Date Published:
Journal Name:
Journal of Geophysical Research: Solid Earth
Volume:
124
Issue:
9
ISSN:
2169-9313
Page Range / eLocation ID:
p. 9600-9621
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Effective volcanic hazard management in regions where populations live in close proximity to persistent volcanic activity involves understanding the dynamic nature of hazards and the associated risk. Emphasis until now has been placed on identifying and forecasting the escalation phase of activity, in order to provide adequate warning of what might be to come. However, understanding eruption hiatus and post-eruption unrest hazards, and how to quantify residual hazard after the end of an eruption, is also important and often key to timely post-eruption recovery. Unfortunately, in many cases when the level of activity lessens, the hazards, although reduced, do not necessarily cease altogether. This is due both to the imprecision in determining the "end" of an eruptive phase and to the possibility that post-eruption hazardous processes may continue to occur. An example of the latter is continued dome collapse hazard from lava domes that have ceased to grow, or sector collapse of parts of volcanic edifices, including lava dome complexes. We present a new probabilistic model for forecasting pyroclastic density currents (PDCs) from lava dome collapse that takes into account the heavy-tailed distribution of the lengths of eruptive phases, the periods of quiescence, and the forecast window of interest. In the hazard analysis, we also consider probabilistic scenario models describing the flow's volume and initial direction. Further, with the use of statistical emulators, we combine these models with physics-based simulations of PDCs at Soufrière Hills Volcano to produce a series of probabilistic hazard maps for flow inundation over 5, 10, and 20 year periods. This assessment approach is the first of its kind for quantifying hazard during periods of diminished volcanic activity. As such, it offers evidence-based guidance for dome collapse hazards that can inform decision-making around provisions of access and reoccupation in areas around volcanoes that are becoming less active over time.
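A hedged sketch of the forecasting logic only, under stated assumptions: the Pareto tail index, collapse rate, and horizon below are illustrative placeholders, not the paper's fitted values. The point of the heavy tail is that a phase which has already lasted a long time is expected to last longer still, which the sketch propagates into a forecast-window probability.

```python
# Illustrative sketch: heavy-tailed (Pareto) eruptive-phase durations
# combined with a forecast window. All numbers are placeholders, not
# values fitted in the paper.
import numpy as np

rng = np.random.default_rng(1)
alpha, x_min = 1.2, 0.5        # assumed Pareto tail index and scale (years)
elapsed, window = 3.0, 5.0     # years since phase onset; forecast horizon
collapse_rate = 0.8            # assumed dome collapses per active year

# Pareto survival property: given the phase has lasted `elapsed` years,
# its total duration is again Pareto with scale max(x_min, elapsed).
total = (rng.pareto(alpha, 1_000_000) + 1.0) * max(x_min, elapsed)
active_years = np.clip(total - elapsed, 0.0, window)  # active time in window

# Probability of at least one collapse PDC, with Poisson collapses
# occurring only while the dome remains active.
p = 1.0 - np.exp(-collapse_rate * active_years)
print("P(>=1 dome-collapse PDC in 5 yr) ~", p.mean())
```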
  2. Abstract

    Accurate delineation of compound flood hazard requires joint simulation of rainfall-runoff and storm surge within high-resolution flood models, which may be computationally expensive. There is a need to supplement physical models with efficient, probabilistic methodologies for compound flood hazard assessment that can be applied under a range of climate and environmental conditions. Here we propose an extension to the joint probability optimal sampling method (JPM-OS), which has been widely used for storm surge assessment, and apply it to rainfall-surge compound hazard assessment under climate change at the catchment scale. We utilize thousands of synthetic tropical cyclones (TCs) and physics-based models to characterize storm surge and rainfall hazards at the coast. We then implement a Bayesian quadrature optimization approach (JPM-OS-BQ) to select a small number (~100) of storms, which are simulated within a high-resolution flood model to characterize the compound flood hazard. We show that the limited JPM-OS-BQ simulations capture historical flood return levels to within 0.25 m of a high-fidelity Monte Carlo approach. We find that the combined impact of 2100 sea-level rise (SLR) and changes in TC climatology will increase the 100-year flood extent in the Cape Fear Estuary, NC, by 27% and the inundation volume by 62%. Moreover, we show that probabilistic incorporation of SLR in the JPM-OS-BQ framework leads to different 100-year flood maps than using a single mean SLR projection. Our framework can be applied to catchments across the United States Atlantic and Gulf coasts under a variety of climate and environmental scenarios.
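The storm-selection step can be illustrated with a generic kernel-herding scheme, a deliberate simplification of the Bayesian quadrature optimization the abstract describes; the storm parameters, kernel, and ensemble sizes below are assumptions for this sketch.

```python
# Illustrative simplification of the storm-selection step: generic kernel
# herding rather than the JPM-OS-BQ algorithm itself. Storm parameters,
# kernel, and ensemble sizes are assumptions for this sketch.
import numpy as np

rng = np.random.default_rng(2)
n = 2000
storms = np.column_stack([rng.gamma(2.0, 15.0, n),    # pressure deficit (hPa)
                          rng.uniform(5.0, 40.0, n),  # radius of max winds (km)
                          rng.normal(0.0, 1.0, n)])   # track offset (scaled)
storms = (storms - storms.mean(0)) / storms.std(0)    # standardize

def gauss_kernel(a, b):
    """Gaussian similarity between rows of a and rows of b."""
    d2 = (a**2).sum(1)[:, None] + (b**2).sum(1)[None, :] - 2.0 * a @ b.T
    return np.exp(-0.5 * d2)

K = gauss_kernel(storms, storms)   # pairwise similarity of the full set
mean_embed = K.mean(1)             # similarity of each storm to the population

# Greedy herding: each pick is maximally representative of the population
# while penalizing redundancy with storms already chosen.
chosen = []
for _ in range(100):
    penalty = K[:, chosen].sum(1) / (len(chosen) + 1) if chosen else 0.0
    scores = mean_embed - penalty
    scores[chosen] = -np.inf       # never pick the same storm twice
    chosen.append(int(scores.argmax()))

print("storms to run in the high-resolution flood model:", chosen[:10], "...")
```

The 100 selected storms, with appropriate weights, then stand in for the full synthetic ensemble inside the expensive high-resolution flood model.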

  3. Each year, a growing number of wind farms are added to power grids to generate sustainable energy. The power curve of a wind turbine, which describes the relationship between generated power and wind speed, plays a major role in assessing the performance of a wind farm. Neural networks have been used for power curve estimation. However, they do not produce a confidence measure for their output unless computationally prohibitive Bayesian methods are used. In this paper, a probabilistic neural network with Monte Carlo dropout is considered to quantify the model (epistemic) uncertainty of the power curve estimation. This approach offers a minimal increase in computational complexity and thus in evaluation time. Furthermore, by adding a probabilistic loss function, the noise (aleatoric) uncertainty in the data is estimated. The developed network captures both model and noise uncertainty, both of which prove useful in assessing performance. The developed network is also compared with existing ones on a public-domain dataset, showing superior prediction accuracy. The results indicate that the developed network provides uncertainty quantification while maintaining accurate power estimation.
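A minimal sketch of the two uncertainty channels, assuming a PyTorch implementation with illustrative layer sizes and dropout rate (the paper's architecture is not reproduced here):

```python
# Sketch of MC-dropout with a heteroscedastic head, assuming PyTorch;
# layer sizes and dropout rate are illustrative, not the paper's.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Dropout(0.1),
                    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(0.1),
                    nn.Linear(64, 2))            # outputs [mean, log-variance]

def gaussian_nll(pred, y):
    """Probabilistic loss: negative log-likelihood of a Gaussian whose
    mean and variance are both predicted (captures aleatoric noise)."""
    mu, log_var = pred[:, :1], pred[:, 1:]
    return (0.5 * (log_var + (y - mu) ** 2 / log_var.exp())).mean()

# ... training on (wind speed, power) pairs with gaussian_nll elided ...

def predict(wind_speed, n_passes=50):
    net.train()                                  # keep dropout ON at test time
    with torch.no_grad():
        out = torch.stack([net(wind_speed) for _ in range(n_passes)])
    mu, log_var = out[..., :1], out[..., 1:]
    epistemic = mu.var(dim=0)                    # spread across dropout masks
    aleatoric = log_var.exp().mean(dim=0)        # average predicted noise
    return mu.mean(dim=0), epistemic, aleatoric

mean, epi, alea = predict(torch.linspace(0.0, 25.0, 100).unsqueeze(1))
```

Keeping dropout active at prediction time means each of the n_passes forward passes uses a different mask, so the spread of the predicted means estimates the epistemic uncertainty, while the learned variance head estimates the aleatoric noise.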
  4. Abstract Debris flows pose a significant hazard to communities in mountainous areas, and there is a continued need for methods to delineate hazard zones associated with debris-flow inundation. In certain situations, such as scenarios following wildfire, where there could be an abrupt increase in the likelihood and size of debris flows that necessitates a rapid hazard assessment, the computational demands of inundation models play a role in their utility. The inability to efficiently determine the downstream effects of anticipated debris-flow events remains a critical gap in our ability to understand, mitigate, and assess debris-flow hazards. To better understand the downstream effects of debris flows, we introduce a computationally efficient, reduced-complexity inundation model, which we refer to as the Progressive Debris-Flow routing and inundation model (ProDF). We calibrate ProDF against mapped inundation from five watersheds near Montecito, CA, that produced debris flows shortly after the 2017 Thomas Fire. ProDF reproduced 70% of mapped deposits across a 40 km² study area. While this study focuses on a series of post-wildfire debris flows, ProDF is not limited to simulating debris-flow inundation following wildfire and could be applied to any scenario where it is possible to estimate a debris-flow volume. However, given its ability to reproduce mapped debris-flow deposits downstream of the 2017 Thomas Fire burn scar, and the modest run time associated with a simulation over this 40 km² study area, results suggest ProDF may be particularly promising for post-wildfire hazard assessment applications.
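For intuition, the sketch below routes an assumed volume downslope on a toy grid with a fixed deposition fraction per cell. This is a generic reduced-complexity scheme in the spirit of ProDF, not the published algorithm, and every number in it is a placeholder.

```python
# Generic reduced-complexity routing sketch in the spirit of ProDF (NOT the
# published algorithm): an assumed volume moves downslope cell by cell,
# depositing a fixed fraction until exhausted. All numbers are placeholders.
import numpy as np

rng = np.random.default_rng(3)
ny, nx = 60, 60
dem = np.add.outer(np.linspace(50.0, 0.0, ny), np.zeros(nx))  # planar slope
dem += rng.normal(0.0, 0.3, (ny, nx))                          # surface roughness

volume, deposit_frac = 1.0e5, 0.08   # initial volume (m^3); fraction dropped per cell
deposit = np.zeros((ny, nx))
i, j = 0, nx // 2                    # start at an assumed channel head

while volume > 1.0 and i < ny - 1:
    deposit[i, j] += deposit_frac * volume
    volume *= 1.0 - deposit_frac
    # steepest-descent step: lowest of the three downslope neighbors
    cols = [c for c in (j - 1, j, j + 1) if 0 <= c < nx]
    j = cols[int(np.argmin([dem[i + 1, c] for c in cols]))]
    i += 1

print(f"inundated cells: {(deposit > 0).sum()}, "
      f"deposited volume: {deposit.sum():.0f} m^3")
```

Because each run is a single cheap sweep over the grid rather than a full momentum-conserving simulation, this class of model can be rerun for many candidate volumes during a rapid post-wildfire assessment.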
  5. Probabilistic hazard assessments for studying overland pyroclastic flows or atmospheric ash clouds under the short timelines of an evolving crisis require using the best science available, unhampered by complicated and slow manual workflows. Although deterministic mathematical models are available, in most cases the parameters and initial conditions for the equations are known only within a prescribed range of uncertainty. For the construction of probabilistic hazard assessments, accurate outputs and propagation of the inherent input uncertainty to quantities of interest are needed to estimate the necessary probabilities from numerous runs of the underlying deterministic model. Characterizing the uncertainty in system states due to parametric and input uncertainty simultaneously requires ensemble-based methods to explore the full parameter and input spaces. Complex tasks, such as running thousands of instances of a deterministic model under parameter and input uncertainty, require a high-performance computing infrastructure and skilled personnel that may not be readily available to the policy makers responsible for making informed risk-mitigation decisions. For efficiency, the programming tasks required to execute ensemble simulations need to run in parallel, leading to the twin computational challenges of managing large amounts of data and performing CPU-intensive processing. The resulting flow of work requires complex sequences of tasks, interactions, and exchanges of data, so the automatic management of these workflows is essential. Here we discuss a computing infrastructure, methodology, and tools that enable scientists and other members of the volcanology research community to develop workflows for constructing probabilistic hazard maps using remotely accessed computing through a web portal.
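The fan-out/gather pattern at the heart of such ensemble workflows can be sketched locally; a real portal would hand this to managed HPC workflow tooling, and `run_simulation` here is a hypothetical stand-in for one deterministic model run.

```python
# Sketch of the fan-out/gather pattern only; a real portal would hand this
# to managed HPC workflow tooling. `run_simulation` is a hypothetical
# stand-in for one deterministic model run.
from concurrent.futures import ProcessPoolExecutor
import random

def run_simulation(params):
    """Placeholder for one expensive deterministic run (e.g., an ash-cloud
    solver) returning a quantity of interest for these sampled inputs."""
    volume, direction = params
    return volume * random.Random(hash(params)).uniform(0.5, 1.5)

def main():
    rng = random.Random(4)
    # Ensemble design: sample the uncertain inputs (placeholder ranges).
    ensemble = [(10 ** rng.uniform(5, 8), rng.uniform(0, 360))
                for _ in range(1000)]
    # Parallel fan-out of the ensemble, then gather the outputs.
    with ProcessPoolExecutor() as pool:
        qoi = list(pool.map(run_simulation, ensemble, chunksize=25))
    # Reduce to the hazard probability of interest.
    threshold = 1.0e7
    print("P(QoI > threshold) ~", sum(q > threshold for q in qoi) / len(qoi))

if __name__ == "__main__":
    main()
```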
