Title: Dynamic Probabilistic Hazard Mapping in the Long Valley Volcanic Region, CA: Integrating Vent Opening Maps and Statistical Surrogates of Physical Models of Pyroclastic Density Currents
Abstract: Ideally, probabilistic hazard assessments combine available knowledge about physical mechanisms of the hazard, data on past hazards, and any precursor information. Systematically assessing the probability of rare yet catastrophic hazards adds a layer of difficulty due to limited observational data. Via computer models, one can exercise potentially dangerous scenarios that may not have happened in the past but are probabilistically consistent with the aleatoric nature of previous volcanic behavior in the record. Traditional Monte Carlo-based methods to calculate such hazard probabilities suffer from two issues: they are computationally expensive, and they are static. In light of new information, newly available data, signs of unrest, or a new probabilistic analysis describing uncertainty about scenarios, the Monte Carlo calculation would need to be redone under the same computational constraints. Here we present an alternative approach utilizing statistical emulators that provide an efficient way to overcome the computational bottleneck of typical Monte Carlo approaches. Moreover, this approach is independent of any particular aleatoric scenario model and yet can be applied rapidly to any scenario model, making it dynamic. We present and apply this emulator-based approach to create multiple probabilistic hazard maps for inundation by pyroclastic density currents in the Long Valley Volcanic Region. Further, we illustrate how this approach enables an exploration of the impact of epistemic uncertainties on these probabilistic hazard forecasts. In particular, we focus on the uncertainty of vent opening models and how that uncertainty, both aleatoric and epistemic, impacts the resulting probabilistic hazard maps of pyroclastic density current inundation.
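To make the emulator idea above concrete, here is a minimal sketch in Python: a Gaussian process surrogate (scikit-learn's GaussianProcessRegressor) stands in for the statistical emulator, and a toy vent-opening and volume scenario model is integrated over it to estimate an inundation probability at a single map cell. The design points, stand-in depth values, and scenario distributions are hypothetical placeholders, not the paper's actual simulator runs or fitted vent-opening maps.

```python
# Minimal sketch: emulator-based inundation probability at one map location.
# All inputs and the stand-in "simulator" output below are illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Design points for the (expensive) physical simulator: (vent_x, vent_y, log_volume)
# and the simulated maximum flow depth (m) at the map cell of interest.
X_design = rng.uniform([0.0, 0.0, 6.0], [10.0, 10.0, 8.0], size=(50, 3))
y_depth = (np.exp(-0.1 * np.linalg.norm(X_design[:, :2] - 5.0, axis=1))
           * (X_design[:, 2] - 6.0))          # stand-in for simulator output

# Fit the statistical emulator (a GP here; the paper's surrogate may differ).
gp = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=[2.0, 2.0, 0.5]),
                              normalize_y=True).fit(X_design, y_depth)

def inundation_probability(scenario_sampler, threshold=1.0, n=100_000):
    """P(depth > threshold) under a given aleatoric scenario model.

    scenario_sampler(n) returns an (n, 3) array of (vent_x, vent_y, log_volume)
    draws; swapping in a new vent-opening model needs no new simulator runs.
    """
    scenarios = scenario_sampler(n)
    depth = gp.predict(scenarios)              # cheap emulator evaluations
    return float(np.mean(depth > threshold))

# Example scenario model: Gaussian vent-opening map, uniform log-volume.
def vent_model(n):
    vents = rng.normal(loc=[5.0, 5.0], scale=1.5, size=(n, 2))
    log_vol = rng.uniform(6.0, 8.0, size=(n, 1))
    return np.hstack([vents, log_vol])

print(inundation_probability(vent_model))
```

The point of this construction is that changing the vent-opening model only requires re-running the cheap emulator-based Monte Carlo, not the flow simulator, which is what makes the resulting hazard map "dynamic".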
Award ID(s):
1821338
PAR ID:
10451590
Author(s) / Creator(s):
 ;  ;  ;  ;  
Publisher / Repository:
DOI PREFIX: 10.1029
Date Published:
Journal Name:
Journal of Geophysical Research: Solid Earth
Volume:
124
Issue:
9
ISSN:
2169-9313
Page Range / eLocation ID:
p. 9600-9621
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Effective volcanic hazard management in regions where populations live in close proximity to persistent volcanic activity involves understanding the dynamic nature of hazards and the associated risk. Emphasis until now has been placed on identifying and forecasting the escalation phase of activity, in order to provide adequate warning of what might be to come. However, understanding eruption hiatus and post-eruption unrest hazards, or how to quantify residual hazard after the end of an eruption, is also important and often key to timely post-eruption recovery. Unfortunately, in many cases when the level of activity lessens, the hazards, although reduced, do not necessarily cease altogether. This is due both to the imprecise nature of determining the "end" of an eruptive phase and to the possibility that post-eruption hazardous processes may continue to occur. An example of the latter is continued dome collapse hazard from lava domes which have ceased to grow, or sector collapse of parts of volcanic edifices, including lava dome complexes. We present a new probabilistic model for forecasting pyroclastic density currents (PDCs) from lava dome collapse that takes into account the heavy-tailed distribution of the lengths of eruptive phases, the periods of quiescence, and the forecast window of interest. In the hazard analysis, we also consider probabilistic scenario models describing the flow's volume and initial direction. Further, with the use of statistical emulators, we combine these models with physics-based simulations of PDCs at Soufrière Hills Volcano to produce a series of probabilistic hazard maps for flow inundation over 5, 10, and 20 year periods. The development and application of this assessment approach is the first of its kind for the quantification of hazard during periods of diminished volcanic activity. As such, it offers evidence-based guidance for dome collapse hazards that can be used to inform decision-making around provisions of access and reoccupation in areas around volcanoes that are becoming less active over time.
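A rough sketch of the kind of calculation item 1 describes: a Monte Carlo over an alternating repose/active-phase renewal process with a heavy-tailed (Pareto) repose distribution, giving the probability of at least one collapse-generating phase inside a forecast window. The distributional form, parameter values, and per-phase collapse probability are illustrative assumptions, not the fitted Soufrière Hills model.

```python
# Minimal sketch: probability of at least one dome-collapse PDC within a
# forecast window, under heavy-tailed repose durations (all values made up).
import numpy as np

rng = np.random.default_rng(1)

def p_collapse_in_window(window_years, n_sim=50_000,
                         repose_shape=1.2, repose_scale=0.5,
                         p_collapse_per_phase=0.6):
    """Monte Carlo over a simple alternating repose/active-phase process."""
    hits = 0
    for _ in range(n_sim):
        t, collapsed = 0.0, False
        while t < window_years:
            # Heavy-tailed repose before the next active phase (classical Pareto).
            t += repose_scale * (rng.pareto(repose_shape) + 1.0)
            if t < window_years and rng.random() < p_collapse_per_phase:
                collapsed = True
                break
        hits += collapsed
    return hits / n_sim

for w in (5, 10, 20):
    print(w, "yr:", round(p_collapse_in_window(w), 3))
```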
  2. Each year a growing number of wind farms are added to power grids to generate sustainable energy. The power curve of a wind turbine, which exhibits the relationship between generated power and wind speed, plays a major role in assessing the performance of a wind farm. Neural networks have been used for power curve estimation. However, they do not produce a confidence measure for their output, unless computationally prohibitive Bayesian methods are used. In this paper, a probabilistic neural network with Monte Carlo dropout is considered to quantify the model, or epistemic, uncertainty of the power curve estimation. This approach offers a minimal increase in computational complexity and thus evaluation time. Furthermore, by adding a probabilistic loss function, the noise, or aleatoric, uncertainty in the data is estimated. The developed network captures both model and noise uncertainty, which are found to be useful in assessing performance. The developed network is also compared with existing ones on a public-domain dataset, showing superior prediction accuracy. The results obtained indicate that the developed network provides a quantification of uncertainty while maintaining accurate power estimation.
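A minimal PyTorch sketch of the two ingredients item 2 combines: Monte Carlo dropout for model (epistemic) uncertainty and a Gaussian negative log-likelihood head for noise (aleatoric) uncertainty. The architecture, layer sizes, and training details are placeholders, not the network used in the paper.

```python
# Minimal sketch: power-curve network with MC dropout and a heteroscedastic head.
import torch
import torch.nn as nn

class PowerCurveNet(nn.Module):
    def __init__(self, p_drop=0.1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p_drop),
        )
        self.mean_head = nn.Linear(64, 1)
        self.logvar_head = nn.Linear(64, 1)    # heteroscedastic noise variance

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), self.logvar_head(h)

def gaussian_nll(mean, logvar, y):
    """Probabilistic loss used during training to learn the noise term."""
    return 0.5 * (logvar + (y - mean) ** 2 / logvar.exp()).mean()

@torch.no_grad()
def mc_dropout_predict(model, x, n_samples=100):
    """Keep dropout active at test time and split the two uncertainty sources."""
    model.train()                               # enables dropout layers
    means, variances = [], []
    for _ in range(n_samples):
        m, lv = model(x)
        means.append(m)
        variances.append(lv.exp())
    means = torch.stack(means)                  # (n_samples, batch, 1)
    epistemic = means.var(dim=0)                # spread of the mean predictions
    aleatoric = torch.stack(variances).mean(dim=0)
    return means.mean(dim=0), epistemic, aleatoric
```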
  3. Accurate delineation of compound flood hazard requires joint simulation of rainfall-runoff and storm surges within high-resolution flood models, which may be computationally expensive. There is a need for supplementing physical models with efficient, probabilistic methodologies for compound flood hazard assessment that can be applied under a range of climate and environmental conditions. Here we propose an extension to the joint probability optimal sampling method (JPM-OS), which has been widely used for storm surge assessment, and apply it to rainfall-surge compound hazard assessment under climate change at the catchment scale. We utilize thousands of synthetic tropical cyclones (TCs) and physics-based models to characterize storm surge and rainfall hazards at the coast. Then we implement a Bayesian quadrature optimization approach (JPM-OS-BQ) to select a small number (~100) of storms, which are simulated within a high-resolution flood model to characterize the compound flood hazard. We show that the limited JPM-OS-BQ simulations can capture historical flood return levels within 0.25 m of a high-fidelity Monte Carlo approach. We find that the combined impact of 2100 sea-level rise (SLR) and changes in TC climatology on flood hazard in the Cape Fear Estuary, NC, will increase the 100-year flood extent by 27% and increase inundation volume by 62%. Moreover, we show that probabilistic incorporation of SLR in the JPM-OS-BQ framework leads to different 100-year flood maps compared to using a single mean SLR projection. Our framework can be applied to catchments across the United States Atlantic and Gulf coasts under a variety of climate and environment scenarios.
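The storm-selection optimization in item 3 is beyond a short example, but the downstream step, turning a small weighted storm set into return levels, can be sketched: annual exceedance rates are accumulated from per-storm probability masses and compared against 1/T. The arrival rate, weights, and simulated flood peaks below are made-up placeholders, and the weighting scheme is only an assumption about how a JPM-style calculation proceeds, not the paper's exact formulation.

```python
# Minimal sketch: return levels from a small set of weighted storm simulations.
import numpy as np

rng = np.random.default_rng(2)

annual_tc_rate = 0.35                        # storms per year affecting the site
n_storms = 100                               # optimally selected storm set
weights = rng.dirichlet(np.ones(n_storms))   # probability mass per storm
flood_peak = rng.gamma(2.0, 0.8, n_storms)   # simulated peak water level (m)

def return_level(T):
    """Smallest level whose annual exceedance rate is at most 1/T."""
    levels = np.sort(flood_peak)
    # Annual exceedance rate of each candidate level.
    rate = np.array([annual_tc_rate * weights[flood_peak > z].sum() for z in levels])
    ok = levels[rate <= 1.0 / T]
    return ok[0] if ok.size else levels[-1]

print("100-year flood level ~", round(float(return_level(100)), 2), "m")
```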
  4. Estimating and disentangling epistemic uncertainty (uncertainty that is reducible with more training data) and aleatoric uncertainty (uncertainty that is inherent to the task at hand) is critically important when applying machine learning to high-stakes applications such as medical imaging and weather forecasting. Conditional diffusion models' breakthrough ability to accurately and efficiently sample from the posterior distribution of a dataset now makes uncertainty estimation conceptually straightforward: one need only train and sample from a large ensemble of diffusion models. Unfortunately, training such an ensemble becomes computationally intractable as the complexity of the model architecture grows. In this work we introduce a new approach to ensembling, hyper-diffusion models (HyperDM), which allows one to accurately estimate both epistemic and aleatoric uncertainty with a single model. Unlike existing single-model uncertainty methods such as Monte Carlo dropout and Bayesian neural networks, HyperDM offers prediction accuracy on par with, and in some cases superior to, multi-model ensembles. Furthermore, the proposed approach scales to modern network architectures such as Attention U-Net and yields more accurate uncertainty estimates than existing methods. We validate our method on two distinct real-world tasks: X-ray computed tomography reconstruction and weather temperature forecasting. Source code is publicly available at https://github.com/matthewachan/hyperdm.
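Independent of the hypernetwork machinery, the uncertainty split item 4 relies on is the standard ensemble decomposition: epistemic uncertainty as disagreement between ensemble members' posterior means, aleatoric uncertainty as the average within-member spread. The sketch below shows only that bookkeeping, with a placeholder sampling function standing in for HyperDM's generated diffusion-model weights.

```python
# Minimal sketch: ensemble-style uncertainty decomposition from posterior samples.
import numpy as np

rng = np.random.default_rng(3)

def posterior_samples(member_id, measurement, n_samples=64):
    """Placeholder for 'sample from ensemble member #member_id's posterior'."""
    noise = rng.normal(0.0, 0.2, size=(n_samples,) + measurement.shape)
    return measurement + 0.1 * member_id + noise

measurement = np.zeros((8, 8))                # tiny stand-in for a measurement
samples = np.stack([posterior_samples(m, measurement) for m in range(10)])
# samples shape: (n_members, n_samples, H, W)

member_means = samples.mean(axis=1)           # per-member posterior means
epistemic = member_means.var(axis=0)          # disagreement between members
aleatoric = samples.var(axis=1).mean(axis=0)  # average within-member spread

print(float(epistemic.mean()), float(aleatoric.mean()))
```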
  5. Uncertainty decomposition refers to the task of decomposing the total uncertainty of a predictive model into aleatoric (data) uncertainty, resulting from inherent randomness in the data-generating process, and epistemic (model) uncertainty, resulting from missing information in the model's training data. In large language models (LLMs) specifically, identifying sources of uncertainty is an important step toward improving reliability, trustworthiness, and interpretability, but it remains an open research question. In this paper, we introduce an uncertainty decomposition framework for LLMs, called input clarification ensembling, which can be applied to any pre-trained LLM. Our approach generates a set of clarifications for the input, feeds them into an LLM, and ensembles the corresponding predictions. We show that, when aleatoric uncertainty arises from ambiguity or under-specification in LLM inputs, this approach makes it possible to factor an (un-clarified) LLM's predictions into separate aleatoric and epistemic terms, using a decomposition similar to the one employed by Bayesian neural networks. Empirical evaluations demonstrate that input clarification ensembling provides accurate and reliable uncertainty quantification on several language processing tasks.
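A small sketch of the ensembling step item 5 describes: predictive distributions are gathered over several clarified versions of an ambiguous input and decomposed with the usual entropy identities. The clarification and prediction calls are placeholders for LLM queries, and the attribution of the two terms follows the abstract's Bayesian-neural-network analogy; the paper's exact formulation may differ.

```python
# Minimal sketch: entropy-based decomposition over an ensemble of clarified inputs.
import numpy as np

def entropy(p):
    p = np.clip(p, 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=-1)

def decompose(clarified_probs):
    """clarified_probs: (n_clarifications, n_classes) predictive distributions,
    one per clarified version of the ambiguous input."""
    mean_p = clarified_probs.mean(axis=0)
    total = entropy(mean_p)                           # uncertainty of the ensemble
    per_clarification = entropy(clarified_probs).mean()  # left after clarifying
    return {"total": total,
            "epistemic": per_clarification,           # model uncertainty
            "aleatoric": total - per_clarification}   # input ambiguity

# Toy example: an ambiguous yes/no question whose two clarifications give
# confident but opposite answers, so most uncertainty is attributed as aleatoric.
probs = np.array([[0.9, 0.1],    # clarification 1 -> confident "yes"
                  [0.1, 0.9]])   # clarification 2 -> confident "no"
print(decompose(probs))
```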