Title: Novel statistical emulator construction for volcanic ash transport model Ash3d with physically motivated measures
Statistical emulators are a key tool for rapidly producing probabilistic hazard analyses of geophysical processes. Given output data computed for a relatively small number of parameter inputs, an emulator interpolates the data, providing the expected value of the output at untried inputs together with an estimate of the error at that point. In this work, we fit Gaussian Process emulators to the output from a volcanic ash transport model, Ash3d. Our goal is to use the emulator to predict the simulated volcanic ash thickness from Ash3d at a location of interest. Our approach is motivated by two challenges in fitting emulators: characterizing the input wind field and capturing interactions between that wind field and variable grain sizes. We resolve these challenges by using physical knowledge of tephra dispersal. We propose new physically motivated variables as inputs and use normalized output as the response for fitting the emulator. Subsetting based on the initial conditions is also critical in our emulator construction. Simulation studies characterize the accuracy and efficiency of our emulator construction and also reveal its current limitations. Our work represents the first emulator construction for volcanic ash transport models that accounts for the simulated physical process.
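As a rough illustration of this kind of emulator, the sketch below fits a Gaussian Process to mock simulator output with scikit-learn. The three input columns (standing in for the paper's physically motivated variables) and the simple standardization used as "normalized output" are assumptions for illustration, not the paper's actual construction.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Design matrix: n simulator runs x d inputs. The three columns stand in
# for hypothetical physically motivated quantities (e.g., a wind
# projection, a grain-size fraction, a log eruption volume).
X_train = rng.uniform(size=(50, 3))
y_raw = rng.uniform(size=50)          # placeholder Ash3d ash thicknesses

# Use a normalized response, in the spirit of the paper (here a simple
# standardization serves as a stand-in for the paper's normalization).
y_mean, y_std = y_raw.mean(), y_raw.std()
y_train = (y_raw - y_mean) / y_std

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.5, 0.5, 0.5])
gp = GaussianProcessRegressor(kernel=kernel).fit(X_train, y_train)

# Expected value and uncertainty at an untried input, mapped back to
# the original thickness scale.
X_new = rng.uniform(size=(1, 3))
mu, sigma = gp.predict(X_new, return_std=True)
print(mu[0] * y_std + y_mean, sigma[0] * y_std)
```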
Award ID(s): 1821311, 1821338
NSF-PAR ID: 10284891
Journal Name: Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
Volume: 476
Issue: 2242
ISSN: 1364-5021
Page Range / eLocation ID: 20200161
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Observational estimates of Antarctic ice loss have accelerated in recent decades, and worst-case scenarios of modeling studies have suggested potentially catastrophic sea-level rise (~2 meters) by the end of the century. However, modeled contributions to global mean sea level from the Antarctic ice sheet (AIS) in the 21st century are highly uncertain, in part because ice-sheet model parameters are poorly constrained. Individual ice-sheet model runs are also deterministic and not computationally efficient enough to generate the continuous probability distributions required for incorporation into a holistic framework of probabilistic sea-level projections. To address these shortfalls, we statistically emulate an ice-sheet model using Gaussian Process (GP) regression. GP modeling is a non-parametric machine-learning technique that maps inputs (e.g. forcing or model parameters) to target outputs (e.g. sea-level contributions from the Antarctic ice sheet) and has the inherent and important advantage that emulator uncertainty is explicitly quantified. We construct emulators for the last interglacial period and an RCP8.5 scenario, and separately for the western, eastern, and total AIS. Separate emulation of the western and eastern AIS is important because their evolutions and physical responses to climate forcing are distinct. The emulators are trained on 196 ensemble members for each scenario, constructed by varying two parameters: the maximum rate of ice-cliff wastage and the coefficient of hydrofracturing. We condition the emulators on last interglacial proxy sea-level records and modern GRACE measurements and exclude poor-fitting ensemble members. The resulting emulators are sampled to produce probability distributions that fill intermediate gaps between discrete ice-sheet model outcomes. We invert emulated high- and low-probability sea-level contributions in 2100 to explore 21st-century evolution pathways; results highlight the deep uncertainty of ice-sheet model physics and the importance of using observations to narrow the range of parameters. Our approach is designed to be flexible such that other ice-sheet models or parameter spaces may be substituted and explored with the emulator.
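    A hedged sketch of this workflow follows: train a GP on ensemble members, screen out poor-fitting members against an observational constraint, and sample the emulator to obtain a continuous distribution. The synthetic ensemble, misfit tolerance, and Matern kernel are illustrative assumptions, not the study's settings.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(1)

# 196 mock ensemble members: 2 parameters (stand-ins for cliff-wastage
# rate and hydrofracturing coefficient) -> sea-level contribution (m).
params = rng.uniform(size=(196, 2))
slr = 1.0 + params @ np.array([0.8, 0.4]) + 0.05 * rng.normal(size=196)

# "Condition on observations": drop members whose misfit to a mock
# observational constraint exceeds a tolerance, then train the emulator.
obs, tol = 1.6, 0.5
keep = np.abs(slr - obs) < tol
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(params[keep], slr[keep])

# Sample the emulator densely over parameter space to fill the gaps
# between discrete ice-sheet model runs with a continuous distribution.
grid = rng.uniform(size=(10_000, 2))
mu, sd = gp.predict(grid, return_std=True)
draws = rng.normal(mu, sd)
print(np.percentile(draws, [5, 50, 95]))
```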
  2.
    Recent years have seen the introduction of large-scale platforms for experimental wireless research. These platforms, which include testbeds like those of the PAWR program and emulators like Colosseum, allow researchers to prototype and test their solutions in a sound yet realistic wireless environment before actual deployment. Emulators, in particular, enable wireless experiments that are not site-specific, unlike those on real testbeds. Researchers can choose among different radio frequency (RF) scenarios for real-time emulation of a vast variety of situations, with different numbers of users, RF bandwidths, antenna counts, hardware requirements, etc. Although emulators are very powerful, in that they can emulate virtually any real-world deployment, emulated scenarios are only as useful as how accurately they capture the targeted wireless channel and environment. Achieving emulation accuracy is particularly challenging, especially for experiments at scale, for which emulators require considerable computational resources. In this paper we propose a framework to create RF scenarios for emulators like Colosseum from rich forms of input, such as measurements taken with radio equipment or data generated via software (e.g., ray-tracers and electromagnetic field solvers). Our framework optimally scales down the large set of input RF data to the fewer parameters allowed by the emulator by using efficient clustering techniques and channel impulse response re-sampling. We showcase our method by generating wireless scenarios for Colosseum using Remcom's Wireless InSite, a commercial-grade ray-tracer that produces key characteristics of the wireless channel. Examples are provided for line-of-sight and non-line-of-sight scenarios on portions of the Northeastern University main campus.
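    As a loose illustration of the scaling-down step, the sketch below clusters mock channel impulse responses (CIRs) with k-means and trims each cluster representative to a small tap budget. The data shapes, the 16-cluster/4-tap numbers, and the strongest-taps truncation are illustrative assumptions, not Colosseum's actual limits or the paper's re-sampling method.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Mock ray-traced CIRs: one complex impulse response per radio link.
n_links, n_taps = 5000, 64
cirs = rng.normal(size=(n_links, n_taps)) + 1j * rng.normal(size=(n_links, n_taps))

# Cluster on a real-valued feature vector (here: tap magnitudes).
features = np.abs(cirs)
km = KMeans(n_clusters=16, n_init=10, random_state=0).fit(features)

# One representative CIR per cluster: the member closest to the centroid.
reps = np.array([
    cirs[np.argmin(np.linalg.norm(features - c, axis=1))]
    for c in km.cluster_centers_
])

# Scale each representative down to the emulator's tap budget by keeping
# its strongest taps (a crude stand-in for proper CIR re-sampling).
budget = 4
idx = np.argsort(np.abs(reps), axis=1)[:, -budget:]
scaled = np.take_along_axis(reps, np.sort(idx, axis=1), axis=1)
print(scaled.shape)   # (16, 4): 16 emulated scenarios x 4 taps each
```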
  3. Abstract. Energy 3D printing processes have enabled energy storage devices with complex structures, high energy density, and high power density. Among these processes, Freeze Nano Printing (FNP) has risen as a promising process. However, quality problems are among the biggest barriers for FNP. In particular, the droplet solidification time in FNP governs the thermal distribution, and subsequently determines product solidification, formation, and quality. To describe the solidification time, a physics-based heat transfer model is built, but it is computationally intensive. The objective of this work is to build an efficient emulator for the physical model. Several challenges remain unaddressed: 1) the solidification time at various locations, which is a multi-dimensional array response, needs to be modeled; 2) the construction and evaluation of the emulator at new process settings need to be quick and accurate. We integrate joint tensor decomposition and the Nearest Neighbor Gaussian Process (NNGP) to construct an efficient multi-dimensional array response emulator with process settings as inputs. Specifically, structured joint tensor decomposition decomposes the multi-dimensional array responses at various process settings into setting-specific core tensors and shared low-dimensional factorization matrices. Then, each independent entry of the core tensor is modeled with an NNGP, which addresses the computationally intensive model estimation problem by restricting computations to nearest-neighbor samples. Finally, tensor reconstruction is performed to predict the solidification time at new process settings. The proposed framework is demonstrated by emulating the physical model of FNP and is compared with alternative tensor (multi-dimensional array) regression models.
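    The sketch below illustrates the shape of such a pipeline under simplifying assumptions: an HOSVD-style decomposition with shared spatial factors stands in for the paper's structured joint tensor decomposition, and ordinary GPs stand in for NNGPs; all array sizes and setting names are made up.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(3)
n_settings, nx, ny, r = 20, 12, 12, 3
settings = rng.uniform(size=(n_settings, 2))    # e.g. two process settings
Y = rng.uniform(size=(n_settings, nx, ny))      # solidification-time fields

# Shared spatial factor matrices from SVDs of the mode unfoldings
# (an HOSVD-style stand-in for the structured joint decomposition).
U1, _, _ = np.linalg.svd(Y.transpose(1, 0, 2).reshape(nx, -1), full_matrices=False)
U2, _, _ = np.linalg.svd(Y.transpose(2, 0, 1).reshape(ny, -1), full_matrices=False)
U1, U2 = U1[:, :r], U2[:, :r]

# Setting-specific core tensors: G_s = U1^T Y_s U2, each of shape (r, r).
G = np.einsum('xi,sxy,yj->sij', U1, Y, U2)

# One GP per core entry, mapping process settings -> core coefficient.
gps = [[GaussianProcessRegressor().fit(settings, G[:, i, j])
        for j in range(r)] for i in range(r)]

# Predict the core at a new setting, then reconstruct the full field.
s_new = rng.uniform(size=(1, 2))
G_new = np.array([[gps[i][j].predict(s_new)[0] for j in range(r)]
                  for i in range(r)])
field_new = U1 @ G_new @ U2.T   # (nx, ny) predicted solidification times
print(field_new.shape)
```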
  4. Abstract. Plume-SPH provides the first particle-based simulation of volcanic plumes. Smoothed particle hydrodynamics (SPH) has several advantages over currently used mesh-based methods in modeling multiphase free boundary flows like volcanic plumes. This tool will provide more accurate eruption source terms to users of volcanic ash transport and dispersion models (VATDs), greatly improving volcanic ash forecasts. The accuracy of these terms is crucial for forecasts from VATDs, and the 3-D SPH model presented here will provide better numerical accuracy. As an initial effort to explore the feasibility and advantages of SPH in volcanic plume modeling, we adopt a relatively simple physics model (a 3-D dusty-gas dynamic model assuming well-mixed eruption material, dynamic equilibrium and thermodynamic equilibrium between erupted material and the air entrained into the plume, and minimal effect of winds) targeted at capturing the salient features of a volcanic plume. The documented open-source code is easily obtained and extended to incorporate other physics models of interest to the large community of researchers investigating multiphase free boundary flows of volcanic or other origins.

    The Plume-SPH code (https://doi.org/10.5281/zenodo.572819) also incorporates several newly developed SPH techniques needed to address numerical challenges in simulating multiphase compressible turbulent flow. The code should thus also be of general interest to the much larger community of researchers using and developing SPH-based tools. In particular, the SPHε turbulence model is used to capture mixing at unresolved scales. Heat exchange due to turbulence is calculated by a Reynolds analogy, and a corrected SPH is used to handle tensile instability and the deficiency of particle distribution near the boundaries. We also developed methodology to impose velocity inlet and pressure outlet boundary conditions, both of which are scarce in traditional implementations of SPH.

    The core solver of our model is parallelized with the Message Passing Interface (MPI), obtaining good weak and strong scalability through novel techniques for data management using space-filling curves (SFCs), object creation time-based indexing, and hash-table-based storage schemes. These techniques are of interest to researchers developing particle-in-cell-type methods. The code is first verified by 1-D shock tube tests, then by comparing velocity and concentration distributions along the central axis and on the transverse cross-section with experimental results for JPUE (a jet or plume that is ejected from a nozzle into a uniform environment). Profiles of several integrated variables are compared with those calculated by existing 3-D plume models for an eruption with the same mass eruption rate (MER) estimated for the Mt. Pinatubo eruption of 15 June 1991. Our results are consistent with existing 3-D plume models. Analysis of the plume evolution process demonstrates that this model is able to reproduce the physics of plume development.
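    For readers new to SPH, the toy example below shows the core interpolation idea that Plume-SPH builds on: a field value at a point is a kernel-weighted sum over neighboring particles. It is a generic 1-D density summation with a standard cubic-spline kernel, not code from the Plume-SPH repository.

```python
import numpy as np

def cubic_spline_1d(r, h):
    """Standard 1-D cubic spline SPH kernel with smoothing length h."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)                      # 1-D normalization
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
         np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

# Equal-mass particles on a line; the SPH density at particle i is the
# kernel-weighted sum of neighbor masses: rho_i = sum_j m_j W(x_i - x_j, h).
x = np.linspace(0.0, 1.0, 101)
m, h = 1.0 / len(x), 0.02
rho = np.array([np.sum(m * cubic_spline_1d(xi - x, h)) for xi in x])
print(rho[50])   # ~1.0 in the interior for a uniform particle distribution
```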

     
  5. Abstract

    Ideally, probabilistic hazard assessments combine available knowledge about the physical mechanisms of the hazard, data on past hazards, and any precursor information. Systematically assessing the probability of rare yet catastrophic hazards adds a layer of difficulty due to limited observational data. Via computer models, one can exercise potentially dangerous scenarios that may not have happened in the past but are probabilistically consistent with the aleatoric nature of previous volcanic behavior in the record. Traditional Monte Carlo-based methods for calculating such hazard probabilities suffer from two issues: they are computationally expensive, and they are static. In light of new information (newly available data, signs of unrest, or new probabilistic analyses describing uncertainty about scenarios), the Monte Carlo calculation would need to be redone under the same computational constraints. Here we present an alternative approach utilizing statistical emulators that provides an efficient way to overcome the computational bottleneck of typical Monte Carlo approaches. Moreover, this approach is independent of any aleatoric scenario model and yet can be applied rapidly to any scenario model, making it dynamic. We present and apply this emulator-based approach to create multiple probabilistic hazard maps for inundation by pyroclastic density currents in the Long Valley Volcanic Region. Further, we illustrate how this approach enables an exploration of the impact of epistemic uncertainties on these probabilistic hazard forecasts. In particular, we focus on the uncertainty of vent opening models and how that uncertainty, both aleatoric and epistemic, impacts the resulting probabilistic hazard maps of pyroclastic density current inundation.
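    A hedged sketch of why this approach is "dynamic" follows: train a GP emulator on a handful of mock flow-simulator runs, then push any vent-opening/volume scenario model through the cheap emulator instead of the simulator. All inputs, the depth model, and the inundation threshold are illustrative, not values from the Long Valley study.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(4)

# Mock design runs: (vent_x, vent_y, log volume) -> flow depth at a site.
X = rng.uniform(size=(60, 3))
depth = np.maximum(0.0, X @ np.array([0.5, 0.3, 1.2]) - 0.6
                   + 0.05 * rng.normal(size=60))
gp = GaussianProcessRegressor(normalize_y=True).fit(X, depth)

def hazard_probability(sample_scenarios, n=20_000, threshold=0.1):
    """P(depth > threshold) under a given aleatoric scenario model."""
    s = sample_scenarios(n)
    mu, sd = gp.predict(s, return_std=True)
    return np.mean(rng.normal(mu, sd) > threshold)

# Two different vent-opening models, evaluated with zero new simulator
# runs: swapping the scenario model does not require redoing the design.
def uniform_vents(n):
    return rng.uniform(size=(n, 3))

def shifted_vents(n):
    return np.clip(rng.normal(0.7, 0.2, size=(n, 3)), 0, 1)

print(hazard_probability(uniform_vents), hazard_probability(shifted_vents))
```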

     