Title: Stochastic emulation with enhanced partial‐ and no‐replication strategies for seismic response distribution estimation
Abstract

Modern performance-based earthquake engineering practice frequently requires a large number of time-consuming non-linear time-history simulations to appropriately address excitation and structural uncertainties when estimating engineering demand parameter (EDP) distributions. Surrogate modeling techniques have emerged as an attractive tool for alleviating this high computational burden in similar engineering problems. A key challenge for the application of surrogate models in the earthquake engineering context is the aleatoric variability associated with the seismic hazard. This variability is typically expressed as high-dimensional or non-parametric uncertainty and therefore cannot be easily incorporated within standard surrogate modeling frameworks. Rather, a surrogate modeling approach that can directly approximate the full distribution of the response output is warranted for this application. Such an approach must additionally address the fact that the response variability may change as the input parameters change, yielding a heteroscedastic behavior. Stochastic emulation techniques have emerged as a viable solution for accurately capturing aleatoric uncertainties in similar contexts, and recent work by the second author has established a framework that accommodates this for earthquake engineering applications, using Gaussian process (GP) regression to predict the EDP response distribution. The established formulation requires, for a portion of the training samples, the replication of simulations under different descriptions of the aleatoric uncertainty. In particular, the replicated samples are used to build a secondary GP model that predicts the heteroscedastic characteristics, and these predictions are then used to formulate the primary GP that produces the full EDP distribution. This practice, however, has two downsides: it always requires a minimum number of replications to train the secondary GP, and the information from the non-replicated samples is utilized only for the primary GP. This research adopts an alternative stochastic GP formulation that addresses both limitations. To this end, the secondary GP is trained on the squared deviations of the samples from the mean instead of the crude sample variances. To establish the primitive mean estimates, another auxiliary GP is introduced. This way, information from all replicated and non-replicated samples is fully leveraged for estimating both the EDP distribution and the underlying heteroscedastic behavior, while the formulation accommodates an implementation using no replications at all. Case study examples using three different stochastic ground motion models demonstrate that the proposed approach can address both of the aforementioned challenges.
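
For intuition, the sketch below shows one way such a no-replication stochastic emulator can be assembled with off-the-shelf GP regression (here scikit-learn). It is a simplified illustration of the general idea rather than the paper's exact formulation; the inputs `X_train`/`y_train`, the kernels, and the synthetic heteroscedastic data are placeholder assumptions.

```python
# Minimal sketch: auxiliary GP for a primitive mean, secondary GP on log squared
# deviations for the heteroscedastic variance, primary GP with per-sample noise.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(200, 2))             # illustrative input parameters
true_mean = np.sin(2 * np.pi * X_train[:, 0])
true_std = 0.1 + 0.4 * X_train[:, 1]                        # heteroscedastic variability
y_train = true_mean + true_std * rng.standard_normal(200)   # one response per input, no replications

kernel = ConstantKernel() * RBF(length_scale=[0.2, 0.2]) + WhiteKernel()

# 1) Auxiliary GP: primitive (homoscedastic) estimate of the mean response.
gp_mean0 = GaussianProcessRegressor(kernel=kernel).fit(X_train, y_train)

# 2) Secondary GP: regress log squared deviations from the primitive mean,
#    so the heteroscedastic variance is learned without any replications.
sq_dev = (y_train - gp_mean0.predict(X_train)) ** 2
gp_logvar = GaussianProcessRegressor(kernel=kernel).fit(X_train, np.log(sq_dev + 1e-12))
var_hat = np.exp(gp_logvar.predict(X_train))

# 3) Primary GP: refit the mean with sample-specific noise levels taken from
#    the predicted heteroscedastic variance (passed through the alpha diagonal).
gp_primary = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[0.2, 0.2]), alpha=var_hat
).fit(X_train, y_train)

# Predicted response distribution at new inputs: Gaussian with the primary-GP mean
# and the predicted aleatoric variance plus the GP's own (epistemic) variance.
X_new = rng.uniform(0.0, 1.0, size=(5, 2))
mu, sd_epistemic = gp_primary.predict(X_new, return_std=True)
sd_total = np.sqrt(sd_epistemic**2 + np.exp(gp_logvar.predict(X_new)))
print(np.c_[mu, sd_total])
```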

 
NSF-PAR ID: 10505142
Author(s) / Creator(s):
Publisher / Repository: Wiley Blackwell (John Wiley & Sons)
Date Published:
Journal Name: Earthquake Engineering & Structural Dynamics
Volume: 53
Issue: 7
ISSN: 0098-8847
Format(s): Medium: X; Size: p. 2354-2381
Sponsoring Org: National Science Foundation
More Like this
  1. Yamashita, Y.; Kano, M. (Eds.)
    Bayesian hybrid models (BHMs) fuse physics-based insights with machine learning constructs to correct for systematic bias. In this paper, we demonstrate a scalable computational strategy to embed BHMs in an equation-oriented modelling environment. Thus, this paper generalizes stochastic programming, which traditionally focuses on aleatoric uncertainty (as characterized by a probability distribution for uncertain model parameters), to also consider epistemic uncertainty, i.e., model-form uncertainty or systematic bias as modelled by the Gaussian process in the BHM. As an illustrative example, we consider ballistic firing using a BHM that includes a simplified glass-box (i.e., equation-oriented) model that neglects air resistance and a Gaussian process model to account for systematic bias (i.e., epistemic or model-form uncertainty) induced by the model simplification. The gravity parameter and the GP hyperparameters are inferred from data in a Bayesian framework, yielding a posterior distribution. A novel single-stage stochastic program formulation using the posterior samples and Gaussian quadrature rules is proposed to compute the optimal decisions (e.g., firing angle and velocity) that minimize the expected value of an objective (e.g., distance from a stationary target). PySMO is used to generate expressions for the GP prediction mean and uncertainty in Pyomo, enabling efficient optimization with gradient-based solvers such as Ipopt. A scaling study characterizes the solver time and number of iterations for up to 2,000 samples from the posterior. (An illustrative sketch of the single-stage stochastic-program idea appears after this list.)
  2. Quantification and propagation of aleatoric uncertainties distributed in complex topological structures remain a challenge. Existing uncertainty quantification and propagation approaches can only handle parametric uncertainties or high-dimensional random quantities distributed in a simply connected spatial domain; a systematic method that captures the topological characteristics of the structural domain in uncertainty analysis has been lacking. Therefore, this paper presents a new methodology that quantifies and propagates aleatoric uncertainties, such as spatially varying local material properties and defects, distributed in a topological spatial domain. We propose a new random field-based uncertainty representation approach that captures the topological characteristics using the shortest interior path distance. Parameterization methods such as PPCA and the β-variational autoencoder (βVAE) are employed to convert the random field representation of uncertainty into a small set of independent random variables. Non-intrusive uncertainty propagation methods, such as polynomial chaos expansion and univariate dimension reduction, are then employed to propagate the parametric uncertainties to the output of the problem. The effectiveness of the proposed methodology is demonstrated by engineering case studies. The accuracy and computational efficiency of the proposed method are confirmed by comparison with reference values from Monte Carlo simulations with a sufficiently large number of samples. (A minimal sketch of the path-distance-based random field idea appears after this list.)
  3. This work focuses on the representation of model-form uncertainties in phase-field models of brittle fracture. Such uncertainties can arise, for instance, from the choice of the degradation function, and their treatment has remained unaddressed to date. The stochastic modeling framework leverages recent developments related to the analysis of nonlinear dynamical systems and relies on the construction of a stochastic reduced-order model. In the latter, a POD-based reduced-order basis is randomized using Riemannian projection and retraction operators, as well as an information-theoretic formulation enabling proper concentration in the convex hull defined by a set of model proposals. The model thus obtained is mathematically admissible in the almost-sure sense and involves a low-dimensional hyperparameter, the calibration of which is facilitated through the formulation of a quadratic programming problem. The relevance of the modeling approach is further assessed on one- and two-dimensional applications. It is shown that model uncertainties can be efficiently captured and propagated to macroscopic quantities of interest. An extension based on localized randomization is also proposed to handle the case where the forward simulation is highly sensitive to sample localization. This work constitutes a methodological development allowing phase-field predictions to be endowed with statistical measures of confidence that account for the variability induced by modeling choices.
  4. A broad class of stochastic volatility models is defined by systems of stochastic differential equations, and while these models have seen widespread success in domains such as finance and statistical climatology, they typically lack the ability to condition on historical data to produce a true posterior distribution. To address this fundamental limitation, we show how to re-cast a class of stochastic volatility models as a hierarchical Gaussian process (GP) model with specialized covariance functions. This GP model retains the inductive biases of the stochastic volatility model while providing the posterior predictive distribution given by GP inference. Within this framework, we take inspiration from well-studied domains to introduce a new class of models, Volt and Magpie, that significantly outperform baselines in stock and wind speed forecasting and naturally extend to the multitask setting.
  5. While Bayesian inference is the gold standard for uncertainty quantification and propagation, its use within physical chemistry encounters formidable computational barriers. These bottlenecks are magnified when modeling data with many independent variables, such as X-ray/neutron scattering patterns and electromagnetic spectra. To address this challenge, we employ local Gaussian process (LGP) surrogate models to accelerate Bayesian optimization over these complex thermophysical properties. The time complexity of the LGPs scales linearly in the number of independent variables, in stark contrast to the computationally expensive cubic scaling of conventional Gaussian processes. To illustrate the method, we trained an LGP surrogate model on the radial distribution function of liquid neon and observed a 1,760,000-fold speed-up compared to molecular dynamics simulation, beating a conventional GP by three orders of magnitude. We conclude that LGPs are robust and efficient surrogate models poised to expand the application of Bayesian inference in molecular simulations to a broad spectrum of experimental data. (A minimal local-GP sketch appears after this list.)
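
For item 1 above, the sketch below re-expresses the single-stage stochastic-program idea with NumPy/SciPy/scikit-learn in place of the Pyomo/PySMO/Ipopt toolchain used in the paper; the posterior samples of gravity, the GP bias data, the target distance, and all function names are illustrative stand-ins rather than the authors' setup.

```python
# Sketch: glass-box ballistic model + GP bias correction, with the firing angle
# and velocity chosen by minimizing a sample-average expected loss.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

def glass_box_range(theta, v, g):
    """Projectile range neglecting air resistance (the equation-oriented model)."""
    return v**2 * np.sin(2.0 * theta) / g

# Hypothetical data for the GP discrepancy (bias from neglecting drag):
# inputs (theta, v), output = observed range minus glass-box range.
X_bias = rng.uniform([0.2, 10.0], [1.3, 40.0], size=(40, 2))
bias_obs = -0.002 * X_bias[:, 1] ** 2 * np.sin(2 * X_bias[:, 0]) + 0.1 * rng.standard_normal(40)
gp_bias = GaussianProcessRegressor(RBF([0.3, 10.0]) + WhiteKernel()).fit(X_bias, bias_obs)

g_post = rng.normal(9.81, 0.05, size=200)   # stand-in for posterior samples of gravity
target = 60.0                               # distance to the stationary target (m)

def expected_loss(z):
    """Sample-average approximation of the expected squared miss distance."""
    theta, v = z
    pred = glass_box_range(theta, v, g_post) + gp_bias.predict(np.array([[theta, v]]))[0]
    return np.mean((pred - target) ** 2)

res = minimize(expected_loss, x0=[0.7, 25.0], bounds=[(0.2, 1.3), (10.0, 40.0)])
print(res.x, res.fun)
```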
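
For item 2 above, this sketch conveys the flavor of a random field whose correlation is built on shortest interior-path (graph geodesic) distances and then truncated to a few independent variables; a plain eigendecomposition (Karhunen-Loeve/PCA truncation) stands in for the paper's PPCA/βVAE parameterization, and the L-shaped node set, correlation length, and mode count are arbitrary assumptions.

```python
# Sketch: geodesic-distance-based correlation on a non-convex domain, reduced
# to a handful of independent standard normal variables.
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial import distance_matrix

rng = np.random.default_rng(2)

# Hypothetical nodes of an L-shaped (non-convex) domain.
pts = np.array([[x, y] for x in np.linspace(0, 2, 11) for y in np.linspace(0, 2, 11)
                if not (x > 1.0 and y > 1.0)])

# Graph of short "interior" edges only, then geodesic (shortest-path) distances.
euclid = distance_matrix(pts, pts)
adj = np.where(euclid <= 0.3, euclid, 0.0)   # zero entries = no edge
d_geo = shortest_path(adj, directed=False)

# Exponential correlation on the geodesic distance; negative eigenvalues (the
# geodesic kernel is not guaranteed positive semidefinite) are clipped.
cov = np.exp(-d_geo / 0.5)
eigval, eigvec = np.linalg.eigh(cov)
keep = eigval[::-1][:10].clip(min=0.0)        # 10 dominant modes
modes = eigvec[:, ::-1][:, :10]

# One field realization from 10 independent standard normals; these reduced
# variables are what a PCE or dimension-reduction step would then propagate.
xi = rng.standard_normal(10)
field = 1.0 + modes @ (np.sqrt(keep) * xi)    # e.g. spatially varying property
print(field.shape)
```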
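
For item 5 above, this sketch conveys the local-GP idea of fitting one small GP per evaluation location, so the cost grows roughly linearly with the number of independent-variable points rather than cubically; the synthetic data, neighborhood size, and kernel are illustrative and not taken from the paper.

```python
# Sketch: one k-point GP per query location instead of a single global GP.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
x_train = np.sort(rng.uniform(0.0, 10.0, 300))               # e.g. r grid of an RDF
y_train = np.sin(x_train) / (1.0 + x_train) + 0.02 * rng.standard_normal(300)

def local_gp_predict(x_query, k=25):
    """Fit an independent k-neighbour GP around each query location."""
    mean, std = np.empty(len(x_query)), np.empty(len(x_query))
    for i, xq in enumerate(x_query):
        idx = np.argsort(np.abs(x_train - xq))[:k]            # k nearest neighbours
        gp = GaussianProcessRegressor(RBF(1.0) + WhiteKernel(1e-3))
        gp.fit(x_train[idx, None], y_train[idx])
        m, s = gp.predict([[xq]], return_std=True)
        mean[i], std[i] = m[0], s[0]
    return mean, std

mu, sd = local_gp_predict(np.linspace(0.5, 9.5, 50))
print(mu[:3], sd[:3])
```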