Title: Scalable Stochastic Programming with Bayesian Hybrid Models
Bayesian hybrid models (BHMs) fuse physics-based insights with machine learning constructs to correct for systematic bias. In this paper, we demonstrate a scalable computational strategy to embed BHMs in an equation-oriented modelling environment. This paper thus generalizes stochastic programming, which traditionally focuses on aleatoric uncertainty (as characterized by a probability distribution for uncertain model parameters), to also consider epistemic uncertainty, i.e., model-form uncertainty or systematic bias as modelled by the Gaussian process in the BHM. As an illustrative example, we consider ballistic firing using a BHM that includes a simplified glass-box (i.e., equation-oriented) model that neglects air resistance and a Gaussian process model to account for the systematic bias (i.e., epistemic or model-form uncertainty) induced by the model simplification. The gravity parameter and the GP hyperparameters are inferred from data in a Bayesian framework, yielding a posterior distribution. A novel single-stage stochastic program formulation using the posterior samples and Gaussian quadrature rules is proposed to compute the optimal decisions (e.g., firing angle and velocity) that minimize the expected value of an objective (e.g., distance from a stationary target). PySMO is used to generate expressions for the GP prediction mean and uncertainty in Pyomo, enabling efficient optimization with gradient-based solvers such as Ipopt. A scaling study characterizes the solver time and number of iterations for up to 2,000 samples from the posterior.
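To make the decision-making step concrete, the following is a minimal sketch of a sample-average version of the stochastic program described above, written in Pyomo and solved with Ipopt. The posterior samples (g_samples), the target distance, the variable bounds, and all identifiers are hypothetical placeholders; the sketch uses only the glass-box projectile-range equation and omits the PySMO-generated GP bias and uncertainty expressions that the paper embeds in the full formulation.

    import math
    import numpy as np
    import pyomo.environ as pyo

    # Hypothetical posterior samples of the gravity parameter (m/s^2);
    # in the paper these come from Bayesian calibration of the BHM.
    g_samples = np.random.default_rng(0).normal(9.81, 0.05, size=200)
    target = 100.0  # assumed distance to the stationary target (m)

    m = pyo.ConcreteModel()
    m.S = pyo.RangeSet(0, len(g_samples) - 1)

    # Decision variables: firing angle (rad) and initial velocity (m/s)
    m.angle = pyo.Var(bounds=(0.1, math.pi / 2 - 0.1), initialize=math.pi / 4)
    m.v0 = pyo.Var(bounds=(1.0, 100.0), initialize=30.0)

    # Glass-box range prediction (no air resistance) under posterior sample s;
    # the full BHM would add a GP bias-correction term here.
    def predicted_range(s):
        return m.v0**2 * pyo.sin(2 * m.angle) / g_samples[s]

    # Sample-average approximation of the expected squared miss distance
    m.obj = pyo.Objective(
        expr=sum((predicted_range(s) - target) ** 2 for s in m.S) / len(g_samples),
        sense=pyo.minimize,
    )

    pyo.SolverFactory("ipopt").solve(m)
    print(pyo.value(m.angle), pyo.value(m.v0))

The Gaussian quadrature rules mentioned in the abstract would enter by replacing this plain sample average with weighted quadrature nodes; that refinement, and the GP terms, are intentionally left out of this sketch.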
Award ID(s):
1941596
NSF-PAR ID:
10403694
Author(s) / Creator(s):
; ;
Editor(s):
Yamashita, Y.; Kano, M.
Date Published:
Journal Name:
Computer Aided Chemical Engineering
Volume:
49
ISSN:
2543-1331
Page Range / eLocation ID:
1309-1314
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Bayesian hybrid models fuse physics-based insights with machine learning constructs to correct for systematic bias. In this paper, we compare Bayesian hybrid models against physics-based glass-box and Gaussian process black-box surrogate models. We consider ballistic firing as an illustrative case study for a Bayesian decision-making workflow. First, Bayesian calibration is performed to estimate model parameters. We then use the posterior distribution from Bayesian analysis to compute optimal firing conditions to hit a target via a single-stage stochastic program. The case study demonstrates the ability of Bayesian hybrid models to overcome systematic bias from missing physics with fewer data than the pure machine learning approach. Ultimately, we argue Bayesian hybrid models are an emerging paradigm for data-informed decision-making under parametric and epistemic uncertainty. 
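    In symbols, the single-stage stochastic program referenced in the abstract above can be written as a sample-average approximation over posterior draws (the notation d for the decision vector, theta for the calibrated parameters, and l for the loss is ours, not the papers'):

        \min_{d \in \mathcal{D}} \; \mathbb{E}_{\theta \sim p(\theta \mid \mathrm{data})}\big[\ell(d, \theta)\big]
        \;\approx\; \min_{d \in \mathcal{D}} \; \frac{1}{K} \sum_{k=1}^{K} \ell\big(d, \theta^{(k)}\big),
        \qquad \theta^{(k)} \sim p(\theta \mid \mathrm{data}),

    where d collects the firing angle and velocity, the theta^(k) are posterior samples from Bayesian calibration, and the loss l measures the miss distance from the target.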
  2. Hybrid (i.e., grey-box) models are a powerful and flexible paradigm for predictive science and engineering. Grey-box models use data-driven constructs to incorporate unknown or computationally intractable phenomena into glass-box mechanistic models. The pioneering work of statisticians Kennedy and O’Hagan introduced a new paradigm to quantify epistemic (i.e., model-form) uncertainty. While popular in several engineering disciplines, prior work using Kennedy–O’Hagan hybrid models focuses on prediction with accurate uncertainty estimates. This work demonstrates computational strategies to deploy Bayesian hybrid models for optimization under uncertainty. Specifically, the posterior distributions of Bayesian hybrid models provide a principled uncertainty set for stochastic programming, chance-constrained optimization, or robust optimization. Through two illustrative case studies, we demonstrate the efficacy of hybrid models, composed of a structurally inadequate glass-box model and Gaussian process bias correction term, for decision-making using limited training data. From these case studies, we develop recommended best practices and explore the trade-offs between different hybrid model architectures. 
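    For reference, the Kennedy-O'Hagan hybrid structure discussed in the abstract above is conventionally written as a glass-box model plus a Gaussian process discrepancy term (the basic form, without the multiplicative scaling factor that some variants include):

        y(x) \;=\; \eta(x; \theta) \;+\; \delta(x) \;+\; \epsilon,
        \qquad \delta(x) \sim \mathcal{GP}\!\big(0, k(x, x')\big),
        \qquad \epsilon \sim \mathcal{N}(0, \sigma^2),

    where \eta is the (possibly structurally inadequate) glass-box model with physical parameters \theta, \delta is the GP bias-correction term, and \epsilon is observation noise.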
  3. A broad class of stochastic volatility models is defined by systems of stochastic differential equations. While these models have seen widespread success in domains such as finance and statistical climatology, they typically lack the ability to condition on historical data to produce a true posterior distribution. To address this fundamental limitation, we show how to re-cast a class of stochastic volatility models as a hierarchical Gaussian process (GP) model with specialized covariance functions. This GP model retains the inductive biases of the stochastic volatility model while providing the posterior predictive distribution given by GP inference. Within this framework, we take inspiration from well-studied domains to introduce a new class of models, Volt and Magpie, that significantly outperform baselines in stock and wind speed forecasting, and naturally extend to the multitask setting. 
  4. We address uncertainty quantification for Gaussian processes (GPs) under misspecified priors, with an eye towards Bayesian optimization (BO). GPs are widely used in BO because they easily enable exploration based on posterior uncertainty bands. However, this convenience comes at the cost of robustness: a typical function encountered in practice is unlikely to have been drawn from the data scientist’s prior, in which case uncertainty estimates can be misleading, and the resulting exploration can be suboptimal. We present a frequentist approach to GP/BO uncertainty quantification. We utilize the GP framework as a working model, but do not assume correctness of the prior. We instead construct a confidence sequence (CS) for the unknown function using martingale techniques. There is a necessary cost to achieving robustness: if the prior was correct, posterior GP bands are narrower than our CS. Nevertheless, when the prior is wrong, our CS is statistically valid and empirically outperforms standard GP methods, in terms of both coverage and utility for BO. Additionally, we demonstrate that powered likelihoods provide robustness against model misspecification. 
  5. We present a Bayesian hierarchical space‐time stochastic weather generator (BayGEN) to generate daily precipitation and minimum and maximum temperatures. BayGEN employs a hierarchical framework with data, process, and parameter layers. In the data layer, precipitation occurrence at each site is modeled using probit regression with a spatially distributed latent Gaussian process; precipitation amounts are modeled as gamma random variables; and minimum and maximum temperatures are modeled as realizations from Gaussian processes. The latent Gaussian process that drives the precipitation occurrence process is modeled in the process layer. In the parameter layer, the model parameters of the data and process layers are modeled as spatially distributed Gaussian processes, consequently enabling the simulation of daily weather at arbitrary (unobserved) locations or on a regular grid. All model parameters are endowed with weakly informative prior distributions. The No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo, is used to draw posterior samples of each parameter. Posterior samples of the model parameters propagate uncertainty to the weather simulations, an important feature that makes BayGEN unique compared to traditional weather generators. We demonstrate the utility of BayGEN with an application to daily weather generation in a basin of the Argentine Pampas. Furthermore, we evaluate the implications for crop yield by driving a crop simulation model with weather simulations from BayGEN and from an equivalent non-Bayesian weather generator.
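    As a rough schematic of the data layer described in the BayGEN abstract above, with site s and day t (the symbols are ours and omit covariates and the parameter-layer GPs):

        o_{st} \mid w_{st} \sim \mathrm{Bernoulli}\big(\Phi(w_{st})\big), \qquad w \sim \mathcal{GP}(\cdot,\cdot),

        r_{st} \mid (o_{st} = 1) \sim \mathrm{Gamma}(\alpha_{s}, \beta_{s}), \qquad
        T^{\min}_{st},\, T^{\max}_{st} \sim \mathcal{GP}(\cdot,\cdot),

    where o_{st} is precipitation occurrence, \Phi is the standard normal CDF (the probit link), r_{st} is the precipitation amount on wet days, and T^{\min}, T^{\max} are the daily temperature extremes.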