Title: Stochastic Sensitivities across Scales and Physics
Polynomial chaos expansions (PCE) provide stochastic representations of quantities of interest (QoI) in terms of a vector of standardized random variables that represent all uncertainties influencing the QoI. These uncertainties could reflect statistical scatter in estimated probabilistic models (of which the mean, variance, or PCE coefficients are but examples) or errors in the underlying functional relationship between input and output (e.g., physics models). In this paper, we show how PCE permit the evaluation of sensitivities with respect to all of these uncertainties and provide a rational paradigm for resource allocation aimed at model validation. We demonstrate the methodologies on examples drawn from across science and engineering.
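As a minimal illustration of the idea (a sketch, not the paper's implementation), the snippet below projects a toy one-dimensional QoI onto probabilists' Hermite polynomials by Gauss-Hermite quadrature and reads off the mean, the variance, and the sensitivity of the variance to each PCE coefficient; the model f and the truncation order are assumptions chosen for the example.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_coefficients(f, order, n_quad=40):
    """Project f(xi), xi ~ N(0,1), onto probabilists' Hermite polynomials."""
    x, w = hermegauss(n_quad)           # weight exp(-x^2/2); sum(w) = sqrt(2*pi)
    w = w / np.sqrt(2.0 * np.pi)        # renormalize to the standard normal density
    fx = f(x)
    c = np.empty(order + 1)
    for k in range(order + 1):
        ek = np.zeros(k + 1); ek[k] = 1.0
        c[k] = np.sum(w * fx * hermeval(x, ek)) / factorial(k)  # E[He_k^2] = k!
    return c

# Toy QoI: a smooth nonlinear function of one standardized input (an assumption).
f = lambda xi: np.exp(0.3 * xi) + 0.1 * xi**2
c = pce_coefficients(f, order=6)

mean = c[0]
variance = sum(factorial(k) * c[k]**2 for k in range(1, 7))
# The variance is an explicit function of the coefficients, so its
# sensitivities are available in closed form: d(var)/d(c_k) = 2 * k! * c_k.
sens = [2 * factorial(k) * c[k] for k in range(1, 7)]
print(mean, variance, sens)
```

Because the second moment is an explicit polynomial in the PCE coefficients, sensitivities with respect to any estimated quantity that enters through those coefficients follow by the chain rule.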
Award ID(s):
1661052
PAR ID:
10108069
Author(s) / Creator(s):
Date Published:
Journal Name:
EMI 2019
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Uncertainty quantification (UQ) is an important part of mathematical modeling and simulation, quantifying the impact of parametric uncertainty on model predictions. This paper presents an efficient approach to polynomial chaos expansion (PCE)-based UQ in biological systems. For PCE, the key step is the stochastic Galerkin (SG) projection, which yields a family of deterministic models for the PCE coefficients that describes the original stochastic system. When dealing with systems that involve non-polynomial terms and many uncertainties, SG-based PCE is computationally prohibitive because it often involves high-dimensional integrals. To address this, a generalized dimension reduction method (gDRM) is coupled with quadrature rules to convert a high-dimensional integral in the SG projection into several lower-dimensional ones that can be rapidly solved. The performance of the algorithm is validated with two examples describing the dynamic behavior of cells. Compared to other UQ techniques (e.g., non-intrusive PCE), the results show the potential of the algorithm to tackle UQ in more complicated biological systems.
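    A minimal sketch of the dimension-reduction idea (the univariate version, for clarity; gDRM generalizes this with higher-order component functions): a d-dimensional Gaussian expectation is replaced by d one-dimensional Gauss-Hermite integrals. The integrand g below is an assumed stand-in for the non-polynomial terms mentioned above.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def drm_expectation(g, d, n_quad=20):
    """Univariate dimension reduction for E[g(X)], X ~ N(0, I_d):
    E[g] ~= sum_i E[g(0, ..., X_i, ..., 0)] - (d - 1) * g(0),
    replacing one d-dimensional integral with d one-dimensional ones."""
    x, w = hermegauss(n_quad)
    w = w / np.sqrt(2.0 * np.pi)        # normalize to the N(0,1) density
    mu = np.zeros(d)
    total = 0.0
    for i in range(d):
        pts = np.tile(mu, (n_quad, 1))  # freeze all inputs at their means...
        pts[:, i] = x                   # ...except the i-th, which is quadratured
        total += np.sum(w * g(pts))
    return total - (d - 1) * g(mu[None, :])[0]

# A non-polynomial integrand in d = 8 dimensions (an assumed test case).
g = lambda X: np.exp(0.1 * np.sin(X).sum(axis=1))
print(drm_expectation(g, d=8))
```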
  2.
    Uncertainty is a common feature of the first-principles models widely used in engineering problems, and uncertainty quantification (UQ) has become an essential procedure for improving the accuracy and reliability of model predictions. Polynomial chaos expansion (PCE) is an efficient approach to UQ that approximates uncertainty with orthogonal polynomial basis functions of standard distributions (e.g., normal) chosen from the Askey scheme. In practice, however, uncertainty may not be represented well by standard distributions, in which case the convergence rate and accuracy of PCE-based UQ cannot be guaranteed. Further, when models involve non-polynomial forms, PCE-based UQ can be computationally impractical in the presence of many parametric uncertainties. To address these issues, Gram–Schmidt (GS) orthogonalization and the generalized dimension reduction method (gDRM) are integrated with PCE in this work to handle many parametric uncertainties that follow arbitrary distributions. The performance of the proposed method is demonstrated on three benchmark cases, including two chemical engineering problems, in terms of UQ accuracy and computational efficiency by comparison with available algorithms (e.g., non-intrusive PCE).
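    A minimal sketch of the Gram-Schmidt step under the stated setting: given samples from an arbitrary input distribution, polynomials orthonormal against the empirical measure are built by Gram-Schmidt on monomials, so the resulting basis plays the role the Askey polynomials play for standard distributions. The bimodal sample distribution here is an assumption for illustration.

```python
import numpy as np
from numpy.polynomial.polynomial import polyval

def gs_orthonormal_basis(samples, order):
    """Gram-Schmidt on the monomials 1, x, ..., x^order under the
    empirical inner product <p, q> = mean(p(X) * q(X)) of `samples`.
    Returns rows of polynomial coefficients (degree-ascending)."""
    inner = lambda a, b: np.mean(polyval(samples, a) * polyval(samples, b))
    basis = []
    for k in range(order + 1):
        v = np.zeros(order + 1); v[k] = 1.0    # monomial x^k
        for q in basis:
            v = v - inner(v, q) * q            # remove projections onto earlier polys
        basis.append(v / np.sqrt(inner(v, v))) # normalize
    return np.array(basis)

# An "arbitrary" (bimodal) input distribution, assumed for illustration.
rng = np.random.default_rng(0)
xs = np.concatenate([rng.normal(-1.0, 0.3, 5000), rng.normal(2.0, 0.5, 5000)])
P = gs_orthonormal_basis(xs, order=4)

# Orthonormality check: the empirical Gram matrix is the identity.
vals = polyval(xs, P.T)                 # shape (order + 1, len(xs))
print(np.round(vals @ vals.T / len(xs), 2))
```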
  3. In this work, we describe a new approach that uses variational encoder-decoder (VED) networks for efficient uncertainty quantification in goal-oriented inverse problems. Contrary to standard inverse problems, these approaches are goal-oriented in that the aim is to estimate some quantities of interest (QoI) that are functions of the solution of an inverse problem, rather than the solution itself. Moreover, we are interested in computing uncertainty metrics associated with the QoI, and we therefore adopt a Bayesian approach to inverse problems that incorporates the prediction operator and techniques for exploring the posterior. This may be particularly challenging, especially for nonlinear, possibly unknown, operators and nonstandard prior assumptions. We harness recent advances in machine learning, i.e., VED networks, to describe a data-driven approach to large-scale inverse problems. This enables real-time uncertainty quantification for the QoI. One advantage of our approach is that we avoid the need to solve challenging inversion problems by training a network to approximate the mapping from observations to QoI. Another main benefit is that we enable uncertainty quantification for the QoI by leveraging probability distributions in the latent and target spaces. This allows us to efficiently generate QoI samples and to circumvent complicated or even unknown forward models and prediction operators. Numerical results from medical tomography reconstruction and nonlinear hydraulic tomography demonstrate the potential and broad applicability of the approach.
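    A minimal sketch of a variational encoder-decoder in PyTorch, assuming a generic setup: the layer sizes, dimensions, and training objective below are placeholders, not the paper's architecture. Observations are encoded to a latent Gaussian, and decoding repeated latent draws yields cheap QoI samples for uncertainty estimates.

```python
import torch
import torch.nn as nn

class VED(nn.Module):
    """Variational encoder-decoder: observations y -> latent Gaussian -> QoI.
    All sizes below are placeholder assumptions."""
    def __init__(self, y_dim=64, z_dim=8, qoi_dim=4):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(y_dim, 128), nn.ReLU())
        self.mu = nn.Linear(128, z_dim)
        self.logvar = nn.Linear(128, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, 128), nn.ReLU(),
                                 nn.Linear(128, qoi_dim))

    def forward(self, y):
        h = self.enc(y)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization
        return self.dec(z), mu, logvar

def ved_loss(qoi_hat, qoi, mu, logvar):
    # QoI reconstruction error plus a KL penalty pulling the latent toward N(0, I).
    rec = ((qoi_hat - qoi) ** 2).mean()
    kl = -0.5 * (1.0 + logvar - mu**2 - logvar.exp()).mean()
    return rec + kl

# After training on (observation, QoI) pairs, repeated stochastic forward
# passes give cheap QoI samples, i.e. real-time uncertainty estimates.
model = VED()
y = torch.randn(1, 64)                               # a synthetic observation
samples = torch.stack([model(y)[0] for _ in range(100)])
print(samples.mean(dim=0), samples.std(dim=0))
```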
  4. We introduce the notion of Quality of Indicator (QoI) to assess the level of contribution by participants in threat intelligence sharing. We exemplify QoI with metrics of the correctness, relevance, utility, and uniqueness of indicators. We build a system that extrapolates the metrics using a machine learning process over a reference set of indicators. We compared these results against a model that considers only the volume of information as a metric for contribution, and unveiled various observations, including the ability to spot low-quality contributions that are synonymous with free-riding.
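    A toy sketch of how such an aggregate score might be assembled; the sub-metric values, weights, and the set-membership notion of uniqueness below are illustrative stand-ins, not the paper's learned metrics.

```python
def quality_of_indicator(indicator, reference_feeds, weights=None):
    """Toy aggregate QoI score. The fixed sub-metric values and weights are
    illustrative stand-ins; in the paper's setting they would come from
    models trained over a reference set of indicators."""
    weights = weights or {"correctness": 0.3, "relevance": 0.3,
                          "utility": 0.2, "uniqueness": 0.2}
    # Uniqueness: fraction of reference feeds that do NOT already carry the indicator.
    hits = sum(indicator in feed for feed in reference_feeds)
    scores = {"correctness": 1.0, "relevance": 0.8, "utility": 0.7,  # placeholders
              "uniqueness": 1.0 - hits / max(len(reference_feeds), 1)}
    return sum(weights[m] * scores[m] for m in weights)

feeds = [{"evil.example.com", "1.2.3.4"}, {"1.2.3.4"}, {"bad.example.net"}]
print(quality_of_indicator("1.2.3.4", feeds))   # a widely shared indicator scores lower
```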
  5.
    Numerical simulations for computational hemodynamics in clinical settings require a combination of many ingredients: mathematical models, solvers, and patient-specific data. The sensitivity of the solutions to these factors may be critical, particularly when we have only partial or noisy knowledge of the data, and uncertainty quantification is crucial to assess the reliability of the results. We present here an extensive sensitivity analysis of aortic flow simulations that quantifies the dependence of clinically relevant quantities on the patient-specific geometry and the inflow boundary conditions, both of which are generally believed to have a major impact on numerical simulations. We resort to a global sensitivity analysis (i.e., one not restricted to a linearization around a working point) based on polynomial chaos expansion (PCE) and the associated Sobol' indices, regarding the geometry and the inflow conditions as realizations of a parametric stochastic process. To construct a physically consistent stochastic process for the geometry, we use a set of longitudinal-in-time images of a patient with an abdominal aortic aneurysm (AAA) to parametrize the geometrical variations. Aortic flow is highly disturbed during systole, which leads to high computational costs that are further amplified in a sensitivity analysis, when many simulations are needed. To mitigate this, we consider a large eddy simulation (LES) model, which depends in particular on a user-defined parameter called the filter radius; we borrow the tools of global sensitivity analysis to assess the sensitivity of the solution to this parameter as well. The targeted quantities of interest (QoI) include the total kinetic energy (TKE), the time-averaged wall shear stress (TAWSS), and the oscillatory shear index (OSI). The results show that these indices are mostly sensitive to the geometry. We also find that the sensitivity may differ at different instants of the heartbeat and in different regions of the domain of interest. This analysis helps to assess the reliability of in silico tools for clinical applications.
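    A minimal sketch of the PCE-to-Sobol' pipeline under simplified assumptions (two standardized Gaussian inputs and a toy QoI standing in for a hemodynamic index): a total-degree Hermite PCE is fit by least squares, and first-order Sobol' indices are read directly from the coefficients.

```python
import numpy as np
from math import factorial
from itertools import product
from numpy.polynomial.hermite_e import hermevander

def first_order_sobol(model, dim=2, order=3, n=4000, seed=0):
    """Fit a total-degree Hermite PCE by least squares on Monte Carlo samples,
    then read first-order Sobol' indices straight from the coefficients."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, dim))
    multi = [a for a in product(range(order + 1), repeat=dim) if sum(a) <= order]
    V = [hermevander(X[:, j], order) for j in range(dim)]   # 1-D Hermite values
    A = np.stack([np.prod([V[j][:, a[j]] for j in range(dim)], axis=0)
                  for a in multi], axis=1)                  # tensorized basis
    c, *_ = np.linalg.lstsq(A, model(X), rcond=None)
    norm2 = np.array([np.prod([factorial(k) for k in a]) for a in multi])
    var_terms = c**2 * norm2
    total_var = var_terms[1:].sum()       # multi[0] = (0, ..., 0) is the mean
    return [var_terms[[i for i, a in enumerate(multi)
                       if a[j] > 0 and sum(a) == a[j]]].sum() / total_var
            for j in range(dim)]

# Toy QoI standing in for a hemodynamic index (an assumption).
f = lambda X: X[:, 0] + 0.5 * X[:, 1]**2 + 0.2 * X[:, 0] * X[:, 1]
print(first_order_sobol(f))   # xi_1 dominates, consistent with the coefficients
```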