Title: Improving the Interpretability of Physics-Based Bias in Material Models
Abstract

In order to accurately predict the performance of materials under dynamic loading conditions, models have been developed that describe the rate-dependent material behavior and irrecoverable plastic deformation that occur at elevated strains and applied loads. Most of these models have roots in empirical fits to data and thus require parameters that reflect the properties and response of a specific material. In this work, we present a systematic approach to calibrating a Johnson-Cook plasticity model for 304L stainless steel in which the parameters are treated as dependent on the state of the material and are inferred from experimental data. The results indicate that the proposed approach can make a discrepancy term in the calibration unnecessary while, at the same time, improving the prediction accuracy of the model in new input domains and providing a better understanding of model bias than calibration with stationary parameter values.
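For context, the underlying Johnson-Cook flow-stress form is standard; the sketch below evaluates it with illustrative placeholder parameter values (not the calibrated 304L values from the paper, and without the state-dependent parameterization the paper proposes):

```python
import numpy as np

def johnson_cook_stress(strain, strain_rate, temperature,
                        A, B, n, C, m,
                        ref_strain_rate=1.0, T_ref=293.0, T_melt=1673.0):
    """Standard Johnson-Cook flow stress: (A + B*eps^n) * (1 + C*ln(rate ratio)) * (1 - T*^m)."""
    hardening = A + B * strain ** n
    rate_term = 1.0 + C * np.log(np.maximum(strain_rate / ref_strain_rate, 1e-12))
    T_star = np.clip((temperature - T_ref) / (T_melt - T_ref), 0.0, 1.0)
    return hardening * rate_term * (1.0 - T_star ** m)

# Illustrative placeholder parameter values, not the calibrated 304L values from the paper.
sigma = johnson_cook_stress(strain=0.1, strain_rate=1.0e3, temperature=500.0,
                            A=310.0e6, B=1000.0e6, n=0.65, C=0.07, m=1.0)  # Pa
```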

 
Award ID(s):
1633608
PAR ID:
10341180
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
ASME 2020 Verification and Validation Symposium
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

Data‐driven, machine learning (ML)‐assisted approaches have been used to study structure‐property relationships at the atomic scale, greatly accelerating the screening process and the discovery of new materials. However, such approaches are not easily applied to modulating the properties of a soft material in a laboratory with specific ingredients, and it is desirable to relate material properties directly to experimental recipes. Herein, a data‐driven approach to tailoring the mechanical properties of a soft material is demonstrated using ML‐assisted predictions of mechanical properties based on experimental synthetic recipes. Polyurethane (PU) elastomer is used as a model soft material, and its mechanical properties are varied experimentally by modulating the mixing ratio between components of the elastomer. Twenty‐five experimental conditions are selected based on design of experiments, and those data points are used to train a linear regression model. The resulting model takes desired mechanical properties as input and returns synthetic recipes for the soft material, which are subsequently validated by experiments. Lastly, the prediction accuracies of different machine learning algorithms are compared. It is believed that the approach is widely applicable to other material systems for establishing relationships between experimental conditions and material properties of soft materials.
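As a rough illustration of the workflow described above, the sketch below fits a linear regression that maps desired properties back to a recipe; the data, property names, and values are hypothetical stand-ins, not the paper's PU dataset:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical stand-in data: 25 recipes (e.g., mixing ratios of two PU components) and two
# measured mechanical properties per recipe (e.g., modulus and toughness, arbitrary units).
rng = np.random.default_rng(0)
recipes = rng.uniform(0.2, 0.8, size=(25, 2))
properties = recipes @ np.array([[3.0, 1.0], [0.5, 4.0]]) + rng.normal(0.0, 0.05, size=(25, 2))

# Fit the inverse map described in the abstract: desired properties -> synthetic recipe.
model = LinearRegression().fit(properties, recipes)

target_properties = np.array([[1.8, 2.5]])      # desired property values
suggested_recipe = model.predict(target_properties)
```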

     
  2. Abstract

    A Bayesian optimization procedure is presented for calibrating a multi-mechanism micromechanical model for creep to experimental data of F82H steel. Reduced activation ferritic martensitic (RAFM) steels based on Fe(8-9)%Cr are the most promising candidates for some fusion reactor structures. Although there are indications that RAFM steel could be viable for fusion applications at temperatures up to 600 °C, the maximum operating temperature will be determined by the creep properties of the structural material and the breeder material compatibility with the structural material. Due to the relative paucity of available creep data on F82H steel compared to other alloys such as Grade 91 steel, micromechanical models are sought for simulating creep based on relevant deformation mechanisms. As a point of departure, this work recalibrates a model form that was previously proposed for Grade 91 steel to match creep curves for F82H steel. Due to the large number of parameters (9) and cost of the nonlinear simulations, an automated approach for tuning the parameters is pursued using a recently developed Bayesian optimization for functional output (BOFO) framework [1]. Incorporating extensions such as batch sequencing and weighted experimental load cases into BOFO, a reasonably small error between experimental and simulated creep curves at two load levels is achieved in a reasonable number of iterations. Validation with an additional creep curve provides confidence in the fitted parameters obtained from the automated calibration procedure to describe the creep behavior of F82H steel.
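The sketch below shows a generic Bayesian-optimization calibration loop with a Gaussian-process surrogate and expected-improvement acquisition; it is not the BOFO framework itself, and the objective, bounds, and candidate-sampling scheme are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def creep_misfit(params):
    """Placeholder objective: run the creep simulation and return the error vs. experimental curves."""
    return float(np.sum((params - 0.3) ** 2))          # stand-in for the expensive simulation

rng = np.random.default_rng(1)
dim = 9                                                # number of model parameters, as in the abstract
X = rng.uniform(0.0, 1.0, size=(10, dim))              # initial designs in normalized parameter space
y = np.array([creep_misfit(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(20):                                    # sequential optimization iterations
    gp.fit(X, y)
    cand = rng.uniform(0.0, 1.0, size=(2000, dim))     # random candidate pool
    mu, sd = gp.predict(cand, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sd, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement (minimization)
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, creep_misfit(x_next))
```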

     
  3. Key points

    Induced pluripotent stem cell‐derived cardiomyocytes (iPSC‐CMs) capture patient‐specific genotype–phenotype relationships, as well as cell‐to‐cell variability of cardiac electrical activity

    Computational modelling and simulation provide a high throughput approach to reconcile multiple datasets describing physiological variability, and also identify vulnerable parameter regimes

    We have developed a whole‐cell model of iPSC‐CMs, composed of single exponential voltage‐dependent gating variable rate constants, parameterized to fit experimental iPSC‐CM outputs

    We have utilized experimental data across multiple laboratories to model experimental variability and investigate subcellular phenotypic mechanisms in iPSC‐CMs

    This framework links molecular mechanisms to cellular‐level outputs by revealing unique subsets of model parameters linked to known iPSC‐CM phenotypes

    Abstract

There is a profound need to develop a strategy for predicting patient‐to‐patient vulnerability in the emergence of cardiac arrhythmia. A promising in vitro method to address patient‐specific proclivity to cardiac disease utilizes induced pluripotent stem cell‐derived cardiomyocytes (iPSC‐CMs). A major strength of this approach is that iPSC‐CMs contain donor genetic information and therefore capture patient‐specific genotype–phenotype relationships. A cited detriment of iPSC‐CMs is the cell‐to‐cell variability observed in electrical activity. We postulated, however, that cell‐to‐cell variability may constitute a strength when appropriately utilized in a computational framework to build cell populations that can be employed to identify phenotypic mechanisms and pinpoint key sensitive parameters. Thus, we have exploited variation in experimental data across multiple laboratories to develop a computational framework for investigating subcellular phenotypic mechanisms. We have developed a whole‐cell model of iPSC‐CMs composed of simple model components comprising ion channel models with single exponential voltage‐dependent gating variable rate constants, parameterized to fit experimental iPSC‐CM data for all major ionic currents. By optimizing ionic current model parameters to multiple experimental datasets, we incorporate experimentally‐observed variability in the ionic currents. The resulting population of cellular models predicts robust inter‐subject variability in iPSC‐CMs. This approach links molecular mechanisms to known cellular‐level iPSC‐CM phenotypes, as shown by comparing immature and mature subpopulations of models to analyse the contributing factors underlying each phenotype. In the future, the presented models can be readily expanded to include genetic mutations and pharmacological interventions for studying the mechanisms of rare events, such as arrhythmia triggers.
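As a minimal illustration of the model components described above, the sketch below integrates a single Hodgkin-Huxley-style gating variable whose rate constants are single exponentials of voltage; the parameter values and voltage protocol are placeholders, not the fitted iPSC-CM values:

```python
import numpy as np

def gating_rates(V, a_alpha, b_alpha, a_beta, b_beta):
    """Single-exponential voltage-dependent rate constants: alpha(V) = a*exp(b*V), beta(V) = c*exp(d*V)."""
    return a_alpha * np.exp(b_alpha * V), a_beta * np.exp(b_beta * V)

def step_gate(x, V, dt, params):
    """Forward-Euler update of a gating variable: dx/dt = alpha*(1 - x) - beta*x."""
    alpha, beta = gating_rates(V, *params)
    return x + dt * (alpha * (1.0 - x) - beta * x)

# Illustrative placeholder parameters and protocol, not the fitted iPSC-CM values from the paper.
params = (0.15, 0.04, 0.05, -0.03)   # (a_alpha, b_alpha, a_beta, b_beta), units 1/ms and 1/mV
x, dt = 0.0, 0.01                    # initial gate value and time step (ms)
for V in (-80.0, -20.0):             # hold at two membrane potentials (mV)
    for _ in range(2000):
        x = step_gate(x, V, dt, params)
```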

     
  4. Abstract

Parameters in climate models are usually calibrated manually, exploiting only small subsets of the available data. This precludes both optimal calibration and quantification of uncertainties. Traditional Bayesian calibration methods that allow uncertainty quantification are too expensive for climate models; they are also not robust in the presence of internal climate variability. For example, Markov chain Monte Carlo (MCMC) methods typically require a very large number of model runs and are sensitive to internal variability noise, rendering them infeasible for climate models. Here we demonstrate an approach to model calibration and uncertainty quantification that requires relatively few model runs and can accommodate internal climate variability. The approach consists of three stages: (a) a calibration stage uses variants of ensemble Kalman inversion to calibrate a model by minimizing mismatches between model and data statistics; (b) an emulation stage emulates the parameter‐to‐data map with Gaussian processes (GP), using the model runs in the calibration stage for training; (c) a sampling stage approximates the Bayesian posterior distributions by sampling the GP emulator with MCMC. We demonstrate the feasibility and computational efficiency of this calibrate‐emulate‐sample (CES) approach in a perfect‐model setting. Using an idealized general circulation model, we estimate parameters in a simple convection scheme from synthetic data generated with the model. The CES approach generates probability distributions of the parameters that are good approximations of the Bayesian posteriors, at a fraction of the computational cost usually required to obtain them. Sampling from this approximate posterior allows the generation of climate predictions with quantified parametric uncertainties.
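A minimal sketch of stage (a), ensemble Kalman inversion on a toy forward model, is shown below; the forward map, data, and ensemble settings are illustrative assumptions, and stages (b) and (c) are only indicated in comments:

```python
import numpy as np

def eki_update(theta, G, y, Gamma):
    """One ensemble Kalman inversion step: nudge the parameter ensemble toward the data.

    theta: (J, p) parameter ensemble; G: (J, d) forward-model outputs;
    y: (J, d) perturbed observations; Gamma: (d, d) observation-noise covariance."""
    J = theta.shape[0]
    t_mean, g_mean = theta.mean(axis=0), G.mean(axis=0)
    C_tg = (theta - t_mean).T @ (G - g_mean) / J          # (p, d) cross-covariance
    C_gg = (G - g_mean).T @ (G - g_mean) / J              # (d, d) output covariance
    K = C_tg @ np.linalg.inv(C_gg + Gamma)                # Kalman-style gain, (p, d)
    return theta + (y - G) @ K.T

# Toy forward model standing in for the map from convection parameters to climate statistics.
def forward(theta):
    return np.stack([theta[:, 0] + theta[:, 1], theta[:, 0] * theta[:, 1]], axis=1)

rng = np.random.default_rng(2)
y_obs = np.array([1.5, 0.5])                              # synthetic "data statistics"
Gamma = 0.01 * np.eye(2)
theta = rng.normal(0.0, 1.0, size=(100, 2))               # initial parameter ensemble

for _ in range(10):                                       # stage (a): calibration with EKI
    G = forward(theta)
    y_pert = y_obs + rng.multivariate_normal(np.zeros(2), Gamma, size=100)
    theta = eki_update(theta, G, y_pert, Gamma)

# Stages (b) and (c) would train a GP emulator on the (theta, G) pairs collected above
# and run MCMC on the emulator to approximate the Bayesian posterior.
```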

     
The stochastic modeling and calibration of an anisotropic elasto-plastic model for additively manufactured materials are addressed in this work. We specifically focus on 316L stainless steel produced by directed energy deposition. Tensile specimens machined from two additively manufactured (AM) box structures were used to characterize material anisotropy and random spatial variations in the elasticity and plasticity material parameters. Specimens were cut parallel (horizontal) and perpendicular (vertical) to the AM deposition plane and were indexed by location. The results show substantial variability in both regimes, with fluctuation levels that differ between specimens loaded along the parallel and perpendicular build directions. Stochastic representations of the random fields for the stiffness and the Hill's criterion coefficients are presented next. Information-theoretic models are derived within the class of translation random fields, with the aim of promoting identifiability with limited data. The approach allows the constitutive models to be generated on arbitrary geometries, using the so-called stochastic partial differential equation (SPDE) approach to sampling. These representations are then partially calibrated using the aforementioned experimental results, enabling subsequent propagation analyses. Sampling is finally exemplified on the considered structure.
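As a minimal illustration of the translation-random-field construction mentioned above, the sketch below maps a correlated Gaussian field through a gamma marginal on a 1-D domain; the covariance, marginal, and parameter values are assumptions, and a dense Cholesky factorization stands in for the SPDE sampler used for arbitrary geometries:

```python
import numpy as np
from scipy.stats import norm, gamma

# Minimal 1-D translation-random-field sketch: a correlated Gaussian field is mapped through a
# target marginal distribution (here gamma), the basic construction behind such representations.
rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 200)                          # positions along a specimen (normalized)
ell = 0.1                                               # assumed correlation length
C = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)      # exponential covariance kernel
L = np.linalg.cholesky(C + 1e-10 * np.eye(x.size))
g = L @ rng.standard_normal(x.size)                     # correlated standard Gaussian field
u = norm.cdf(g)                                         # map to uniform marginals
stiffness = gamma(a=20.0, scale=10.0).ppf(u)            # translated field, e.g. a stiffness coefficient (GPa)
```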