
Title: Sensitivity-based research prioritization through stochastic characterization modeling
Purpose
Product developers using life cycle toxicity characterization models to understand the potential impacts of chemical emissions face serious challenges related to large data demands and high input data uncertainty. This motivates greater focus on model sensitivity to input parameter variability as a guide for data refinement and design of experiments for existing and emerging chemicals alike. This study presents a sensitivity-based approach for estimating toxicity characterization factors (CFs) under high input data uncertainty and for using the results to prioritize data collection according to parameter influence on the CFs. Proof of concept is illustrated with the UNEP-SETAC scientific consensus model USEtox.

Methods
Using Monte Carlo analysis, we demonstrate a sensitivity-based approach to prioritizing data collection with an illustrative example: aquatic ecotoxicity CFs for the vitamin B3 derivative niacinamide, an antioxidant used in personal care products. We calculate CFs over 10,000 iterations, assuming plus-or-minus one order of magnitude variability in fate- and exposure-relevant data inputs, while uncertainty in effect factor data is modeled with a central t distribution. Spearman's rank correlation coefficients are computed for all variable inputs to identify the parameters with the largest influence on the CFs.

Results and discussion
For emissions to freshwater, the niacinamide CF is approximately log-normally distributed, with a geometric mean of 0.02 PAF m3 day/kg and a geometric standard deviation of 8.5. The Spearman rank correlations show that degradation rates in air, water, and soil are the most influential parameters in calculating CFs and would therefore benefit the most from future data refinement and experimental research. Kow, the sediment degradation rate, and vapor pressure were the least influential parameters on CF results. These results may be very different for other, e.g., more lipophilic, chemicals, for which Kow is known to drive many fate and exposure aspects of multimedia modeling. Furthermore, non-linearity between input parameters and CF results prevents transferring sensitivity conclusions from one chemical to another.

Conclusions
A sensitivity-based approach to data refinement and research prioritization can guide database managers, life cycle assessment practitioners, and experimentalists to concentrate efforts on the few parameters that most influence toxicity characterization model results. Researchers can conserve resources and address parameter uncertainty by applying this approach when developing new CFs, or refining existing ones, for the inventory items that contribute most to toxicity impacts.
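The screening workflow described above can be sketched in a few lines: sample each fate parameter log-uniformly over plus-or-minus one order of magnitude, push the samples through the characterization model, and rank inputs by the magnitude of their Spearman correlation with the resulting CFs. The toy model, parameter names, and nominal values below are illustrative stand-ins, not USEtox.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical inputs, each varied +/- one order of magnitude around a
# nominal value (log-uniform); nominal rates in 1/day are stand-ins.
params = {
    "kdeg_air":   0.1  * 10 ** rng.uniform(-1, 1, n),
    "kdeg_water": 0.05 * 10 ** rng.uniform(-1, 1, n),
    "kdeg_soil":  0.02 * 10 ** rng.uniform(-1, 1, n),
    "log_kow":    rng.uniform(-1.5, 0.5, n),  # no effect in this toy model
}

# Stand-in characterization model: CF falls as overall degradation speeds up.
cf = 1.0 / (params["kdeg_air"] + params["kdeg_water"] + params["kdeg_soil"])

# Rank inputs by the magnitude of their Spearman correlation with the CF.
rank = sorted(
    ((name, abs(spearmanr(vals, cf)[0])) for name, vals in params.items()),
    key=lambda t: -t[1],
)
for name, rho in rank:
    print(f"{name}: |rho| = {rho:.2f}")
```

In this sketch the three degradation rates come out on top and log Kow falls to the bottom, mirroring how the rank correlations single out the parameters worth refining first.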
Authors:
Award ID(s):
1140190
Publication Date:
NSF-PAR ID:
10047480
Journal Name:
The International Journal of Life Cycle Assessment
ISSN:
0948-3349
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Computational methods are increasingly being incorporated into the exploitation of microstructure–property relationships for microstructure-sensitive design of materials. In the present work, we propose non-intrusive materials informatics methods for the high-throughput exploration and analysis of a synthetic microstructure space using a machine learning-reinforced multi-phase-field modeling scheme. We specifically study the interface energy space, one of the most uncertain inputs in phase-field modeling, and its impact on the shape and contact angle of a growing phase during heterogeneous solidification of a secondary phase between solid and liquid phases. We evaluate and discuss methods for studying the sensitivity and propagation of uncertainty in these input parameters as reflected in the shape of the Cu6Sn5 intermetallic during growth over the Cu substrate inside the liquid Sn solder under uncertain interface energies. The sensitivity results rank σSI, σSL, and σIL, in that order, as the most influential parameters on the shape of the intermetallic. Furthermore, we use a variational autoencoder, a deep generative neural network method, and label spreading, a semi-supervised machine learning method, to establish correlations between the inputs and outputs of the computational model. We clustered the microstructures into three categories (“wetting”, “dewetting”, and “invariant”) using the label spreading method and compared the clustering with the trend given by the Young-Laplace equation. In addition, a structure map in the interface energy space is developed showing that σSI and σSL alter the shape of the intermetallic synchronously: an increase in the latter and a decrease in the former changes the shape from dewetting structures to wetting structures. The study shows that the machine learning-reinforced phase-field method is a convenient approach for analyzing the microstructure design space within the ICME framework.

  2. In this work, generalized polynomial chaos (gPC) expansion for land surface model parameter estimation is evaluated. We perform inverse modeling and compute the posterior distributions of the critical hydrological parameters that are subject to great uncertainty in the Community Land Model (CLM) for a given value of the output latent heat flux (LH). The unknown parameters include those that have been identified as the most influential factors in the simulations of surface and subsurface runoff, latent and sensible heat fluxes, and soil moisture in CLM4.0. We set up the inversion problem in a Bayesian framework in two steps: (i) building a surrogate model expressing the input–output mapping, and (ii) performing inverse modeling and computing the posterior distributions of the input parameters using observation data. The surrogate model is developed with a Bayesian procedure based on variable selection methods that use gPC expansions. Our approach accounts for basis selection uncertainty and quantifies the importance of the gPC terms, and hence of all the input parameters, via the associated posterior probabilities.
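As a minimal illustration of the surrogate-building step, a one-dimensional gPC surrogate can be fit by regressing model outputs on probabilists' Hermite polynomials, which are orthogonal under a standard normal input measure. The toy model below is hypothetical; the abstract's CLM setup and Bayesian variable selection procedure are far richer than this least-squares sketch.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

# Hypothetical black-box "model" of one standard-normal input xi.
def model(xi):
    return np.exp(0.3 * xi) + 0.1 * xi**2

rng = np.random.default_rng(1)
xi = rng.standard_normal(2000)   # training samples of the input
y = model(xi)

# Fit a degree-4 gPC surrogate: regress y on probabilists' Hermite
# polynomials He_0..He_4, orthogonal under the standard normal measure.
Psi = hermevander(xi, 4)                      # design matrix, shape (2000, 5)
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# Cheap surrogate predictions at new input values.
xi_new = np.linspace(-2.0, 2.0, 5)
y_hat = hermevander(xi_new, 4) @ coef
```

Once fit, the surrogate replaces the expensive model inside the Bayesian inversion loop, which is what makes posterior sampling affordable.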
  3. In this paper, we investigate hyperelastic and viscoelastic model parameters using global sensitivity analysis (GSA). These models are used to characterize the physical response of many soft elastomers, which appear in a wide variety of smart material applications. Recent research has shown the effectiveness of fractional-order calculus operators in modeling the viscoelastic response. The GSA is performed using parameter subset selection (PSS), which quantifies the relative parameter contributions to the linear and nonlinear, fractional-order viscoelastic models. Calibration has been performed to quantify the model parameter uncertainty; however, this analysis has raised questions about parameter sensitivity and whether the parameters can be uniquely identified from the available data. By performing GSA, we can determine which parameters are most influential in the model and fix the non-influential parameters at nominal values. Model calibration can then be performed to quantify the uncertainty of the influential parameters.
  4. Abstract

    Surface meteorological analyses are an essential input (termed “forcing”) for hydrologic modeling. This study investigated the sensitivity of different hydrologic model configurations to temporal variations in seven forcing variables (precipitation rate, air temperature, longwave radiation, specific humidity, shortwave radiation, wind speed, and air pressure). Specifically, the effects of temporally aggregating hourly forcings to daily average forcings were examined. The analysis was based on 14 hydrological outputs from the Structure for Unifying Multiple Modeling Alternatives (SUMMA) model for the 671 Catchment Attributes and Meteorology for Large-Sample Studies (CAMELS) basins across the contiguous United States (CONUS). Results demonstrated that the hydrologic model's sensitivity to temporally aggregating the forcing inputs varies across model output variables and model locations. We used Latin hypercube sampling to sample model parameters from eight combinations of three influential model physics choices (three model decisions with two options each, i.e., eight model configurations). Results showed that the choice of model physics can change the relative influence of forcing on model outputs and that the forcing importance may not depend on the parameter space. This allows model output sensitivity to forcing aggregation to be tested prior to parameter calibration. More generally, this work provides a comprehensive analysis of the dependence of modeled outcomes on input forcing behavior, providing insight into the regional variability of forcing variable dominance on modeled outputs across CONUS.

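Latin hypercube sampling, used above to sample the parameter space, divides each parameter's range into n equal-probability strata and draws exactly one point per stratum per dimension, with the stratum orderings permuted independently across dimensions. A minimal sketch (the parameter bounds are hypothetical, not taken from the study):

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=None):
    """Draw an LHS design: one point per equal-width stratum per dimension,
    with stratum orderings permuted independently across dimensions."""
    rng = np.random.default_rng(seed)
    d = len(bounds)
    strata = np.tile(np.arange(n_samples), (d, 1))      # (d, n) stratum ids
    strata = rng.permuted(strata, axis=1).T             # independent shuffles
    u = (strata + rng.uniform(size=(n_samples, d))) / n_samples  # in [0, 1)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    return lo + u * (hi - lo)

# Hypothetical parameter bounds for three model parameters.
bounds = [(0.0, 1.0), (260.0, 310.0), (0.1, 10.0)]
X = latin_hypercube(8, bounds, seed=42)                 # 8 samples, 3 dims
```

Compared with plain random sampling, every marginal stratum is covered exactly once, which is why LHS gives better space-filling coverage for the same number of model runs.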
  5. The focus of this paper is on the global sensitivity analysis (GSA) of linear systems with time-invariant model parameter uncertainties and driven by stochastic inputs. The Sobol' indices of the evolving mean and variance estimates of states are used to assess the impact of the time-invariant uncertain model parameters and the statistics of the stochastic input on the uncertainty of the output. Numerical results on two benchmark problems help illustrate that it is conceivable that parameters, which are not so significant in contributing to the uncertainty of the mean, can be extremely significant in contributing to the uncertainty of the variances. The paper uses a polynomial chaos (PC) approach to synthesize a surrogate probabilistic model of the stochastic system after using Lagrange interpolation polynomials (LIPs) as PC bases. The Sobol' indices are then directly evaluated from the PC coefficients. Although this concept is not new, a novel interpretation of stochastic collocation-based PC and intrusive PC is presented where they are shown to represent identical probabilistic models when the system under consideration is linear. This result now permits treating linear models as black boxes to develop intrusive PC surrogates.
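Evaluating Sobol' indices directly from PC coefficients, as this last abstract describes, follows from orthogonality: when the basis is orthonormal in the input measure, the total variance is the sum of squared coefficients of all non-constant terms, and a first-order index collects the terms depending on one input alone. A small worked example with assumed coefficients for a two-input tensor basis:

```python
import numpy as np

# Assumed gPC coefficients c_{ij} of a two-input expansion in an
# orthonormal tensor basis, so each non-constant term contributes
# c^2 to the output variance. The values are purely illustrative.
multi_indices = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1)]
coeffs = np.array([1.0, 0.8, 0.3, 0.2, 0.1])

terms = list(zip(multi_indices, coeffs))
total_var = sum(c**2 for (i, j), c in terms if (i, j) != (0, 0))

# First-order Sobol' indices: variance carried by terms in one input alone.
S1 = sum(c**2 for (i, j), c in terms if i > 0 and j == 0) / total_var
S2 = sum(c**2 for (i, j), c in terms if j > 0 and i == 0) / total_var

# The interaction term (1, 1) belongs to neither first-order index,
# so S1 + S2 < 1 in this example.
print(S1, S2)
```

No extra model evaluations are needed once the PC surrogate exists, which is the practical appeal of computing Sobol' indices this way.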