With recent advances in online sensing technology and high-performance computing, structural health monitoring (SHM) has begun to emerge as an automated approach to the real-time condition monitoring of civil infrastructure. Ideal SHM strategies detect and characterize damage by leveraging measured response data to update physics-based finite element models (FEMs). When monitoring composite structures such as reinforced concrete (RC) bridges, the reliability of FEM-based SHM is adversely affected by material, boundary, geometric, and other model uncertainties. Civil engineering researchers have adapted popular artificial intelligence (AI) techniques to overcome these limitations, as AI can tackle complex and ill-defined problems by leveraging machine learning to rapidly analyze experimental data. In this vein, this study employs a novel Bayesian estimation technique to update a coupled vehicle-bridge FEM for the purposes of SHM. Unlike existing AI-based techniques, the proposed approach makes intelligent use of an embedded FEM, thus reducing the parameter space while simultaneously guiding the Bayesian model via physics-based principles. To validate the method, bridge response data are generated from the vehicle-bridge FEM given a set of “true” parameters, and the bias and standard deviation of the parameter estimates are analyzed. Additionally, the mean parameter estimates are used to solve the FEM, and the results are compared against those obtained with the “true” parameter values. A sensitivity study is also conducted to demonstrate how to properly formulate model spaces to improve the Bayesian estimation routine. The study concludes with a discussion highlighting factors that need to be considered when leveraging experimental data to update FEMs of concrete structures using AI techniques.
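The validation idea above (generate synthetic response data from a set of "true" parameters, then recover them by Bayesian estimation and examine bias and spread) can be sketched with a deliberately tiny stand-in for the FEM. The one-parameter stiffness model and all numbers below are hypothetical, not the study's vehicle-bridge model:

```python
import math
import random

# Toy illustration of the validation setup: a one-parameter "FEM" maps a
# stiffness k to a static displacement u = F / k, synthetic "measured" data
# are generated at a true stiffness, and a random-walk Metropolis sampler
# recovers the posterior over k. All names and numbers are hypothetical.
random.seed(0)

F = 10.0          # applied load
K_TRUE = 200.0    # "true" stiffness used to generate the data
SIGMA = 0.001     # measurement noise standard deviation

def fem_response(k):
    # stand-in for an embedded physics-based FEM solve
    return F / k

data = [fem_response(K_TRUE) + random.gauss(0.0, SIGMA) for _ in range(50)]

def log_likelihood(k):
    return -sum((d - fem_response(k)) ** 2 for d in data) / (2 * SIGMA ** 2)

# Random-walk Metropolis sampling with a flat prior on [50, 500].
k = 150.0
ll = log_likelihood(k)
samples = []
for _ in range(5000):
    prop = k + random.gauss(0.0, 5.0)
    if 50.0 < prop < 500.0:
        ll_prop = log_likelihood(prop)
        if math.log(random.random()) < ll_prop - ll:
            k, ll = prop, ll_prop
    samples.append(k)

burned = samples[1000:]            # discard burn-in
k_mean = sum(burned) / len(burned)
k_std = (sum((s - k_mean) ** 2 for s in burned) / len(burned)) ** 0.5
```

The posterior mean `k_mean` and spread `k_std` play the roles of the abstract's bias and standard-deviation checks; the embedded `fem_response` call is what keeps the estimation physics-guided rather than purely data-driven.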
Fast etch recipe creation with automated model-based process optimization
A method for automated creation and optimization of multistep etch recipes is presented. Here we demonstrate how an automated model-based process optimization approach can cut the cost and time of recipe creation by 75% or more compared with traditional experimental design approaches. Underlying the success of the method are reduced-order physics-based models for simulating the process and performing subsequent analysis of the multidimensional parameter space. SandBox Studio™ AI is used to automate the model selection, model calibration, and subsequent process optimization. The process engineer is only required to provide the incoming stack and experimental measurements for model calibration and updates. The method is applied to the optimization of a channel etch for 3D NAND devices. A reduced-order model that captures the physics and chemistry of the multistep reaction is automatically selected and calibrated. A mirror AI model is simultaneously and automatically created to enable nearly instantaneous predictions across the large process space. The AI model is much faster to evaluate and is used to make a Quilt™, a 2D projection of etch performance in the multidimensional process parameter space. A Quilt™ process map is then used to automatically determine the optimal process window to achieve the target CDs.
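The workflow described above (calibrate a fast surrogate against a few expensive reduced-order simulator runs, then sweep the process space for a target CD) can be sketched generically. This is not the SandBox Studio™ API; the process model, parameter names, and target value below are all illustrative:

```python
import numpy as np

# Generic sketch of model-based recipe optimization: fit a cheap surrogate
# to a handful of "expensive" reduced-order simulator runs, then scan a
# dense 2D grid of process settings (a crude stand-in for a Quilt-style
# process map) for the setting that hits a target CD. The process model,
# parameter names, and target value are all hypothetical.

def simulator(pressure, power):
    # stand-in for a reduced-order physics model of the etch step
    return 40.0 + 2.0 * pressure - 1.5 * power + 0.1 * pressure * power

# Calibration: least-squares fit of a bilinear surrogate to 9 runs.
train = [(p, w) for p in (1.0, 3.0, 5.0) for w in (2.0, 4.0, 6.0)]
X = np.array([[1.0, p, w, p * w] for p, w in train])
y = np.array([simulator(p, w) for p, w in train])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def surrogate(p, w):
    # near-instantaneous to evaluate, so a dense scan is cheap
    return coef @ np.array([1.0, p, w, p * w])

# Optimization: sweep the process window for the target CD.
TARGET_CD = 45.0
grid = [(p, w) for p in np.linspace(1, 5, 41) for w in np.linspace(2, 6, 41)]
best_p, best_w = min(grid, key=lambda s: abs(surrogate(*s) - TARGET_CD))
best_cd = float(simulator(best_p, best_w))
```

The split mirrors the abstract's division of labor: the expensive simulator is called only during calibration, while the mirror surrogate absorbs the dense sweep over the process parameter space.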
- Award ID(s):
- 1951245
- PAR ID:
- 10224478
- Editor(s):
- Bannister, Julie; Mohanty, Nihar
- Date Published:
- Journal Name:
- Proc. SPIE 11615, Advanced Etch Technology and Process Integration for Nanopatterning X
- Volume:
- 11615
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
Yamashita, Y.; Kano, M. (Ed.) In this work, surrogate-assisted optimization is utilized to calibrate predictive molecular models, called force fields, used in molecular simulations to reproduce the liquid density of a hydrofluorocarbon refrigerant molecule. A previous calibration workflow, which relied on Gaussian process regression models and large Latin hypercube samples to screen force field parameter space, is extended with Bayesian optimization methods to efficiently guide the search for force field parameters. In comparison to the previous work, the Bayesian calibration workflow finds a parameter set with a lower objective function value than the original workflow after evaluating approximately 50% fewer parameter sets. It is envisioned that this updated workflow will facilitate rapid force field optimization, enabling screening of vast molecular design spaces.
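The Bayesian-optimization-guided search described in this entry can be illustrated with a minimal 1D loop. This is not the paper's actual workflow: the objective, kernel length scale, and all numbers are hypothetical stand-ins for real molecular simulations of a force field parameter:

```python
import math
import numpy as np

# Minimal 1D Bayesian-optimization loop: a Gaussian-process surrogate plus
# an expected-improvement acquisition guides the search for a force field
# parameter sigma that minimizes the error between a "simulated" and an
# "experimental" liquid density. Everything here is an illustrative toy.

def objective(sigma):
    # stand-in for |simulated density - experimental density| as a
    # function of one force field parameter
    return (sigma - 3.4) ** 2 + 0.01 * np.sin(8.0 * sigma)

def kernel(a, b, ls=0.5):
    # squared-exponential covariance
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

X = np.array([2.0, 3.0, 4.5])          # initial design points
Y = objective(X)
cand = np.linspace(2.0, 5.0, 301)      # candidate parameter values
norm_cdf = np.vectorize(lambda v: 0.5 * (1.0 + math.erf(v / math.sqrt(2.0))))

for _ in range(10):
    K = kernel(X, X) + 1e-6 * np.eye(len(X))   # jitter for stability
    Kinv = np.linalg.inv(K)
    k_star = kernel(cand, X)
    mu = k_star @ Kinv @ Y                     # GP posterior mean
    var = 1.0 - np.einsum('ij,jk,ik->i', k_star, Kinv, k_star)
    sd = np.sqrt(np.clip(var, 1e-12, None))
    best = Y.min()
    z = (best - mu) / sd
    # expected improvement for minimization
    ei = (best - mu) * norm_cdf(z) + sd * np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    x_next = cand[np.argmax(ei)]               # propose the next "simulation"
    X = np.append(X, x_next)
    Y = np.append(Y, objective(x_next))

sigma_best = float(X[Y.argmin()])
```

The acquisition trades off exploitation (low posterior mean) against exploration (high posterior variance), which is what lets the search converge after evaluating far fewer parameter sets than a space-filling design.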
Climate models are generally calibrated manually by comparing selected climate statistics, such as the global top‐of‐atmosphere energy balance, to observations. The manual tuning only targets a limited subset of observational data and parameters. Bayesian calibration can estimate climate model parameters and their uncertainty using a larger fraction of the available data and automatically exploring the parameter space more broadly. In Bayesian learning, it is natural to exploit the seasonal cycle, which has large amplitude compared with anthropogenic climate change in many climate statistics. In this study, we develop methods for the calibration and uncertainty quantification (UQ) of model parameters exploiting the seasonal cycle, and we demonstrate a proof‐of‐concept with an idealized general circulation model (GCM). UQ is performed using the calibrate‐emulate‐sample approach, which combines stochastic optimization and machine learning emulation to speed up Bayesian learning. The methods are demonstrated in a perfect‐model setting through the calibration and UQ of a convective parameterization in an idealized GCM with a seasonal cycle. Calibration and UQ based on seasonally averaged climate statistics, compared to annually averaged, reduces the calibration error by up to an order of magnitude and narrows the spread of the non‐Gaussian posterior distributions by factors between two and five, depending on the variables used for UQ. The reduction in the spread of the parameter posterior distribution leads to a reduction in the uncertainty of climate model predictions.
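The calibrate-emulate-sample idea in this entry can be condensed into a toy loop. This is a proof-of-concept sketch, not the study's GCM setup: the one-parameter seasonal-cycle "model", the quadratic emulator, and all numbers are invented:

```python
import numpy as np

# Toy calibrate-emulate-sample loop: the "climate model" is a one-parameter
# seasonal cycle, the emulator is a cheap polynomial fit to a few expensive
# model runs, and the posterior over the parameter is sampled by Metropolis
# on the emulator rather than on the model itself. All numbers are invented.
rng = np.random.default_rng(1)

t = np.linspace(0.0, 1.0, 24)       # one year of "monthly" statistics

def gcm(theta):
    # stand-in for an expensive model run; theta scales the seasonal cycle
    return theta * np.sin(2.0 * np.pi * t)

THETA_TRUE, NOISE = 2.0, 0.1
obs = gcm(THETA_TRUE) + rng.normal(0.0, NOISE, t.size)

def misfit(theta):
    return float(np.sum((obs - gcm(theta)) ** 2))

# Calibrate + emulate: a handful of model runs, then a quadratic fit
# (exact here because the model is linear in theta).
design = np.linspace(1.0, 3.0, 7)
coeffs = np.polyfit(design, [misfit(th) for th in design], 2)

def log_post(theta):
    return -np.polyval(coeffs, theta) / (2.0 * NOISE ** 2)

# Sample: Metropolis on the cheap emulator, never on the "GCM".
theta = 1.5
lp = log_post(theta)
chain = []
for _ in range(4000):
    prop = theta + rng.normal(0.0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)

theta_mean = float(np.mean(chain[1000:]))
```

The expensive model is run only seven times to build the emulator; the thousands of sampler evaluations all hit the polynomial, which is the speed-up the calibrate-emulate-sample approach is built around.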
Current research practice for optimizing bioinks involves exhaustive experimentation with multi-material compositions to determine printability, shape fidelity, and biocompatibility. Predicting bioink properties would benefit the research community but is a challenging task due to the non-Newtonian behavior of complex compositions. Existing models such as the Cross model are inadequate for predicting the viscosity of heterogeneous bioink compositions. In this paper, we utilize a machine learning framework to accurately predict the viscosity of heterogeneous bioink compositions, aiming to enhance extrusion-based bioprinting techniques. Our strategy leverages Bayesian optimization (BO) to inform the model from a limited dataset, a technique especially useful given the typically sparse data in this domain. Moreover, we have developed a mask technique that can handle complex constraints, informed by domain expertise, to define the feasible parameter space for the components of the bioink and their interactions. Our proposed method focuses on predicting an intrinsic factor (e.g. viscosity) of the bioink precursor that is tied to extrinsic properties (e.g. cell viability) through the mask function. Through optimization of the hyperparameters, we strike a balance between exploration of new possibilities and exploitation of known data, a balance crucial for refining our acquisition function. This function then guides the selection of subsequent sampling points within the defined viable space, and the process continues until convergence, indicating that the model has sufficiently explored the parameter space and identified optimal or near-optimal solutions. Employing this AI-guided BO framework, we have developed, tested, and validated a surrogate model for determining the viscosity of heterogeneous bioink compositions. This data-driven approach significantly reduces the experimental workload required to identify bioink compositions conducive to functional tissue growth. It not only streamlines the process of finding optimal bioink compositions from a vast array of heterogeneous options but also offers a promising avenue for accelerating advancements in tissue engineering by minimizing the need for extensive experimental trials.
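The mask idea in this entry, a feasibility function that carves the viable region out of the full composition space before any candidate is scored, can be sketched with hypothetical components and numbers (not the paper's bioink system or surrogate):

```python
# Sketch of the "mask" idea: a feasibility mask encodes domain-expert
# constraints on a two-component bioink, and only masked-in candidate
# compositions are scored against a target viscosity. The surrogate,
# component names, bounds, and target are all hypothetical.

def viscosity(alginate, gelatin):
    # toy stand-in for a learned viscosity surrogate (Pa*s), with the two
    # inputs given as mass fractions
    return 0.5 + 12.0 * alginate ** 2 + 4.0 * gelatin

def feasible(alginate, gelatin):
    # domain-expert constraints: each component within printable bounds,
    # and total solids capped at 10%
    return (0.01 <= alginate <= 0.06
            and 0.02 <= gelatin <= 0.08
            and alginate + gelatin <= 0.10)

TARGET = 0.65   # target viscosity for good extrusion (hypothetical)

grid = [(a / 1000.0, g / 1000.0) for a in range(101) for g in range(101)]
masked = [c for c in grid if feasible(*c)]           # the mask in action
best = min(masked, key=lambda c: abs(viscosity(*c) - TARGET))
```

In the full framework the acquisition function would only ever propose points from the masked region, so infeasible compositions never cost an experiment.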
Simulation models of critical systems often have parameters that need to be calibrated using observed data. For expensive simulation models, calibration is done using an emulator of the simulation model built on simulation output at different parameter settings. Using intelligent and adaptive selection of parameters to build the emulator can drastically improve the efficiency of the calibration process. The article proposes a sequential framework with a novel criterion for parameter selection that targets learning the posterior density of the parameters. The emergent behavior from this criterion is that exploration happens by selecting parameters in uncertain posterior regions while simultaneously exploitation happens by selecting parameters in regions of high posterior density. The advantages of the proposed method are illustrated using several simulation experiments and a nuclear physics reaction model.
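The exploration/exploitation behavior described in this entry can be mimicked with a much cruder score than the article's criterion. The sketch below is a simplified stand-in, not the proposed method: the model, observation, and weights are hypothetical:

```python
import math

# Simplified stand-in for sequential, posterior-targeted design: the next
# simulator run is chosen where the estimated posterior density is high
# (exploitation) or where no nearby runs exist yet (exploration).
# Model, data, and weights are hypothetical.

OBS, SIGMA = 1.3, 0.2

def simulator(x):
    # stand-in for an expensive simulation model
    return math.sin(x) + 0.5 * x

def log_post(y):
    # Gaussian log-likelihood of one observed summary statistic
    return -((OBS - y) ** 2) / (2.0 * SIGMA ** 2)

cand = [round(i * 0.05, 2) for i in range(81)]    # parameter grid on [0, 4]
evaluated = {0.0: simulator(0.0), 4.0: simulator(4.0)}

for _ in range(12):
    def score(x):
        nearest = min(evaluated, key=lambda e: abs(e - x))
        explore = abs(x - nearest)                        # far from all runs
        exploit = math.exp(log_post(evaluated[nearest]))  # density proxy
        return exploit + 2.0 * explore
    x_next = max((x for x in cand if x not in evaluated), key=score)
    evaluated[x_next] = simulator(x_next)

best_x = max(evaluated, key=lambda x: log_post(evaluated[x]))
```

Early picks land in unexplored gaps; once a run produces output near the observation, later picks cluster around that high-posterior region, which is the qualitative behavior the article's criterion is designed to produce.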