Metal additive manufacturing provides remarkable flexibility in geometry and component design, but localized heating/cooling heterogeneity leads to spatial variations of as-built mechanical properties, significantly complicating the materials design process. To this end, we develop a mechanistic data-driven framework integrating wavelet transforms and convolutional neural networks to predict location-dependent mechanical properties over fabricated parts based on process-induced temperature sequences, i.e., thermal histories. The framework enables multiresolution analysis and importance analysis to reveal dominant mechanistic features underlying the additive manufacturing process, such as critical temperature ranges and fundamental thermal frequencies. We systematically compare the developed approach with other machine learning methods. The results demonstrate that the developed approach achieves reasonably good predictive capability using a small amount of noisy experimental data. It provides a concrete foundation for a revolutionary methodology that predicts spatial and temporal evolution of mechanical properties leveraging domain-specific knowledge and cutting-edge machine and deep learning technologies.
- Journal Name: npj Computational Materials
- Sponsoring Org: National Science Foundation
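The abstract above describes a pipeline that converts process-induced thermal histories into wavelet representations and feeds them to a convolutional neural network to predict location-dependent properties. The following Python sketch, built on PyWavelets and Keras with synthetic data, illustrates one plausible arrangement of such a pipeline; the Morlet wavelet, scale range, network layout, and the generic property target are illustrative assumptions, not the published configuration.

```python
# Minimal sketch: thermal history -> CWT scalogram -> CNN property regression.
# Synthetic data only; wavelet choice, scales, and layer sizes are assumptions.
import numpy as np
import pywt
from tensorflow import keras
from tensorflow.keras import layers

n_samples, n_points = 64, 256            # thermal histories (temperature vs. time step)
histories = np.random.rand(n_samples, n_points) * 1500 + 300
prop = np.random.rand(n_samples)         # stand-in normalized mechanical property

scales = np.arange(1, 65)                # multiresolution analysis: 64 wavelet scales
def to_scalogram(seq):
    coeffs, _ = pywt.cwt(seq, scales, "morl")   # continuous wavelet transform
    return np.abs(coeffs)                       # |coefficients|: scale-by-time map

X = np.stack([to_scalogram(h) for h in histories])[..., None]   # add channel dim
y = prop

model = keras.Sequential([
    layers.Input(shape=X.shape[1:]),
    layers.Conv2D(8, 3, activation="relu"),
    layers.MaxPooling2D(2),
    layers.Conv2D(16, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(1),                     # regress the location-dependent property
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=16, verbose=0)
```

Importance analysis over the scalogram axes (scales and time/temperature windows) is what lets such a model point back to dominant temperature ranges and thermal frequencies.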
More Like this
Purpose The purpose of this paper is to develop, apply and validate a mesh-free graph theory–based approach for rapid thermal modeling of the directed energy deposition (DED) additive manufacturing (AM) process. Design/methodology/approach In this study, the authors develop a novel mesh-free graph theory–based approach to predict the thermal history of the DED process. Subsequently, the authors validated the graph theory–predicted temperature trends against experimental temperature data for DED of titanium alloy (Ti-6Al-4V) parts. Temperature trends were tracked by embedding thermocouples in the substrate. The DED process was simulated using the graph theory approach, and the thermal history predictions were validated against the thermocouple data. Findings The temperature trends predicted by the graph theory approach have a mean absolute percentage error of approximately 11% and a root mean square error of 23°C when compared to the experimental data. Moreover, the graph theory simulation was obtained within 4 min using desktop computing resources, which is less than the build time of 25 min. By comparison, a finite element–based model required 136 min to converge to a similar level of error. Research limitations/implications This study uses data from fixed thermocouples when printing thin-wall DED parts. In the future, the authors will incorporate infrared thermal camera data from large parts. Practical implications The DED process is particularly valuable for near-net-shape manufacturing, repair and remanufacturing applications. However, DED parts are often afflicted with flaws, such as cracking and distortion. In DED, flaw formation is largely governed by the intensity and spatial distribution of heat in the part during the process, often referred to as the thermal history. Accordingly, fast and accurate thermal models to predict the thermal history are necessary to understand and preclude flaw formation. Originality/value This paper presents a new mesh-free computational thermal modeling approach based on graph theory (network science) and applies it to DED. The approach eschews the tedious and computationally demanding meshing aspect of finite element modeling and allows rapid simulation of the thermal history in additive manufacturing. Although graph theory has been applied to thermal modeling of laser powder bed fusion (LPBF), there are distinct phenomenological differences between DED and LPBF that necessitate substantial modifications to the graph theory approach.
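The mesh-free idea summarized above replaces finite element meshing with heat diffusion over a network of sampled points. The sketch below illustrates the core mechanism with a graph Laplacian and a matrix exponential on random nodes; the node sampling, Gaussian edge weighting, diffusivity constant, and the omission of laser heat input and boundary/convection terms are simplifying assumptions, not the published DED formulation.

```python
# Minimal sketch: mesh-free heat diffusion on a graph of sampled points.
# Node count, neighborhood radius "eps", and "alpha" are illustrative choices.
import numpy as np
from scipy.linalg import expm
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
nodes = rng.random((200, 3)) * [50.0, 5.0, 10.0]   # points sampled in a thin wall (mm)

eps = 3.0                                          # connect nodes closer than eps
d = cdist(nodes, nodes)
W = np.exp(-d**2 / eps**2) * (d < eps)             # Gaussian edge weights
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W                     # graph Laplacian

T0 = np.full(len(nodes), 300.0)                    # initial temperature (K)
T0[nodes[:, 2] > 9.0] = 1900.0                     # freshly deposited hot layer

alpha, t = 0.05, 2.0                               # diffusivity-like constant, time (s)
T_t = expm(-alpha * L * t) @ T0                    # temperature field after diffusion
print(T_t.min(), T_t.max())
```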
Despite its potential to overcome the design and processing barriers of traditional subtractive and formative manufacturing techniques, the use of laser powder bed fusion (LPBF) metal additive manufacturing is currently limited due to its tendency to create flaws. A multitude of LPBF-related flaws, such as part-level deformation, cracking, and porosity, are linked to the spatiotemporal temperature distribution in the part during the process. The temperature distribution, also called the thermal history, is a function of several factors encompassing material properties, part geometry and orientation, processing parameters, and placement of supports, among others. This broad range of factors is difficult and expensive to optimize through empirical testing alone. Consequently, fast and accurate models to predict the thermal history are valuable for mitigating flaw formation in LPBF-processed parts. In our prior works, we developed a graph theory–based approach for predicting the temperature distribution in LPBF parts. This mesh-free approach was compared with both non-proprietary and commercial finite element packages, and the thermal history predictions were experimentally validated with in-situ infrared thermal imaging data. It was found that the graph theory–derived thermal history predictions converged within 30–50% of the time of non-proprietary finite element analysis for a similar level of prediction error. However, these prior efforts were based on small prismatic and cylinder-shaped LPBF parts. In this paper, our objective was to scale the graph theory approach to predict the thermal history of large-volume, complex-geometry LPBF parts. To realize this objective, we developed and applied three computational strategies to predict the thermal history of a stainless steel (SAE 316L) impeller with an outside diameter of 155 mm and a vertical height of 35 mm (700 layers). The impeller was processed on a Renishaw AM250 LPBF system and required 16 h to complete. During the process, in-situ layer-by-layer steady-state surface temperature measurements of the impeller were obtained using a calibrated longwave infrared thermal camera. As an example of the outcome, on implementing one of the three strategies reported in this work, which did not reduce or simplify the part geometry, the thermal history of the impeller was predicted with an approximate mean absolute error of 6% (standard deviation 0.8%) and a root mean square error of 23 K (standard deviation 3.7 K). Moreover, the thermal history was simulated within 40 min using desktop computing, which is considerably less than the 16 h required to build the impeller part. Furthermore, the graph theory thermal history predictions were compared with a proprietary LPBF thermal modeling software package and a non-proprietary finite element simulation. For a similar level of root mean square error (28 K), the graph theory approach converged in 17 min, versus 4.5 h for non-proprietary finite element analysis.
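The error figures quoted in these two abstracts (mean absolute percentage error and root mean square error against measured temperatures) can be computed as in the short sketch below; the temperature arrays are placeholders, not the reported experimental data.

```python
# Minimal sketch: MAPE and RMSE between predicted and measured temperatures.
import numpy as np

T_measured = np.array([450.0, 520.0, 610.0, 700.0, 655.0])    # K, e.g., per layer
T_predicted = np.array([470.0, 505.0, 640.0, 690.0, 630.0])   # K, model output

mape = np.mean(np.abs(T_predicted - T_measured) / T_measured) * 100.0
rmse = np.sqrt(np.mean((T_predicted - T_measured) ** 2))
print(f"MAPE = {mape:.1f}%, RMSE = {rmse:.1f} K")
```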
Graphene aerogels (GAs), a special class of 3D graphene assemblies, are well known for their exceptional combination of high strength, light weight, and high porosity. However, due to microstructural randomness, the mechanical properties of GAs are also highly stochastic, an issue that has been observed but insufficiently addressed. In this work, we develop Gaussian process metamodels to not only predict important mechanical properties of GAs but also quantify their uncertainties. Using the molecular dynamics simulation technique, GAs are assembled from randomly distributed graphene flakes and spherical inclusions, and are subsequently subjected to a quasi-static uniaxial tensile load to deduce mechanical properties. Results show that, given the same density, mechanical properties such as the Young's modulus and the ultimate tensile strength can vary substantially. Treating density, Young's modulus, and ultimate tensile strength as functions of the inclusion size, and using the simulated GA results as training data, we build Gaussian process metamodels that can efficiently predict the properties of unseen GAs. In addition, statistically valid confidence intervals centered around the predictions are established. This metamodel approach is particularly beneficial when data acquisition requires expensive experiments or computation, which is the case for GA simulations. The present research quantifies the uncertain mechanical properties of GAs, which may shed light on the statistical analysis of a broad variety of novel nanomaterials.
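A Gaussian process metamodel with prediction intervals, as described above, can be assembled along the following lines; the inclusion sizes, property values, and kernel choice are illustrative stand-ins for the molecular dynamics training data.

```python
# Minimal sketch: Gaussian process metamodel mapping inclusion size to a
# property (e.g., Young's modulus) with 95% confidence intervals.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

inclusion_size = np.array([[2.0], [4.0], [6.0], [8.0], [10.0]])   # nm (illustrative)
youngs_modulus = np.array([1.8, 1.5, 1.1, 0.9, 0.7])              # GPa (illustrative)

kernel = RBF(length_scale=2.0) + WhiteKernel(noise_level=1e-2)    # noise term captures scatter
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(
    inclusion_size, youngs_modulus)

x_new = np.linspace(2.0, 10.0, 50).reshape(-1, 1)
mean, std = gp.predict(x_new, return_std=True)
lower, upper = mean - 1.96 * std, mean + 1.96 * std               # 95% interval
```

The predictive standard deviation is what turns a point prediction into the statistically valid confidence interval mentioned in the abstract.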
Structure–property maps play a key role in accelerated materials discovery. The current norm for developing these maps relies on computationally expensive physics-based simulations. Here, the capabilities of deep learning agents such as convolutional neural networks (CNNs) and multilayer perceptrons (MLPs) are explored to predict structure–property relations and reduce the dependence on simulations. This study considers simulated hexagonal boron nitride (h-BN) microstructures damaged by various levels of radiation and temperature, with the objective of predicting their residual strengths from the final atomic positions. By developing low-dimensional physical descriptors to statistically describe the defects, the results show that purpose-specific microstructure representation can help achieve good prediction accuracy at low computational cost. Furthermore, the adaptability of the trained deep learning agents is explored to predict structure–property maps of other 2D materials using transfer learning. It is shown that, in order to achieve good prediction accuracy (≈95% R²), an agent training for the first time ("learning from scratch") requires 23–45% of the simulated data, whereas an agent adapting to a different material ("transfer learning") requires only about 10% or less. This suggests that transfer learning is a potential game changer in materials discovery and characterization approaches.
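A transfer learning workflow of the kind described above, i.e., pretraining a network on one material's descriptors and then retraining only the output head on a small dataset for a second material, might look like the following sketch; the descriptor dimension, layer sizes, and random data are assumptions for illustration.

```python
# Minimal sketch: transfer learning for structure-property prediction.
# Random data stands in for the descriptor/strength datasets.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_desc = 8                                           # low-dimensional defect descriptors
X_src = np.random.rand(1000, n_desc)                 # source material (abundant data)
y_src = np.random.rand(1000)                         # residual strength (normalized)
X_tgt = np.random.rand(100, n_desc)                  # target material (~10% of the data)
y_tgt = np.random.rand(100)

model = keras.Sequential([
    layers.Input(shape=(n_desc,)),
    layers.Dense(64, activation="relu", name="feat1"),
    layers.Dense(64, activation="relu", name="feat2"),
    layers.Dense(1, name="head"),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_src, y_src, epochs=20, verbose=0)        # "learning from scratch"

for layer in model.layers[:-1]:                      # freeze the feature extractor
    layer.trainable = False
model.compile(optimizer="adam", loss="mse")          # recompile after freezing
model.fit(X_tgt, y_tgt, epochs=20, verbose=0)        # "transfer learning" fine-tune
```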