Title: Intelligent Feedrate Optimization Using an Uncertainty-Aware Digital Twin Within a Model Predictive Control Framework
The future of intelligent manufacturing machines involves autonomous selection of process parameters to maximize productivity while maintaining quality within specified constraints. To optimize process parameters effectively, these machines must adapt to the uncertainties present in the physical system. This paper proposes a novel framework and methodology for feedrate optimization based on a physics-informed, data-driven digital twin with quantified uncertainty. The servo dynamics are modeled by a digital twin that incorporates the known uncertainty of the physics-based models and predicts the distribution of contour error using a data-driven model, which learns the unknown uncertainty on the fly from sensor measurements. Using the quantified uncertainty, the proposed feedrate optimization maximizes productivity while maintaining quality under desired servo error constraints and stringency (i.e., the tolerance for constraint violation under uncertainty) within a model predictive control framework. Experimental results obtained with a 3-axis desktop CNC machine tool and a desktop 3D printer demonstrate significant cycle time reductions of up to 38% and 17%, respectively, compared to existing methods, while staying close to the specified error tolerances.
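The stringency constraint described above can be read as a chance constraint: choose the largest feedrate whose predicted contour error exceeds the tolerance with probability at most the stringency. A minimal sketch under a Gaussian error assumption; the error models `mu` and `sigma`, the candidate feedrates, and all numbers are hypothetical stand-ins, not the paper's digital twin:

```python
import math

def z_from_stringency(eps):
    # Inverse standard-normal CDF via bisection on math.erf.
    lo, hi = 0.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < 1.0 - eps:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def max_feasible_feedrate(feedrates, err_mean, err_std, tol, eps):
    """Largest feedrate with P(contour error > tol) <= eps,
    assuming the predicted error is Gaussian."""
    z = z_from_stringency(eps)
    best = None
    for f in sorted(feedrates):
        if err_mean(f) + z * err_std(f) <= tol:
            best = f
    return best

# Hypothetical error models: error grows linearly with feedrate.
mu = lambda f: 1e-3 * f      # predicted mean contour error [mm]
sigma = lambda f: 2e-4 * f   # predictive standard deviation [mm]
f_star = max_feasible_feedrate(range(1, 101), mu, sigma, tol=0.05, eps=0.05)
```

Raising the stringency `eps` admits higher feedrates at the cost of more frequent tolerance violations, mirroring the productivity/quality trade-off the framework optimizes.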
Award ID(s):
2054715
PAR ID:
10537850
Author(s) / Creator(s):
Publisher / Repository:
IEEE
Date Published:
Journal Name:
IEEE Access
Volume:
12
ISSN:
2169-3536
Page Range / eLocation ID:
49947 to 49961
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The presence of various uncertainty sources in the metal-based additive manufacturing (AM) process prevents producing AM products with consistently high quality. Using electron beam melting (EBM) of Ti-6Al-4V as an example, this paper presents a data-driven framework for process parameter optimization using physics-informed computer simulation models. The goal is to identify a robust manufacturing condition that consistently yields equiaxed material microstructures under uncertainty. To overcome the computational challenge of robust design optimization under uncertainty, a two-level data-driven surrogate model is constructed from the simulation data of a validated high-fidelity multiphysics AM simulation model. The robust design result, a combination of low preheating temperature, low beam power, and intermediate scanning speed, enables the repetitive production of equiaxed-structure products, as demonstrated by physics-based simulations. Global sensitivity analysis at the optimal design point indicates that, among the six noise factors studied, specific heat capacity and grain growth activation energy have the largest impact on microstructure variation. Through this exemplar process optimization, the study also demonstrates the promising potential of the presented approach for facilitating other complicated AM process optimizations, such as robust designs for porosity control or direct mechanical property control.
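The robust-design search above can be illustrated with a toy surrogate: score each candidate design by mean minus one standard deviation of the surrogate response under sampled noise factors, then keep the best. The surrogate `surrogate_gsr` below is entirely invented (its optimum is arranged to mirror the reported low-preheat, low-power, mid-speed result); it stands in for a model fit to multiphysics simulation data:

```python
import random, statistics

def surrogate_gsr(preheat, power, speed, noise):
    # Hypothetical surrogate for a grain-shape response: higher means
    # more equiaxed. Inputs are normalized to [0, 1].
    return 1.0 - abs(speed - 0.5) - 0.3 * power - 0.2 * preheat + noise

def robust_score(design, n_samples=500, seed=0):
    # Mean minus one standard deviation of the response under
    # noise-factor sampling: favors a high and stable response.
    rng = random.Random(seed)
    vals = [surrogate_gsr(*design, rng.gauss(0, 0.05)) for _ in range(n_samples)]
    return statistics.mean(vals) - statistics.stdev(vals)

# Coarse grid over (preheat, power, speed) design candidates.
designs = [(p, w, s) for p in (0.0, 0.5, 1.0)
                     for w in (0.0, 0.5, 1.0)
                     for s in (0.0, 0.5, 1.0)]
best = max(designs, key=robust_score)
```

In a real study the grid search would be replaced by a proper optimizer and the noise model by calibrated distributions over the six noise factors.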
  2. Digital twins are emerging as powerful tools for supporting innovation as well as optimizing the in-service performance of a broad range of complex physical machines, devices, and components. A digital twin is generally designed to provide accurate in-silico representation of the form (i.e., appearance) and the functional response of a specified (unique) physical twin. This paper offers a new perspective on how the emerging concept of digital twins could be applied to accelerate materials innovation efforts. Specifically, it is argued that the material itself can be considered as a highly complex multiscale physical system whose form (i.e., details of the material structure over a hierarchy of material length) and function (i.e., response to external stimuli typically characterized through suitably defined material properties) can be captured suitably in a digital twin. Accordingly, the digital twin can represent the evolution of structure, process, and performance of the material over time, with regard to both process history and in-service environment. This paper establishes the foundational concepts and frameworks needed to formulate and continuously update both the form and function of the digital twin of a selected material physical twin. The form of the proposed material digital twin can be captured effectively using the broadly applicable framework of n-point spatial correlations, while its function at the different length scales can be captured using homogenization and localization process-structure-property surrogate models calibrated to collections of available experimental and physics-based simulation data. 
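The n-point spatial correlation framework mentioned above starts from the 2-point statistics: the probability that two points separated by a vector r both fall in a given phase. A minimal 1-D periodic sketch on a toy two-phase microstructure (the array `micro` is illustrative only):

```python
def two_point_correlation(m, r):
    """Periodic 2-point autocorrelation S2(r) of a binary microstructure:
    the probability that two points separated by r both lie in phase 1."""
    n = len(m)
    return sum(m[i] * m[(i + r) % n] for i in range(n)) / n

micro = [1, 1, 0, 0, 1, 0, 0, 0]  # toy 1-D two-phase structure
s2 = [two_point_correlation(micro, r) for r in range(len(micro))]
# s2[0] recovers the phase-1 volume fraction (3/8 here).
```

In practice these statistics are computed in 2-D or 3-D via FFTs, and the resulting feature vectors feed the process-structure-property surrogate models.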
  3.
    Servo error pre-compensation (SEP) is commonly used to improve the accuracy of feed drives. Existing SEP approaches often involve the use of physics-based linear models (e.g., transfer functions) to predict servo errors, but suffer from inaccuracies due to unmodeled nonlinear dynamics in feed drives. This paper proposes a linear hybrid model for SEP that combines physics-based and data-driven linear models. The proposed model is shown to approximate nonlinearities unmodeled in physics-based linear models. In experiments on a precision feed drive, the proposed hybrid model improves the accuracy of servo error prediction by up to 38% compared to a physics-based model. 
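The hybrid idea, a physics-based prediction plus a data-driven linear correction fit to the residuals, can be sketched with synthetic dynamics. Here the "measured" servo error carries a one-sample-delay term the physics model misses, and a single FIR tap is fit by closed-form least squares; all models and numbers are stand-ins, not the paper's:

```python
def physics_model(u, a=0.8):
    # Known physics: servo error proportional to the commanded velocity.
    return [a * x for x in u]

def fit_residual_tap(u, e_meas, e_phys):
    # One-tap FIR correction on the previous command, fit by
    # ordinary least squares in closed form.
    num = sum((e_meas[k] - e_phys[k]) * u[k - 1] for k in range(1, len(u)))
    den = sum(u[k - 1] ** 2 for k in range(1, len(u)))
    return num / den

u = [0.0, 1.0, 2.0, 1.5, 0.5, 2.5, 1.0]   # commanded velocity profile
# "Measured" error with a delayed term the physics model does not capture:
e_meas = [0.8 * u[k] + (0.3 * u[k - 1] if k else 0.0) for k in range(len(u))]

e_phys = physics_model(u)
b = fit_residual_tap(u, e_meas, e_phys)                    # recovers ~0.3
e_hyb = [e_phys[k] + (b * u[k - 1] if k else 0.0) for k in range(len(u))]
```

On this noise-free example the hybrid model reproduces the measured error exactly; with noisy measurements the least-squares fit averages the noise out.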
  4. This work presents an integrated architecture for a prognostic digital twin for smart manufacturing subsystems. The specific case of cutting tool wear (flank wear) in a CNC machine is considered, using benchmark data sets provided by the Prognostics and Health Management (PHM) Society. This paper emphasizes the role of robust uncertainty quantification, especially in the presence of data-driven black- and gray-box dynamic models. A surrogate dynamic model is constructed to track the evolution of flank wear using a reduced set of features extracted from multi-modal sensor time series data. The digital twin's uncertainty quantification engine integrates with this dynamic model along with a machine emulator that is tasked with generating future operating scenarios for the machine. The surrogate dynamic model and emulator are combined in a closed-loop architecture with an adaptive Monte Carlo uncertainty forecasting framework that allows prediction of quantities of interest critical to prognostics within user-prescribed bounds. Numerical results using the PHM dataset are shown, illustrating how the adaptive uncertainty forecasting tools deliver a trustworthy forecast by maintaining predictive error within the prescribed tolerance.
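The closed-loop adaptive forecasting can be sketched as Monte Carlo rollouts of an uncertain wear model, doubling the ensemble size until the standard error of the forecast meets a user-prescribed tolerance. The wear dynamics and every number below are hypothetical, not the PHM surrogate:

```python
import random, statistics

def forecast_wear(w0, steps, n, seed=0):
    # Monte Carlo rollout of flank wear under an uncertain per-pass
    # wear increment (a stand-in for the surrogate dynamic model).
    rng = random.Random(seed)
    finals = []
    for _ in range(n):
        w = w0
        for _ in range(steps):
            w += max(0.0, rng.gauss(0.002, 0.0005))  # wear never decreases
        finals.append(w)
    return finals

def adaptive_forecast(w0, steps, tol, n=64, seed=0):
    # Double the ensemble size until the standard error of the mean
    # final wear falls within the prescribed tolerance.
    while True:
        finals = forecast_wear(w0, steps, n, seed)
        sem = statistics.stdev(finals) / n ** 0.5
        if sem <= tol:
            return statistics.mean(finals), sem, n
        n *= 2

mean_w, sem, n_used = adaptive_forecast(w0=0.05, steps=100, tol=1e-4)
```

The same adapt-until-converged pattern applies to any forecast quantity of interest; only the dynamic model and the accuracy criterion change.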
  5. Optimizing edge caching is crucial for the advancement of next-generation (nextG) wireless networks, ensuring high-speed and low-latency services for mobile users. Existing data-driven optimization approaches often lack awareness of the distribution of random data variables and focus solely on optimizing cache hit rates, neglecting potential reliability concerns, such as base station overload and unbalanced cache issues. This oversight can result in system crashes and degraded user experience. To bridge this gap, we introduce a novel digital twin-assisted optimization framework, called D-REC, which integrates reinforcement learning (RL) with diverse intervention modules to ensure reliable caching in nextG wireless networks. We first develop a joint vertical and horizontal twinning approach to efficiently create network digital twins, which are then employed by D-REC as RL optimizers and safeguards, providing ample datasets for training and predictive evaluation of our cache replacement policy. By incorporating reliability modules into a constrained Markov decision process, D-REC can adaptively adjust actions, rewards, and states to comply with advantageous constraints, minimizing the risk of network failures. Theoretical analysis demonstrates comparable convergence rates between D-REC and vanilla data-driven methods without compromising caching performance. Extensive experiments validate that D-REC outperforms conventional approaches in cache hit rate and load balancing while effectively enforcing its predetermined reliability constraints through the intervention modules.
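The intervention idea, letting the digital twin veto actions that would violate a reliability constraint, can be sketched as a safeguard wrapper around the RL policy's chosen action. Everything here (`predicted_load`, the capacity, the rebalancing fallback) is a hypothetical stand-in for D-REC's modules:

```python
def safeguard(action, predicted_load, capacity, fallback):
    # Override the agent's caching action when the twin predicts
    # base-station overload; otherwise pass it through unchanged.
    if predicted_load(action) > capacity:
        return fallback(action)
    return action

# Toy twin: load proportional to items cached at base station 0;
# the fallback rebalances those items onto station 1.
predicted_load = lambda a: 10 * sum(1 for s in a if s == 0)
fallback = lambda a: [1 if s == 0 else s for s in a]

safe = safeguard([0, 0, 0, 1], predicted_load, capacity=25, fallback=fallback)
```

In the constrained-MDP setting the same check would also shape rewards and states rather than only overriding actions.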