Chatter, a self-excited vibration phenomenon, is a critical challenge in high-speed machining operations, affecting tool life, product surface quality, and overall process efficiency. While machine learning models trained on simulated data have shown promise in detecting chatter, their real-world applicability remains uncertain due to discrepancies between simulated and actual machining environments. The primary goal of this study is to bridge that gap by developing and validating a Random Forest-based chatter detection system. The study applies a Random Forest classification model trained on more than 140,000 simulated machining datasets and incorporates Operational Modal Analysis (OMA), Receptance Coupling Substructure Analysis (RCSA), and Transfer Learning (TL) to adapt the model to real-world operational data. The model is validated against 1,600 real-world machining datasets, achieving an accuracy of 86.1% with strong precision and recall scores. The results demonstrate the model's robustness and potential for practical deployment in industrial settings, while also highlighting challenges such as sensor noise and variability in machining conditions. This work advances the use of predictive analytics in machining, offering a data-driven route to more reliable chatter detection and improved manufacturing efficiency.
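As a rough, hypothetical illustration of the pipeline this abstract describes (train on simulated data, adapt with a small labelled slice of real data, validate on the remaining real samples), the sketch below uses scikit-learn. The feature names, file names, and the simple retrain-with-calibration-data step are assumptions for illustration only, not the authors' implementation of OMA, RCSA, or TL.

```python
# Illustrative sketch only: a Random Forest chatter classifier trained on
# simulated features and checked against real machining data. Column names,
# file paths, and the adaptation step are assumptions, not the paper's code.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score

# Hypothetical features extracted from vibration / cutting-force signals.
FEATURES = ["spindle_speed", "feed_rate", "rms_accel", "dominant_freq", "spectral_entropy"]

sim = pd.read_csv("simulated_machining.csv")   # ~140k simulated samples (assumed file)
real = pd.read_csv("real_machining.csv")       # ~1600 measured samples (assumed file)
calib = real.sample(frac=0.2, random_state=0)  # small labelled slice of real data
test = real.drop(calib.index)

def evaluate(model, name):
    pred = model.predict(test[FEATURES])
    print(name,
          "acc=%.3f" % accuracy_score(test["chatter"], pred),
          "prec=%.3f" % precision_score(test["chatter"], pred),
          "rec=%.3f" % recall_score(test["chatter"], pred))

# Baseline: trained purely on simulation.
rf_sim = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
rf_sim.fit(sim[FEATURES], sim["chatter"])
evaluate(rf_sim, "simulation-only")

# Crude transfer-learning stand-in: retrain with the real calibration slice added,
# so the forest also sees the real-world feature distribution.
rf_adapted = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
rf_adapted.fit(pd.concat([sim[FEATURES], calib[FEATURES]]),
               pd.concat([sim["chatter"], calib["chatter"]]))
evaluate(rf_adapted, "simulation + real calibration")
```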
Development of a speed invariant deep learning model with application to condition monitoring of rotating machinery
The application of cutting-edge technologies such as AI, smart sensors, and IoT in factories is revolutionizing the manufacturing industry. This emerging trend, so-called smart manufacturing, is a collection of technologies that support real-time decision-making in the presence of changing conditions in manufacturing activities, which may advance manufacturing competitiveness and sustainability. As factories become highly automated, physical asset management becomes a critical part of the operational life-cycle. Maintenance is one area where these technologies can be applied to enhance operational reliability through a machine condition monitoring system. Data-driven models have been extensively applied to machine condition data to build fault detection systems. Most existing studies on fault detection were developed under a fixed set of operating conditions and tested with data obtained from that same set of conditions; consequently, the variability of a model's performance on data from different operating settings is not well reported, and only limited studies have considered changing operational conditions in a data-driven model. For practical applications, a model must identify a targeted fault under variable operational conditions. With this in mind, the goal of this paper is to study the speed invariance of a deep learning model that can detect a mechanical imbalance, i.e., the targeted fault, under varying speed settings. To study speed invariance, experimental data obtained from a motor test-bed are processed, and time-series data and time-frequency data are applied to a long short-term memory network and a convolutional neural network, respectively, to evaluate their performance.
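As a concrete, hypothetical illustration of the time-frequency branch described above, the sketch below converts one vibration window to a spectrogram and classifies it with a small 2-D CNN; the time-series branch would instead feed raw windows to an LSTM (a related LSTM sketch appears under "More Like this"). The sampling rate, window length, and layer sizes are illustrative assumptions, not the paper's configuration.

```python
# Illustrative sketch of the time-frequency branch: a vibration window is
# converted to a log-magnitude spectrogram and classified with a small 2-D CNN.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import spectrogram

def to_spectrogram(window, fs=10_000):
    """Return a (1, freq, time) log-magnitude spectrogram for one vibration window."""
    _, _, sxx = spectrogram(window, fs=fs, nperseg=256, noverlap=128)
    return torch.tensor(np.log1p(sxx), dtype=torch.float32).unsqueeze(0)

class SpectrogramCNN(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):                      # x: (batch, 1, freq, time)
        return self.head(self.features(x).flatten(1))

# Dummy example: one 1-second window recorded at an arbitrary motor speed.
window = np.random.randn(10_000)
logits = SpectrogramCNN()(to_spectrogram(window).unsqueeze(0))
print(logits.shape)                            # torch.Size([1, 2])
```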
- Award ID(s): 1943364
- PAR ID: 10323549
- Date Published:
- Journal Name: Journal of Intelligent Manufacturing
- Volume: 32
- Issue: 2
- ISSN: 1572-8145
- Page Range / eLocation ID: 393-406
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- 
Abstract: The fault diagnosis of bearings in machinery systems plays a vital role in ensuring normal operating performance. Machine learning-based fault diagnosis using vibration measurements has recently become a prevailing approach, aiming to identify a fault by exploring the correlation between the measurement and the respective fault. Such correlation, however, becomes very complex in practical scenarios where the system operates under time-varying conditions. To achieve reliable bearing fault diagnosis under time-varying conditions, this study presents a tailored deep learning model, the deep long short-term memory (LSTM) network. By fully exploiting the strength of this model in characterizing the temporal dependence of time-series vibration measurements, the negative consequences of time-varying conditions can be minimized, thereby improving diagnosis performance. A published bearing dataset with various time-varying operating speeds is utilized in case illustrations to validate the effectiveness of the proposed methodology.
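A minimal, hypothetical sketch of the stacked (deep) LSTM idea described in this abstract follows; the number of layers, window length, and fault classes are assumptions, not the study's actual architecture or dataset.

```python
# Minimal sketch of a deep (stacked) LSTM classifying bearing condition from
# vibration windows recorded under varying speed; sizes are illustrative only.
import torch
import torch.nn as nn

class DeepLSTMDiagnoser(nn.Module):
    def __init__(self, n_features=1, hidden=64, n_layers=3, n_classes=4):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=n_layers,
                            batch_first=True, dropout=0.2)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # classify from the final hidden state

model = DeepLSTMDiagnoser()
x = torch.randn(16, 1024, 1)              # 16 vibration windows of 1024 samples
y = torch.randint(0, 4, (16,))            # assumed classes: healthy, inner, outer, ball
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()
print(float(loss))
```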
- 
Lu, Xin; Wang, Wei; Wu, Dehao; Li, Xiaoxia (Eds.) In the rapidly evolving landscape of scientific semiconductor laboratories (commonly known as cleanrooms), integrated with Internet of Things (IoT) technology and Cyber-Physical Systems (CPSs), several factors, including operational changes, sensor aging, software updates, and the introduction of new processes or equipment, can lead to dynamic and non-stationary data distributions in evolving data streams. This phenomenon, known as concept drift, poses a substantial challenge for the traditional static machine learning (ML) models used in data-driven digital twins for anomaly detection and classification. As normal and anomalous data distributions drift over time, model performance decays, resulting in high false alarm rates and missed anomalies. To address this issue, we present TWIN-ADAPT, a continuous learning model within a digital twin framework designed to dynamically update and optimize its anomaly classification algorithm in response to changing data conditions. The model is evaluated against state-of-the-art concept drift adaptation models and tested under simulated drift scenarios using diverse noise distributions to mimic real-world distribution shifts in anomalies. TWIN-ADAPT is applied to three critical CPS datasets from smart manufacturing labs (cleanrooms): Fumehood, Lithography Unit, and Vacuum Pump. The evaluation results demonstrate that TWIN-ADAPT's continual learning model for optimized and adaptive anomaly classification achieves a high accuracy and F1 score of 96.97% and 0.97, respectively, on the Fumehood CPS dataset, showing an average performance improvement of 0.57% over the offline model. For the Lithography and Vacuum Pump datasets, TWIN-ADAPT achieves average accuracies of 69.26% and 71.92%, respectively, with performance improvements of 75.60% and 10.42% over the offline model. These significant improvements highlight the efficacy of TWIN-ADAPT's adaptive capabilities. TWIN-ADAPT also shows very competitive performance compared with other benchmark drift adaptation algorithms, demonstrating robustness across different modalities and datasets and confirming its suitability for IoT-driven CPS frameworks managing diverse data distributions in real-time streams. Its adaptability and effectiveness make it a versatile tool for dynamic industrial settings.
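The sketch below illustrates the general drift-adaptation pattern this abstract describes (monitor a rolling error rate on the stream and retrain the classifier on recent data when it degrades). It is not the TWIN-ADAPT algorithm itself, and the window size, threshold, base model, and synthetic stream are assumptions.

```python
# Generic drift-aware anomaly classification sketch: retrain on a recent window
# whenever the rolling error rate degrades. Not the TWIN-ADAPT algorithm.
from collections import deque
import numpy as np
from sklearn.ensemble import RandomForestClassifier

class DriftAwareClassifier:
    def __init__(self, window=500, warmup=100, error_threshold=0.2):
        self.model = RandomForestClassifier(n_estimators=100, random_state=0)
        self.buffer_x = deque(maxlen=window)
        self.buffer_y = deque(maxlen=window)
        self.errors = deque(maxlen=window)
        self.warmup = warmup
        self.error_threshold = error_threshold
        self.fitted = False

    def update(self, x, y):
        """Consume one labelled sample from the stream; refit when drift is suspected."""
        if self.fitted:
            pred = self.model.predict([x])[0]
            self.errors.append(int(pred != y))
        self.buffer_x.append(x)
        self.buffer_y.append(y)
        drifted = (len(self.errors) == self.errors.maxlen
                   and np.mean(self.errors) > self.error_threshold)
        first_fit = not self.fitted and len(self.buffer_x) >= self.warmup
        if first_fit or drifted:
            self.model.fit(list(self.buffer_x), list(self.buffer_y))
            self.fitted = True
            self.errors.clear()

# Usage on a synthetic stream whose distribution shifts halfway through.
rng = np.random.default_rng(0)
clf = DriftAwareClassifier()
for t in range(2000):
    shift = 3.0 if t > 1000 else 0.0            # simulated concept drift
    x = rng.normal(shift, 1.0, size=4)
    y = int(x.sum() > 4 * shift)                # decision boundary moves with the drift
    clf.update(x, y)
```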
- 
Machine learning model and strategy for fast and accurate detection of leaks in water supply network
Abstract: The water supply network (WSN) is subjected to leaks that compromise its service to the communities, which, however, are challenging to identify with conventional approaches before the consequences surface. This study developed Machine Learning (ML) models to detect leaks in the WSN. Water pressure data under leaking versus non-leaking conditions were generated with the holistic WSN simulation code EPANET, considering factors such as fluctuating user demands, data noise, and the extent of leaks. The results indicate that an Artificial Neural Network (ANN), a supervised ML model, can accurately classify leaking versus non-leaking conditions; it, however, requires a balanced dataset covering both leaking and non-leaking conditions, which is difficult to obtain for a real WSN that mostly operates under normal service conditions. An autoencoder neural network (AE), an unsupervised ML model, is further developed to detect leaks with unbalanced data. The results show that the AE model achieves high accuracy when leaks occur in pipes inside the sensor monitoring area, while accuracy is compromised otherwise. This observation provides guidance for deploying monitoring sensors to cover the desired monitoring area. A novel strategy based on multiple independent detection attempts is proposed to further increase the reliability of AE-based leak detection and is found to significantly reduce the probability of false alarms. The trained AE model and leak detection strategy are further tested on a testbed WSN and achieve promising results. The ML model and leak detection strategy can be readily deployed for in-service WSNs using data obtained with internet-of-things (IoT) technologies such as smart meters.
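A hypothetical sketch of the unsupervised AE idea and the multiple-attempt voting strategy follows: an autoencoder is fit only on non-leak pressure snapshots, high reconstruction error flags a possible leak, and several independent flags are required before raising an alarm. The network size, threshold, vote count, and synthetic data are illustrative assumptions, not the paper's configuration.

```python
# Sketch of autoencoder-based leak detection with a voting rule to cut false alarms.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_sensors = 20

# Synthetic stand-ins for EPANET-style pressure snapshots (one row per time step).
normal = rng.normal(50.0, 1.0, size=(5000, n_sensors))       # non-leak training data
ae = MLPRegressor(hidden_layer_sizes=(8, 3, 8), max_iter=1000, random_state=0)
ae.fit(normal, normal)                                        # learn to reconstruct "normal"

def reconstruction_error(x):
    return np.mean((ae.predict(x) - x) ** 2, axis=1)

threshold = np.percentile(reconstruction_error(normal), 99)   # anomaly threshold

def leak_detected(snapshots, votes_needed=3):
    """Flag a leak only if several independent snapshots exceed the threshold."""
    flags = reconstruction_error(snapshots) > threshold
    return int(flags.sum()) >= votes_needed

# Five consecutive snapshots with a pressure drop at a few sensors (simulated leak).
leak = rng.normal(50.0, 1.0, size=(5, n_sensors))
leak[:, 4:7] -= 5.0
print(leak_detected(leak))        # expected: True for a pronounced leak
```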
- 
Network Intrusion Detection in Smart Grids for Imbalanced Attack Types Using Machine Learning Models
The smart grid has evolved as the next-generation power grid paradigm, enabling the transfer of real-time information between the utility company and the consumer via smart meters and advanced metering infrastructure (AMI). This information facilitates many services for both parties, such as automatic meter reading, demand-side management, and time-of-use (TOU) pricing. However, there have been growing security and privacy concerns over smart grid systems, which are built with both smart and legacy information and operational technologies. Intrusion detection is a critical security service for smart grid systems, alerting the system operator to the presence of ongoing attacks. Hence, a great deal of research has been conducted on intrusion detection, especially anomaly-based intrusion detection. Problems emerge when common pattern recognition approaches are applied to imbalanced data, in which far more instances belong to normal behavior than to attacks; such approaches yield low detection rates for the minority classes. In this paper, we study various machine learning models to overcome this drawback using the CIC-IDS2018 dataset [1].
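As a simple, hypothetical illustration of handling such imbalance, the sketch below trains a class-weighted Random Forest and reports per-class precision and recall; the CSV path and label column for CIC-IDS2018 are assumptions, and oversampling methods such as SMOTE are common alternatives to class weighting.

```python
# Minimal sketch: class weighting plus per-class metrics for imbalanced
# intrusion detection. File path and "Label" column name are assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

df = pd.read_csv("cic_ids2018_sample.csv")            # assumed pre-processed export
X = df.drop(columns=["Label"])
y = df["Label"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                           stratify=y, random_state=0)

# class_weight="balanced" re-weights rare attack classes during training.
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                             n_jobs=-1, random_state=0)
clf.fit(X_tr, y_tr)

# Per-class precision/recall makes minority-class detection rates visible,
# unlike a single overall accuracy number.
print(classification_report(y_te, clf.predict(X_te)))
```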