Title: Long-term guided wave structural health monitoring in an uncontrolled environment through long short-term principal component analysis
Environmental effects are a significant challenge for guided wave structural health monitoring systems. These effects distort signals and increase the likelihood of false alarms. Many research papers have studied mitigation strategies for common variations in guided wave datasets that are reproducible in a lab, such as temperature and stress. There are fewer studies and strategies for detecting damage under more unpredictable outdoor conditions. This article proposes a long short-term principal component analysis reconstruction method to detect synthetic damage under highly variable environmental conditions, such as precipitation and freezing. The method requires no temperature compensation or other compensation methods and is tested on approximately seven million guided wave measurements collected over 2 years. Results show that our method achieves an area under the curve score of nearly 0.95 when detecting synthetic damage under highly variable environmental conditions.
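The article's long short-term principal component analysis method is not reproduced here, but a minimal sketch of the general idea, scoring new measurements by their reconstruction error under a PCA basis learned from recent baseline data, might look like the following. The array shapes, the `n_components` choice, and the synthetic data are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch: damage scoring by PCA reconstruction error.
# `baseline` holds recent healthy measurements (rows = measurements,
# columns = time samples); `new_meas` holds measurements to score.
import numpy as np
from sklearn.decomposition import PCA

def pca_damage_scores(baseline, new_meas, n_components=10):
    """Return a reconstruction-error score for each new measurement."""
    pca = PCA(n_components=n_components)
    pca.fit(baseline)  # learn the subspace spanned by normal (environmental) variation
    recon = pca.inverse_transform(pca.transform(new_meas))
    # Large residual energy means the measurement is poorly explained by the
    # baseline subspace, i.e., a possible damage indication.
    return np.linalg.norm(new_meas - recon, axis=1)

# Toy example with synthetic data
rng = np.random.default_rng(0)
baseline = rng.standard_normal((500, 2000))
new_meas = rng.standard_normal((20, 2000))
print(pca_damage_scores(baseline, new_meas)[:5])
```

A deployed detector would additionally choose a threshold on these scores (for example, from the score distribution of held-out healthy data) to trade off detection rate against false alarms.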
Award ID(s):
1839704
NSF-PAR ID:
10296515
Date Published:
Journal Name:
Structural Health Monitoring
ISSN:
1475-9217
Page Range / eLocation ID:
147592172110355
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. This paper studies the effectiveness of joint compression and denoising strategies with realistic, long-term guided wave structural health monitoring data. We leverage the high correlation between collections of guided waves that are nearby in time to create sparse and low-rank representations. While compression and denoising schemes are not new, they are almost exclusively designed and studied on relatively simple datasets. In contrast, guided wave structural health monitoring datasets are subject to much more complex operational and environmental conditions, such as temperature, that distort the data and for which the requirements for effective compression and denoising are not well understood. This paper studies how to optimize data collection and algorithms to best utilize guided wave data for compression, denoising, and damage detection, based on seven million guided wave measurements collected over 2 years.
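    As a rough illustration of the sparse and low-rank idea described above (not the paper's exact pipeline), a truncated SVD of a matrix whose rows are consecutive, highly correlated guided wave measurements both compresses and denoises the block of data. The rank and the synthetic signals below are arbitrary illustrative choices.

```python
# Sketch: low-rank compression/denoising of a block of guided wave signals
# via truncated SVD. Rows are consecutive measurements, which are highly
# correlated in time, so a small rank captures most of the signal energy.
import numpy as np

def lowrank_denoise(X, rank=5):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Storing U[:, :rank], s[:rank], and Vt[:rank, :] instead of X gives the
    # compression; the truncation itself suppresses incoherent noise.
    return U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]

rng = np.random.default_rng(1)
clean = np.outer(np.ones(200), np.sin(np.linspace(0, 20, 1000)))
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
# The low-rank reconstruction should be closer to the clean signals than the noisy data.
print(np.linalg.norm(lowrank_denoise(noisy) - clean) < np.linalg.norm(noisy - clean))
```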

  2. Abstract

    While guided wave structural health monitoring (SHM) is widely researched for ensuring safety, estimating performance deterioration, and detecting damage in structures, it experiences setbacks in accuracy due to varying environmental, sensor, and material factors. To combat these challenges, environmentally variable guided wave data are often stretched with temperature compensation methods, such as the scale transform and optimal signal stretch, to match a baseline signal and enable accurate damage detection. Yet, these methods fail for large environmental changes. This paper addresses this challenge by demonstrating a machine learning method to predict stretch factors. This is accomplished with feed-forward neural networks that approximate the complex velocity change function. We demonstrate that our machine learning approach outperforms the prior art on simulated Lamb wave data and is robust to extreme velocity variations. While our machine learning models do not conduct temperature compensation themselves, their accurate stretch factor predictions serve as a proof of concept that a better model is plausible.
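    The stretch-based compensation mentioned above resamples a signal by a stretch factor so that it best matches a baseline. A minimal sketch of applying a candidate stretch factor by interpolation is shown below; the brute-force factor search stands in for the learned feed-forward predictor described in the abstract, and the signal and velocity change are invented for illustration.

```python
# Sketch: stretching a signal by a factor alpha via resampling, as used in
# scale-transform / optimal-signal-stretch style temperature compensation.
import numpy as np

def stretch(signal, alpha):
    """Resample `signal` on a sample axis scaled by `alpha` (alpha > 1 dilates)."""
    idx = np.arange(len(signal))
    return np.interp(idx, alpha * idx, signal, left=0.0, right=0.0)

def best_stretch(signal, baseline, candidates):
    """Brute-force search over candidate stretch factors (a stand-in for the
    neural network predictor in the abstract)."""
    errs = [np.linalg.norm(stretch(signal, a) - baseline) for a in candidates]
    return candidates[int(np.argmin(errs))]

t = np.linspace(0, 1, 2000)
baseline = np.sin(2 * np.pi * 50 * t) * np.exp(-5 * t)
measured = np.interp(t, t / 1.01, baseline)   # simulate a 1% velocity change
print(best_stretch(measured, baseline, np.linspace(0.98, 1.02, 81)))  # ~1.01
```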

  3. Guided ultrasonic wave localization systems use spatially distributed sensor arrays and wave propagation models to detect and locate damage across a structure. Environmental and operational conditions, such as temperature or stress variations, introduce uncertainty into guided wave data and reduce the effectiveness of these localization systems. These uncertainties cause the models used by each localization algorithm to fail to match reality. This paper addresses this challenge with an ensemble deep neural network that is trained solely with simulated data. Relative to delay-and-sum and matched field processing strategies, this approach is demonstrated to be more robust to temperature variations in experimental data. As a result, this approach demonstrates superior accuracy with small numbers of sensors and greater resilience to spatially nonhomogeneous temperature variations over time.
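    Delay-and-sum, one of the reference methods mentioned above, can be sketched in a few lines: for each candidate pixel, baseline-subtracted residual signals are indexed at their modeled propagation delays and summed, and the pixel with the largest coherent energy is taken as the damage location. The geometry, wave speed, and sampling rate below are arbitrary assumptions, not values from the paper.

```python
# Sketch: delay-and-sum damage localization from baseline-subtracted signals.
import numpy as np

def delay_and_sum(residuals, sensors, grid, fs, c):
    """residuals: dict keyed by (i, j) sensor-pair indices -> 1-D residual signal.
    sensors: sensor coordinates (n_sensors, 2); grid: candidate points (n_pts, 2).
    Returns an energy image over the grid for wave speed c and sample rate fs."""
    image = np.zeros(len(grid))
    for (i, j), r in residuals.items():
        # Travel distance from transmitter i to each pixel and back to receiver j.
        d = np.linalg.norm(grid - sensors[i], axis=1) + np.linalg.norm(grid - sensors[j], axis=1)
        idx = np.clip(np.round(d / c * fs).astype(int), 0, len(r) - 1)
        image += np.abs(r[idx])
    return image

sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
grid = np.stack(np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50)), -1).reshape(-1, 2)
residuals = {(0, 1): np.random.default_rng(2).standard_normal(5000)}  # placeholder residual
image = delay_and_sum(residuals, sensors, grid, fs=1e6, c=3000.0)
print(grid[np.argmax(image)])  # most likely damage location on the grid
```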

  4. Abstract

    Impedance-based structural health monitoring (SHM) is recognized as a non-intrusive, highly sensitive, and model-independent SHM solution that is readily applicable to complex structures. This SHM method relies on analyzing the electromechanical impedance (EMI) signature of the structure under test over the time span of its operation. Changes in the EMI signature, compared to a baseline measured at the healthy state of the structure, often indicate damage. This method has successfully been applied to assess the integrity of numerous civil, aerospace, and mechanical components and structures. However, EMI sensitivity to environmental conditions, temperature in particular, has been an ongoing challenge to the wide adoption of this method. Temperature-induced variation in EMI signatures can be misinterpreted as damage, leading to false positives, or may overshadow the effects of incipient damage in the structure.

    In this paper, a new method for temperature compensation of EMI signatures is presented. Data-driven dynamic models are first developed by fitting EMI signatures measured at various temperatures using the Vector Fitting algorithm. Once these models are developed, the dependence of the model parameters on temperature is established. A parametric data-driven model is then derived with temperature as a parameter. This allows EMI signatures to be calculated at any desired temperature. The capabilities of this new temperature compensation method are demonstrated on aluminum samples, where EMI signatures are measured at various temperatures. The developed method is found to be capable of temperature compensation of EMI signatures over a broad frequency range.
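    The sketch below is not the authors' Vector Fitting models; it only illustrates the second step they describe, establishing how an extracted model parameter depends on temperature so it can be evaluated at any desired compensation temperature. The parameter choice (a resonance frequency) and all numbers are illustrative assumptions.

```python
# Sketch: fit the temperature dependence of a model parameter extracted from
# EMI signatures measured at several temperatures, then predict it at a new
# temperature. All values below are illustrative, not measured data.
import numpy as np

temps = np.array([10.0, 20.0, 30.0, 40.0, 50.0])          # deg C (illustrative)
res_freq = np.array([101.2, 100.9, 100.5, 100.2, 99.8])   # kHz (illustrative)

coeffs = np.polyfit(temps, res_freq, deg=1)   # assume a linear temperature dependence
predict = np.poly1d(coeffs)
print(predict(35.0))   # resonance frequency expected at 35 deg C
```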

  5. Abstract. Advances in ambient environmental monitoring technologies are enabling concerned communities and citizens to collect data to better understand their local environment and potential exposures. These mobile, low-cost tools make it possible to collect data with increased temporal and spatial resolution, providing data on a large scale with unprecedented levels of detail. This type of data has the potential to empower people to make personal decisions about their exposure and support the development of local strategies for reducing pollution and improving health outcomes. However, calibration of these low-cost instruments has been a challenge. Often, a sensor package is calibrated via field calibration. This involves colocating the sensor package with a high-quality reference instrument for an extended period and then applying machine learning or another model-fitting technique, such as multiple linear regression, to develop a calibration model for converting raw sensor signals to pollutant concentrations. Although this method helps to correct for the effects of ambient conditions (e.g., temperature) and cross sensitivities with nontarget pollutants, there is a growing body of evidence that calibration models can overfit to a given location or set of environmental conditions on account of the incidental correlation between pollutant levels and environmental conditions, including diurnal cycles. As a result, a sensor package trained at a field site may provide less reliable data when moved, or transferred, to a different location. This is a potential concern for applications seeking to perform monitoring away from regulatory monitoring sites, such as personal mobile monitoring or high-resolution monitoring of a neighborhood. We performed experiments confirming that transferability is indeed a problem and show that it can be improved by collecting data from multiple regulatory sites and building a calibration model that leverages a more diverse data set. We deployed three sensor packages to each of three sites with reference monitors (nine packages total) and then rotated the sensor packages through the sites over time. Two sites were in San Diego, CA, with a third outside of Bakersfield, CA, offering varying environmental conditions, general air quality composition, and pollutant concentrations. When compared to prior single-site calibration, the multisite approach exhibits better model transferability for a range of modeling approaches. Our experiments also reveal that random forest is especially prone to overfitting and confirm prior results that transfer is a significant source of both bias and standard error. Linear regression, on the other hand, although it exhibits relatively high error, does not degrade much in transfer. Bias dominated in our experiments, suggesting that transferability might be easily increased by detecting and correcting for bias. Also, given that many monitoring applications involve the deployment of many sensor packages based on the same sensing technology, there is an opportunity to leverage the availability of multiple sensors at multiple sites during calibration to lower the cost of training and better tolerate transfer. We contribute a new neural network architecture, termed split-NN, that splits the model into two stages, in which the first stage corrects for sensor-to-sensor variation and the second stage uses the combined data of all the sensors to build a model for a single sensor package.
    The split-NN modeling approach outperforms multiple linear regression, traditional two- and four-layer neural networks, and random forest models. Depending on the training configuration, the split-NN method reduced error by 0 %–11 % for NO2 and 6 %–13 % for O3 compared to random forest.
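    A rough sketch of the two-stage idea behind split-NN, a small per-sensor correction stage followed by a shared calibration stage trained on data pooled from all sensor packages, might look like the following. The layer sizes, feature count, and sensor count are assumptions for illustration, not the architecture reported in the paper.

```python
# Sketch of a two-stage "split" calibration network: stage 1 corrects
# sensor-to-sensor variation with a small per-sensor subnetwork, stage 2 is a
# shared model trained on pooled data from all sensor packages.
import torch
import torch.nn as nn

class SplitNN(nn.Module):
    def __init__(self, n_sensors, n_features, hidden=32):
        super().__init__()
        # One small correction head per sensor package (stage 1).
        self.per_sensor = nn.ModuleList(
            [nn.Linear(n_features, n_features) for _ in range(n_sensors)]
        )
        # Shared calibration model over the corrected features (stage 2).
        self.shared = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, x, sensor_id):
        corrected = self.per_sensor[sensor_id](x)
        return self.shared(corrected)

model = SplitNN(n_sensors=9, n_features=6)      # e.g., nine packages, six raw inputs
raw = torch.randn(4, 6)                         # raw signals + covariates from one package
print(model(raw, sensor_id=0).shape)            # predicted pollutant concentration per row
```

    Training would pass each batch through the correction head of the package that produced it, so the shared stage sees data from every sensor and site while the per-sensor heads absorb unit-to-unit differences.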