Electromechanical impedance-based (EMI) techniques using piezoelectric transducers are promising for structural damage identification. They can be implemented in the high-frequency range with small characteristic wavelengths, leading to high detection sensitivity. The measured impedance is the outcome of harmonic, stationary excitation, which makes it easier to conduct inverse analysis for damage localization and quantification. Nevertheless, the EMI measurement points are usually limited, oftentimes resulting in an under-determined problem. To address this issue, the damage identification process can be converted into a multi-objective optimization formulation, which naturally yields multiple solutions. While this setup fits the nature of damage identification, in which a number of possibilities may exist under given observations/measurements, existing algorithms may suffer from premature convergence and entrapment in local optima. Consequently, the solutions found may not cover the true damage scenario. To tackle these challenges, in this research a series of local search strategies are tailored to enhance the global search ability and incorporated into particle swarm-based optimization. A Q-table is utilized to help the algorithm select the proper local search strategy based on the maximum Q-value. Case studies are carried out for verification, and the results show that the proposed memetic algorithm achieves good performance in damage identification.
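As a rough illustration of the Q-table-guided local search idea described in the abstract, the sketch below embeds a small set of assumed local-search operators in a plain particle swarm loop and lets a one-state Q-table choose among them. The objective function, the three operators, and all hyper-parameters are illustrative placeholders, not the paper's EMI damage-identification model.

```python
# Minimal sketch (not the authors' code) of Q-table-guided local search inside a
# particle swarm loop. `sphere`, the operators, and the hyper-parameters are
# assumptions for demonstration only.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                       # placeholder objective; the paper minimizes
    return float(np.sum(x ** 2))     # EMI response residuals instead

# Three simple local-search strategies the Q-table chooses among (assumed forms).
def gaussian_perturb(x):
    return x + rng.normal(0.0, 0.1, x.size)

def coordinate_step(x):
    y = x.copy()
    y[rng.integers(x.size)] += rng.normal(0.0, 0.5)
    return y

def shrink_toward_zero(x):
    return 0.9 * x

strategies = [gaussian_perturb, coordinate_step, shrink_toward_zero]
q_table = np.zeros(len(strategies))      # single state: "refine the best particle"
alpha, eps = 0.3, 0.1                    # learning rate, exploration rate

# Plain PSO setup.
dim, n_particles = 5, 20
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([sphere(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for it in range(200):
    # Standard velocity/position update.
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([sphere(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

    # Q-table picks a local-search strategy (epsilon-greedy on the max Q-value).
    a = rng.integers(len(strategies)) if rng.random() < eps else int(np.argmax(q_table))
    candidate = strategies[a](gbest)
    reward = sphere(gbest) - sphere(candidate)       # positive if the step helped
    q_table[a] += alpha * (reward - q_table[a])      # simple one-state Q update
    if reward > 0:
        gbest = candidate

print("best value:", sphere(gbest))
```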
- NSF-PAR ID:
- 10373945
- Date Published:
- Journal Name:
- Frontiers in Built Environment
- Volume:
- 8
- ISSN:
- 2297-3362
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Structural identification has received increased attention over recent years for performance-based structural assessment and health monitoring. Recently, an approach for formulating the finite element model updating problem as a constraint satisfaction problem has been developed. In contrast to widely used probabilistic model updating through Bayesian inference methods, the technique naturally accounts for measurement and modeling errors through the use of interval arithmetic to determine the set of all feasible solutions to the partially described and incompletely measured inverse eigenvalue problem. This article presents extensions of the constraint satisfaction approach permitting application to larger multiple-degree-of-freedom system models. To accommodate the drastic increase in the dimensionality of the inverse problem, the extended methodology replaces computation of the complete set of solutions with an approach that contracts the initial search space to the interval hull, which encompasses the complete set of feasible solutions with a single interval vector solution. The capabilities are demonstrated using vibration data acquired through hybrid simulation of a 45-degree-of-freedom planar truss, where a two-bar specimen with bolted connections representing a single member of the truss serves as the experimental substructure. Structural identification is performed using data acquired with the undamaged experimental member as well as over a number of damage scenarios with progressively increased severity developed by exceeding a limit-state capacity of the member. Interval hull solutions obtained through application of the nonlinear constraint satisfaction methodology demonstrate the capability to correctly identify and quantify the extent of the damage in the truss while incorporating measurement uncertainties in the parameter identification.
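To make the interval-contraction idea in the preceding abstract concrete, the following minimal sketch contracts a stiffness interval so it stays consistent with a measured eigenvalue interval for a single-DOF relation lam = k / m. The intervals and the single constraint are assumptions for illustration, not the article's 45-DOF formulation.

```python
# Hedged illustration of interval constraint contraction (not the article's code).
def mul(a, b):
    # Interval multiplication: take the extremes of all endpoint products.
    prods = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(prods), max(prods))

def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("empty intersection: constraints inconsistent")
    return (lo, hi)

k = (800.0, 1200.0)     # initial stiffness search interval [N/m] (assumed)
m = (0.98, 1.02)        # mass interval with modeling uncertainty [kg] (assumed)
lam = (900.0, 1000.0)   # measured eigenvalue interval omega^2 (assumed)

# Forward-backward contraction of k using k = lam * m, repeated until stable.
for _ in range(10):
    k = intersect(k, mul(lam, m))

print("contracted stiffness interval:", k)
```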
-
Dynamic coded x-ray tomosynthesis (CXT) uses a set of encoded x-ray sources to interrogate objects lying on a moving conveyor mechanism. The object is reconstructed from the encoded measurements received by the uniform linear array detectors. We propose a multi-objective optimization (MO) method for structured illuminations to balance the reconstruction quality and radiation dose in a dynamic CXT system. The MO framework is established based on a dynamic sensing geometry with binary coding masks. The Strength Pareto Evolutionary Algorithm 2 is used to solve the MO problem by jointly optimizing the coding masks, locations of x-ray sources, and exposure moments. Computational experiments are conducted to assess the proposed MO method. They show that the proposed strategy can obtain a set of Pareto optimal solutions with different levels of radiation dose and better reconstruction quality than the initial setting.
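The sketch below shows, in a hedged way, how candidate binary coding masks could be screened for Pareto optimality against two competing objectives, a reconstruction-error proxy and a radiation-dose proxy. The mask length and both objective proxies are fabricated for illustration; the work above optimizes the full CXT forward model with SPEA2 rather than this simple non-dominated filter.

```python
# Illustrative Pareto screening of binary coding masks (assumed objectives).
import numpy as np

rng = np.random.default_rng(1)
masks = rng.integers(0, 2, size=(50, 16))      # 50 candidate masks, 16 sources (assumed)

def dose(mask):
    # Proxy: dose grows with the number of active sources.
    return float(mask.sum())

def recon_error(mask):
    # Proxy: fewer active sources -> worse conditioning -> higher error.
    return 1.0 / (1.0 + mask.sum()) + 0.01 * rng.random()

objs = np.array([[recon_error(m), dose(m)] for m in masks])   # minimize both

def pareto_front(points):
    keep = []
    for i, p in enumerate(points):
        dominated = any(np.all(q <= p) and np.any(q < p)
                        for j, q in enumerate(points) if j != i)
        if not dominated:
            keep.append(i)
    return keep

front = pareto_front(objs)
print("Pareto-optimal masks:", len(front))
```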
-
The design of machine learning systems often requires trading off different objectives, for example, prediction error and energy consumption for deep neural networks (DNNs). Typically, no single design performs well in all objectives; therefore, finding Pareto-optimal designs is of interest. The search for Pareto-optimal designs involves evaluating designs in an iterative process, and the measurements are used to evaluate an acquisition function that guides the search process. However, measuring different objectives incurs different costs. For example, the cost of measuring the prediction error of DNNs is orders of magnitude higher than that of measuring the energy consumption of a pre-trained DNN, as it requires re-training the DNN. Current state-of-the-art methods do not consider this difference in objective evaluation cost, potentially incurring expensive evaluations of objective functions in the optimization process. In this paper, we develop a novel decoupled and cost-aware multi-objective optimization algorithm, which we call Flexible Multi-Objective Bayesian Optimization (FlexiBO), to address this issue. For evaluating each design, FlexiBO selects the objective with higher relative gain by weighting the improvement of the hypervolume of the Pareto region with the measurement cost of each objective. This strategy, therefore, balances the expense of collecting new information with the knowledge gained through objective evaluations, preventing FlexiBO from performing expensive measurements for little to no gain. We evaluate FlexiBO on seven state-of-the-art DNNs for image recognition, natural language processing (NLP), and speech-to-text translation. Our results indicate that, given the same total experimental budget, FlexiBO discovers designs with 4.8% to 12.4% lower hypervolume error than the best method in state-of-the-art multi-objective optimization.
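A minimal sketch of the decoupled, cost-aware selection idea described in the FlexiBO abstract: pick the objective whose estimated hypervolume gain per unit measurement cost is largest. The gain estimates and cost figures below are made-up placeholders, not FlexiBO's actual acquisition computation.

```python
# Hedged sketch: cost-aware choice of which objective to measure next.
costs = {"prediction_error": 120.0,   # e.g. requires re-training the DNN (assumed)
         "energy": 1.5}               # e.g. measure a pre-trained DNN (assumed)

est_hv_gain = {"prediction_error": 0.030,   # estimated hypervolume improvement
               "energy": 0.004}             # if that objective were measured

def pick_objective(gains, costs):
    # Relative gain = hypervolume improvement of the Pareto region
    # weighted (divided) by the cost of the measurement.
    return max(gains, key=lambda o: gains[o] / costs[o])

print(pick_objective(est_hv_gain, costs))    # -> "energy" under these numbers
```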
-
We present ResilienC, a framework for resilient control of Cyber-Physical Systems subject to STL-based requirements. ResilienC utilizes a recently developed formalism for specifying CPS resiliency in terms of sets of (rec, dur) real-valued pairs, where rec represents the system's capability to rapidly recover from a property violation (recoverability), and dur is reflective of its ability to avoid violations post-recovery (durability). We define the resilient STL control problem as one of multi-objective optimization, where the recoverability and durability of the desired STL specification are maximized. When neither objective is prioritized over the other, the solution to the problem is a set of Pareto-optimal system trajectories. We present a precise solution method to the resilient STL control problem using a mixed-integer linear programming encoding and an a posteriori ε-constraint approach for efficiently retrieving the complete set of optimally resilient solutions. In ResilienC, at each time-step, the optimal control action selected from the set of Pareto-optimal solutions by a Decision Maker strategy realizes a form of Model Predictive Control. We demonstrate the practical utility of the ResilienC framework on two significant case studies: autonomous vehicle lane keeping and deadline-driven, multi-region package delivery.
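To illustrate the a posteriori ε-constraint idea mentioned in the ResilienC abstract, the sketch below sweeps a bound on one objective (recoverability) while maximizing the other (durability) over a small set of candidate trajectories. The candidates and their (rec, dur) values are fabricated purely for demonstration; the framework itself solves a MILP encoding rather than enumerating a fixed list.

```python
# Hedged epsilon-constraint sweep over (rec, dur): smaller rec (faster recovery)
# is better, larger dur (longer violation-free operation) is better.
candidates = [("traj_a", 2, 10), ("traj_b", 3, 14),
              ("traj_c", 5, 15), ("traj_d", 4, 11)]   # (name, rec, dur), assumed

pareto = []
for eps in sorted({rec for _, rec, _ in candidates}):
    feasible = [c for c in candidates if c[1] <= eps]        # constrain rec <= eps
    if feasible:
        best = max(feasible, key=lambda c: c[2])              # maximize dur
        if best not in pareto:
            pareto.append(best)

print(pareto)   # traj_a, traj_b, traj_c form the Pareto set; traj_d is dominated
```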