

Title: Active learning with generalized sliced inverse regression for high-dimensional reliability analysis
It is computationally expensive to predict reliability using physical models at the design stage when many random input variables exist. This work introduces a dimension reduction technique based on generalized sliced inverse regression (GSIR) to mitigate the curse of dimensionality. The proposed high-dimensional reliability method uses active learning to integrate GSIR, Gaussian process (GP) modeling, and importance sampling (IS), resulting in accurate reliability prediction at a reduced computational cost. The method consists of three core steps: (1) identification of the importance sampling region, (2) dimension reduction by GSIR to produce a sufficient predictor, and (3) construction of a GP model for the true response with respect to the sufficient predictor in the reduced-dimension space. High accuracy and efficiency are achieved by iteratively executing these three steps within an active learning loop that adds new training points one by one in the region with a high chance of failure.
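The three-step loop lends itself to a compact numerical sketch. Below is a minimal, self-contained illustration on a toy linear limit state, with several stand-ins: plain sliced inverse regression (SIR) in place of GSIR, a small hand-rolled one-dimensional GP, a fixed Monte Carlo pool in place of the adaptive importance-sampling region, and a U-type criterion for picking new training points near the limit state. Every function and constant here is illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy limit state in 10 standard normal inputs; only three of them
# matter, mimicking an intrinsically low-dimensional response.
DIM = 10
beta = np.zeros(DIM)
beta[:3] = [0.6, 0.5, 0.4]

def g(x):
    """Limit-state function: failure when g(x) < 0 (illustrative)."""
    return 2.0 - x @ beta

def sir_direction(X, y, n_slices=5):
    """Leading SIR direction: top eigenvector of the covariance of slice means."""
    order = np.argsort(y)
    Xc = X - X.mean(0)
    M = np.zeros((X.shape[1], X.shape[1]))
    for s in np.array_split(order, n_slices):
        m = Xc[s].mean(0)
        M += len(s) / len(y) * np.outer(m, m)
    return np.linalg.eigh(M)[1][:, -1]

def gp_fit_predict(t_tr, y_tr, t_te, ell=1.0, noise=1e-4):
    """Tiny 1-D GP with an RBF kernel; returns posterior mean and std."""
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)
    K = k(t_tr, t_tr) + noise * np.eye(len(t_tr))
    Ks = k(t_te, t_tr)
    mu = Ks @ np.linalg.solve(K, y_tr)
    var = 1.0 + noise - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mu, np.sqrt(np.clip(var, 1e-12, None))

# Initial design and a fixed candidate pool (a crude stand-in for the
# paper's importance-sampling region).
X_tr = rng.standard_normal((20, DIM))
y_tr = g(X_tr)
X_pool = rng.standard_normal((4000, DIM))

for _ in range(15):
    d = sir_direction(X_tr, y_tr)                        # step 2: 1-D sufficient predictor
    mu, sd = gp_fit_predict(X_tr @ d, y_tr, X_pool @ d)  # step 3: GP surrogate
    idx = np.argmin(np.abs(mu) / sd)                     # point most likely misclassified
    X_tr = np.vstack([X_tr, X_pool[idx]])
    y_tr = np.append(y_tr, g(X_pool[idx]))

d = sir_direction(X_tr, y_tr)
mu, _ = gp_fit_predict(X_tr @ d, y_tr, X_pool @ d)
pf_hat = float(np.mean(mu < 0))         # surrogate-based failure probability
pf_mc = float(np.mean(g(X_pool) < 0))   # brute-force reference on the same pool
print(pf_hat, pf_mc)
```

On this toy problem the surrogate-based estimate should track the brute-force Monte Carlo estimate on the pool while evaluating the limit state only 35 times.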
Award ID(s):
1923799
PAR ID:
10358490
Author(s) / Creator(s):
Date Published:
Journal Name:
Structural safety
Volume:
94
ISSN:
1879-3355
Page Range / eLocation ID:
102151
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract Reliability analysis is a core element in engineering design and can be performed with physical models (limit-state functions). Reliability analysis becomes computationally expensive when the dimensionality of the input random variables is high. This work develops a high-dimensional reliability analysis method with a new dimension reduction strategy, so that the contributions of unimportant input variables are still accommodated after dimension reduction. Dimension reduction is performed with the first iteration of the first-order reliability method (FORM), which identifies important and unimportant input variables. A higher-order reliability analysis is then performed in the reduced space of only the important input variables. The reliability obtained in the reduced space is then integrated with the contributions of the unimportant input variables, resulting in a final reliability prediction that accounts for both types of input variables. Consequently, the new reliability method is more accurate than the traditional method, which fixes unimportant input variables at their means. The accuracy is demonstrated by three examples.
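To make the split between important and unimportant variables concrete, here is a hedged sketch for a linear limit state in standard normal space, where the "first FORM iteration" reduces to a single gradient evaluation at the mean point and the integration of the unimportant variables' contribution can be done in closed form. The limit state, its coefficients, and the 95% cutoff below are all invented for illustration.

```python
import numpy as np
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Illustrative linear limit state in standard normal space:
# g(u) = b0 - sum(a_i * u_i); failure when g < 0.
a = np.array([1.0, 0.8, 0.3, 0.1, 0.05, 0.05, 0.02, 0.02])
b0 = 3.0

# "First FORM iteration" at the mean point u = 0: for a linear g the
# gradient is -a, so the unit importance vector and the per-variable
# importance factors alpha_i**2 are available immediately.
alpha = -a / np.linalg.norm(a)
importance = alpha**2                      # sums to 1 by construction
rank = np.argsort(importance)[::-1]

# Keep the fewest variables explaining >= 95% of sum(alpha_i**2).
cum = np.cumsum(importance[rank])
k = int(np.searchsorted(cum, 0.95) + 1)
imp, unimp = rank[:k], rank[k:]

# Exact Pf for a linear g: Phi(-b0 / ||a||).
pf_exact = Phi(-b0 / np.linalg.norm(a))
# Traditional reduction: fix unimportant variables at their means.
pf_fixed = Phi(-b0 / np.linalg.norm(a[imp]))
# Integrating the unimportant variables' contribution back in, here
# analytically because g is linear (sketch of the paper's idea).
pf_integrated = Phi(-b0 / np.sqrt((a[imp]**2).sum() + (a[unimp]**2).sum()))
print(k, pf_fixed, pf_integrated, pf_exact)
```

Fixing the unimportant variables at their means discards part of the input variance, so `pf_fixed` underestimates the exact failure probability, while folding that variance back in recovers it here exactly.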
  2. Abstract

    Reliability analysis is usually a core element in engineering design, during which reliability is predicted with physical models (limit-state functions). Reliability analysis becomes computationally expensive when the dimensionality of the input random variables is high. This work develops a high-dimensional reliability analysis method with a new dimension reduction strategy that accommodates the contributions of both important and unimportant input variables. Accounting for the contributions of unimportant input variables improves the accuracy of the reliability prediction, especially when many unimportant input variables are involved. The dimension reduction is performed with the first iteration of the first-order reliability method (FORM), which identifies important and unimportant input variables. A higher-order reliability analysis, such as second-order reliability analysis or a metamodeling method, is then performed in the reduced space of only the important input variables. The reliability obtained in the reduced space is then integrated with the contributions of the unimportant input variables, resulting in a final reliability prediction that accounts for both types of input variables. Consequently, the new reliability method is more accurate than the traditional method, which fixes unimportant input variables at their means. The accuracy is demonstrated by three examples.

     
  3.
    Abstract In many dimension reduction problems in statistics and machine learning, such as principal component analysis, canonical correlation analysis, independent component analysis, and sufficient dimension reduction, it is important to determine the dimension of the reduced predictor, which often amounts to estimating the rank of a matrix. This problem is called order determination. In this paper, we propose a novel and highly effective order-determination method based on the idea of predictor augmentation. We show that, if we augment the predictor with an artificially generated random vector, then the components of the eigenvectors of the induced matrix that correspond to the augmentation display a pattern that reveals the order to be determined. This information, when combined with the information provided by the eigenvalues of the matrix, greatly enhances the accuracy of order determination.
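A rough sketch of the augmentation idea, using the SIR candidate matrix on synthetic single-index data (the paper's treatment is more general and also exploits eigenvalue information): eigenvectors belonging to the true signal should place little mass on the artificially appended coordinates, while eigenvectors past the true order should not. The sizes and the 0.1 threshold below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic regression with structural dimension d = 1:
# y depends on X only through X @ b.
n, p, r = 600, 6, 3          # sample size, predictors, augmentation size
b = np.array([1.0, -1.0, 0.5, 0.0, 0.0, 0.0])
b /= np.linalg.norm(b)
X = rng.standard_normal((n, p))
y = np.sin(X @ b) + 0.1 * rng.standard_normal(n)

def sir_matrix(X, y, n_slices=8):
    """Candidate matrix for SIR: weighted covariance of slice means."""
    order = np.argsort(y)
    Xc = X - X.mean(0)
    M = np.zeros((X.shape[1],) * 2)
    for s in np.array_split(order, n_slices):
        m = Xc[s].mean(0)
        M += len(s) / len(y) * np.outer(m, m)
    return M

# Augment the predictor with r pure-noise coordinates and average, over
# repetitions, the squared mass each eigenvector places on the augmented block.
reps, mass = 20, np.zeros(p + r)
for _ in range(reps):
    Z = np.hstack([X, rng.standard_normal((n, r))])
    w, V = np.linalg.eigh(sir_matrix(Z, y))
    V = V[:, ::-1]                       # eigenvectors, largest eigenvalue first
    mass += (V[p:, :] ** 2).sum(0)
mass /= reps

# Signal eigenvectors avoid the augmented block (small mass); eigenvectors
# past the true order do not, so the first "large mass" index estimates the order.
d_hat = int(np.argmax(mass > 0.1))
print(mass.round(3), d_hat)
```

The estimated order is the position of the first eigenvector that loads noticeably on the augmented coordinates, matching the true structural dimension of one in this toy example.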
  4. Abstract

    Active learning is a subfield of machine learning that focuses on improving data collection efficiency in expensive-to-evaluate systems. Surrogate modeling with active learning enables cost-efficient analysis of demanding engineering systems, but heterogeneity in the underlying system can degrade its performance. In this article, we propose partitioned active learning, which quantifies the informativeness of new design points while circumventing the heterogeneity in the system. The proposed method partitions the design space based on heterogeneous features and searches for the next design point in two systematic steps. The global searching scheme accelerates exploration by identifying the most uncertain subregion, and the local searching step exploits the localized information provided by the local Gaussian process (GP) of that subregion. We also propose Cholesky-update-driven numerical remedies to address the computational complexity of the active learning procedure. The proposed method consistently outperforms existing active learning methods in three real-world cases, delivering better predictions in less computation time.
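The global/local two-step search can be sketched as follows, with a hand-made partition of a one-dimensional design space, a separate small GP (and length-scale) per subregion, and the Cholesky-update speedups omitted. Everything here is illustrative rather than the authors' implementation.

```python
import numpy as np

def f(x):
    # Heterogeneous toy response: smooth on the left, rough on the right.
    return np.where(x < 0.5, np.sin(2 * np.pi * x), np.sin(16 * np.pi * x))

def gp_post(xt, yt, xs, ell, noise=1e-6):
    """Posterior mean and variance of a tiny RBF-kernel GP."""
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)
    K = k(xt, xt) + noise * np.eye(len(xt))
    Ks = k(xs, xt)
    mu = Ks @ np.linalg.solve(K, yt)
    var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mu, np.clip(var, 0, None)

# Partition the design space where the behavior changes; each subregion
# gets its own local GP with its own length-scale (chosen by hand here).
parts = [(0.0, 0.5, 0.3), (0.5, 1.0, 0.05)]   # (lo, hi, length-scale)
Xs = [np.array([0.1, 0.4]), np.array([0.6, 0.9])]
Ys = [f(x) for x in Xs]
grid = [np.linspace(lo, hi, 200) for lo, hi, _ in parts]

for _ in range(20):
    # Global step: pick the subregion whose local GP is most uncertain.
    vars_ = [gp_post(Xs[i], Ys[i], grid[i], parts[i][2])[1] for i in range(2)]
    j = int(np.argmax([v.max() for v in vars_]))
    # Local step: pick the most uncertain point inside that subregion.
    xnew = grid[j][np.argmax(vars_[j])]
    Xs[j] = np.append(Xs[j], xnew)
    Ys[j] = np.append(Ys[j], f(xnew))

n_left, n_right = len(Xs[0]), len(Xs[1])
print(n_left, n_right)
```

The rougher subregion should end up receiving most of the sampling budget, which is the intended effect of partitioning before searching.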

     
  5. Summary

    As high-dimensional data become routinely available in the applied sciences, sufficient dimension reduction has been widely employed, and its research has received considerable attention. However, with the majority of sufficient dimension reduction methodology focusing on the reduction step itself, complete analysis and inference after dimension reduction have yet to receive much attention. We couple the strategy of sufficient dimension reduction with a flexible semiparametric model. We concentrate on inference with respect to the primary variables of interest and employ sufficient dimension reduction to effectively reduce the dimension of the regression. Extensive simulations demonstrate the efficacy of the proposed method, and a real data analysis is presented for illustration.
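A minimal sketch of the "reduce first, then analyze in low dimension" strategy, using SIR for the reduction step and an ordinary cubic fit in the sufficient predictor as a stand-in for the paper's semiparametric inference. The data-generating model and all constants are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy single-index model: y depends on 15 predictors only through X @ b,
# so the regression after reduction is one-dimensional.
n, p = 1000, 15
b = np.zeros(p)
b[:2] = [0.8, 0.6]            # ||b|| = 1
X = rng.standard_normal((n, p))
y = np.exp(0.5 * X @ b) + 0.1 * rng.standard_normal(n)

def sir_direction(X, y, n_slices=10):
    """Leading SIR direction: top eigenvector of the covariance of slice means."""
    order = np.argsort(y)
    Xc = X - X.mean(0)
    M = np.zeros((X.shape[1],) * 2)
    for s in np.array_split(order, n_slices):
        m = Xc[s].mean(0)
        M += len(s) / len(y) * np.outer(m, m)
    return np.linalg.eigh(M)[1][:, -1]

# Step 1: sufficient dimension reduction on a training split.
train, test = np.arange(700), np.arange(700, n)
d = sir_direction(X[train], y[train])
d *= np.sign(d @ b)                     # fix the sign ambiguity for comparison

# Step 2: low-dimensional fit on the sufficient predictor, checked out of sample.
t_tr, t_te = X[train] @ d, X[test] @ d
coef = np.polyfit(t_tr, y[train], 3)    # cubic fit in the 1-D predictor
resid = y[test] - np.polyval(coef, t_te)
r2 = 1 - resid.var() / y[test].var()
cosine = float(abs(d @ b))              # alignment with the true direction
print(round(cosine, 3), round(r2, 3))
```

With the dimension brought down to one, any flexible low-dimensional regression (splines, kernels, or the paper's semiparametric model) becomes practical where a 15-dimensional nonparametric fit would not be.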

     