Title: REIN: Reliability Estimation via Importance sampling with Normalizing flows
We introduce a novel framework called REIN: Reliability Estimation by learning an Importance sampling (IS) distribution with Normalizing flows (NFs). The NFs learn probability-space maps that transform the probability distribution of the input random variables into a quasi-optimal IS distribution. NFs stack invertible neural networks to construct differentiable bijections with efficiently computed Jacobian determinants. The NF 'pushes forward' a realization from the input probability distribution into a realization from the IS distribution, with importance weights calculated using the change-of-variables formula. We also propose a loss function to learn an NF map that minimizes the reverse Kullback-Leibler divergence between the 'pushforward' distribution and a sequentially updated target distribution obtained by modifying the optimal IS distribution. We demonstrate REIN's efficacy on a set of benchmark problems featuring very low failure rates, multiple failure modes, and high dimensionality, comparing against other variance-reduction methods. We also consider two simple applications, the reliability analyses of a thirty-four-story building and a cantilever tube, to demonstrate the applicability of REIN to practical problems of interest. Compared with other methods, REIN is shown to be useful for high-dimensional reliability estimation problems with very small failure probabilities.
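As a concrete illustration of the pushforward mechanism described in the abstract, the minimal sketch below stands in for the learned flow with a fixed affine bijection and estimates a small tail probability using change-of-variables importance weights. The limit-state function g, the map parameters mu and sigma, and all other names are illustrative assumptions, not REIN's implementation.

```python
import numpy as np
from scipy import stats

# Toy sketch of pushforward importance sampling: a fixed affine bijection
# T(z) = mu + sigma * z stands in for a trained normalizing flow. All
# names and parameter values here are illustrative.

rng = np.random.default_rng(0)

def g(x):
    # Limit-state function: failure occurs when g(x) <= 0 (here, X > 4).
    return 4.0 - x

mu, sigma = 4.0, 1.0              # "flow" parameters: proposal shift/scale

z = rng.standard_normal(100_000)  # base samples Z ~ p_Z (standard normal)
x = mu + sigma * z                # pushforward: proposal samples X ~ q

# Change-of-variables density of the pushforward proposal:
# q(x) = p_Z(T^{-1}(x)) * |dT^{-1}/dx|
log_q = stats.norm.logpdf((x - mu) / sigma) - np.log(sigma)
log_p = stats.norm.logpdf(x)      # nominal input density p(x)

w = np.exp(log_p - log_q)         # importance weights p(x)/q(x)
p_fail = np.mean((g(x) <= 0) * w)
print(p_fail, "reference:", stats.norm.sf(4.0))  # both near 3.17e-5
```

REIN replaces the fixed affine map with a trained NF, whose Jacobian determinant plays the role of the 1/sigma factor above.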
Award ID(s):
1663667
PAR ID:
10653777
Author(s) / Creator(s):
;
Publisher / Repository:
Elsevier
Date Published:
Journal Name:
Reliability Engineering & System Safety
Volume:
242
Issue:
C
ISSN:
0951-8320
Page Range / eLocation ID:
109729
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The actual failure times of individual components are usually unavailable in many applications. Instead, only aggregate failure-time data are collected by actual users, for technical and/or economic reasons. When dealing with such data for reliability estimation, practitioners often face the challenges of selecting the underlying failure-time distribution and the corresponding statistical inference method. So far, only the exponential, normal, gamma, and inverse Gaussian distributions have been used in analyzing aggregate failure-time data, because these distributions have closed-form expressions for such data. However, this limited choice of probability distributions cannot satisfy the extensive needs of a variety of engineering applications. Phase-type (PH) distributions are robust and flexible in modeling failure-time data, as they can mimic a large collection of probability distributions of non-negative random variables arbitrarily closely by adjusting the model structure. In this article, PH distributions are utilized, for the first time, in reliability estimation based on aggregate failure-time data. A maximum likelihood estimation (MLE) method and a Bayesian alternative are developed. For the MLE method, an expectation-maximization algorithm is developed for parameter estimation, and the corresponding Fisher information is used to construct confidence intervals for the quantities of interest. For the Bayesian method, a procedure for performing point and interval estimation is also introduced. Numerical examples show that the proposed PH-based reliability estimation methods are quite flexible and alleviate the burden of selecting a probability distribution when the underlying failure-time distribution is general or even unknown.
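To make the PH machinery concrete, here is a minimal sketch of evaluating a phase-type density and reliability function via the matrix exponential. The two-phase sub-generator and all parameter values are illustrative assumptions, not values from the article.

```python
import numpy as np
from scipy.linalg import expm

# Minimal phase-type (PH) distribution sketch; the 2-phase generator
# below (a hypoexponential example) is illustrative only.

alpha = np.array([1.0, 0.0])       # initial phase probabilities
T = np.array([[-2.0,  2.0],        # sub-generator over transient phases
              [ 0.0, -1.0]])
t0 = -T @ np.ones(2)               # exit rates into the absorbing state

def ph_pdf(t):
    # Failure-time density: f(t) = alpha @ exp(T t) @ t0
    return alpha @ expm(T * t) @ t0

def ph_reliability(t):
    # Reliability (survival) function: R(t) = alpha @ exp(T t) @ 1
    return alpha @ expm(T * t) @ np.ones(2)

print(ph_pdf(1.0), ph_reliability(1.0))
```

Enlarging the phase space and adjusting alpha and T is what lets a PH model mimic essentially any non-negative failure-time distribution.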
  2. Newly restructured generalized polynomial chaos expansion (GPCE) methods are reported for high-dimensional design optimization in the presence of input random variables with arbitrary, dependent probability distributions. The methods feature a dimensionally decomposed GPCE (DD-GPCE) for statistical moment and reliability analyses associated with a high-dimensional stochastic response; a novel synthesis between the DD-GPCE approximation and score functions for estimating the first-order design sensitivities of the statistical moments and failure probability; and a standard gradient-based optimization algorithm, together constituting the single-step DD-GPCE and multipoint single-step DD-GPCE (MPSS-DD-GPCE) methods. In these new design methods, the multivariate orthonormal basis functions are assembled consistent with the chosen degree of interaction between input variables and the polynomial order, thus mitigating the curse of dimensionality to the extent possible. In addition, when coupled with score functions, the DD-GPCE approximation leads to analytical formulae for calculating the design sensitivities. More importantly, the statistical moments, failure probability, and their design sensitivities are determined concurrently from a single stochastic analysis or simulation. Numerical results affirm that the proposed methods yield accurate and computationally efficient optimal solutions of mathematical problems and design solutions for simple mechanical systems. Finally, the success in conducting stochastic shape optimization of a bogie side frame with 41 random variables demonstrates the power of the MPSS-DD-GPCE method in solving industrial-scale engineering design problems.
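The following minimal sketch shows why a GPCE-type expansion yields statistical moments essentially for free: with an orthonormal basis whose first function is the constant 1, the mean and variance are read directly off the coefficients. The coefficients and the one-dimensional Hermite basis below are illustrative assumptions, not the DD-GPCE construction itself.

```python
import numpy as np

# With an orthonormal basis {psi_k}, psi_0 = 1, and y(X) ~ sum_k c_k psi_k(X):
#   E[y] = c_0   and   Var[y] = sum_{k>=1} c_k^2.
c = np.array([1.5, 0.8, -0.3, 0.05])   # hypothetical expansion coefficients

mean = c[0]
variance = np.sum(c[1:] ** 2)

# Monte Carlo cross-check using orthonormal Hermite polynomials of a
# standard normal input (the basis a GPCE would use for Gaussian inputs).
rng = np.random.default_rng(1)
x = rng.standard_normal(500_000)
psi = np.stack([np.ones_like(x),
                x,
                (x**2 - 1) / np.sqrt(2),
                (x**3 - 3*x) / np.sqrt(6)])
y = c @ psi
print(mean, variance)      # analytic: 1.5, 0.7325
print(y.mean(), y.var())   # Monte Carlo values agree closely
```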
  3. Infrastructure networks, such as electrical power grids, transportation systems, and water supply systems, support critical societal functions. Failures of such networks can have severe consequences, and quantifying their probability of failure is essential for understanding and managing network reliability. Analytical and simulation methods have been proposed for such problems, among which sampling methods feature prominently. Recently, the authors extended widely used structural reliability algorithms, namely subset simulation and cross-entropy-based importance sampling, as well as uncertainty quantification methods built from particle integration methods with exact confidence intervals, for efficient reliability analysis in discrete spaces. This paper tests the performance of these algorithms for static network reliability assessment. In particular, we compare these methods on optimal power flow problems in various IEEE benchmark models. Overall, the cross-entropy-based method outperforms the other methods in all benchmark models except the largest, IEEE 300, while the adaptive-effort subset simulation and particle integration methods are more suitable for handling high-dimensional problems. By building up these benchmark models, we provide unified examples for comparing different emerging methods in static network reliability assessment and for supporting the improvement or combination of these methods.
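A minimal sketch of cross-entropy-based importance sampling in a discrete space follows. The five-component system, its cut sets, and all parameters are invented for illustration and are far simpler than the IEEE benchmark models used in the paper.

```python
import numpy as np

# Cross-entropy (CE) importance sampling for a toy binary-state system.
rng = np.random.default_rng(2)
p = np.full(5, 0.01)                 # nominal component failure probabilities

def system_fails(s):
    # s[:, i] = 1 if component i has failed. Toy minimal cut sets:
    # the system fails if {0, 1} all fail or {2, 3, 4} all fail.
    return ((s[:, 0] & s[:, 1]) | (s[:, 2] & s[:, 3] & s[:, 4])).astype(bool)

def weights(s, q):
    # Likelihood ratio of nominal vs proposal Bernoulli distributions.
    return np.prod(np.where(s == 1, p / q, (1 - p) / (1 - q)), axis=1)

q = np.full(5, 0.2)                  # initial over-dispersed proposal
for _ in range(8):                   # CE updates of the proposal parameters
    s = (rng.random((20_000, 5)) < q).astype(int)
    fail = system_fails(s)
    w = weights(s, q)
    q = (w[fail] @ s[fail]) / w[fail].sum()   # weighted CE update
    q = np.clip(q, 1e-3, 0.8)        # keep the proposal non-degenerate

s = (rng.random((200_000, 5)) < q).astype(int)   # final estimation batch
p_fail = np.mean(system_fails(s) * weights(s, q))
print(p_fail)                        # exact value is about 0.01**2 + 0.01**3
```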
  4. Failure-time data of fielded systems are usually obtained from the actual users of the systems. Due to various operational preferences and/or technical obstacles, a large proportion of field data are collected as aggregate data instead of the exact failure times of individual units. The challenge of using such data is that the obtained information is more concise but less precise than individual failure times. The most significant needs in modeling aggregate failure-time data are the selection of an appropriate probability distribution and the development of a statistical inference procedure capable of handling data aggregation. Although some probability distributions, such as the gamma and inverse Gaussian distributions, have well-known closed-form expressions for the probability density function of aggregate data, restricting attention to such distributions limits applications in field reliability estimation. For reliability practitioners, it would be invaluable to have a robust approach for handling aggregate failure-time data without being limited to a small number of probability distributions. This paper studies the application of the phase-type (PH) distribution as a candidate for modeling aggregate failure-time data. An expectation-maximization algorithm is developed to obtain the maximum likelihood estimates of the model parameters, and the confidence interval for the reliability estimate is also obtained. Simulation and numerical studies show that the approach is quite powerful because of the high capability of PH distributions in mimicking a variety of probability distributions. In the area of reliability engineering, there is limited work on modeling aggregate data for field reliability estimation. The analytical and statistical inference methods described in this work provide a robust tool for analyzing aggregate failure-time data for the first time.
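The closed-form property that makes PH distributions attractive for aggregate data can be sketched directly: the sum of i.i.d. PH lifetimes is again phase-type, so the aggregate density is available in matrix-exponential form. The representation-stacking construction and all parameter values below are illustrative, not the article's estimation procedure.

```python
import numpy as np
from scipy.linalg import expm

# Sum of n i.i.d. PH(alpha, T) lifetimes is PH with a block-bidiagonal
# sub-generator: upon absorption of copy k, copy k+1 is started.
alpha = np.array([1.0, 0.0])
T = np.array([[-3.0,  3.0],
              [ 0.0, -1.5]])

def convolve_ph(alpha, T, n):
    """PH representation of the sum of n i.i.d. PH(alpha, T) variables."""
    m = len(alpha)
    t0 = -T @ np.ones(m)                    # exit-rate vector of one copy
    A = np.zeros((n * m, n * m))
    for k in range(n):
        A[k*m:(k+1)*m, k*m:(k+1)*m] = T
        if k < n - 1:                       # hand off to the next copy
            A[k*m:(k+1)*m, (k+1)*m:(k+2)*m] = np.outer(t0, alpha)
    gamma = np.zeros(n * m)
    gamma[:m] = alpha
    return gamma, A

gamma, A = convolve_ph(alpha, T, n=3)       # aggregate of 3 units
a0 = -A @ np.ones(len(gamma))
print(gamma @ expm(A * 2.0) @ a0)           # aggregate density f(t) at t = 2
```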
  5. The ever-increasing complexity of numerical models and the associated computational demands have challenged classical reliability analysis methods. Surrogate-model-based reliability analysis techniques, in particular those using kriging metamodels, have recently gained considerable attention for their ability to achieve high accuracy and computational efficiency. However, existing stopping criteria, which are used to terminate the training of surrogate models, do not relate directly to the error in the estimated failure probability. This limitation can lead to high computational demands, because of unnecessary calls to costly performance functions (e.g., involving finite element models), or to potentially inaccurate estimates of the failure probability, due to premature termination of the training process. Here, we propose an error-based stopping criterion (ESC) to address these limitations. First, it is shown that the total number of wrong-sign estimates of the performance function made by kriging over the candidate design samples, S, follows a Poisson binomial distribution. This finding is then used to estimate lower and upper bounds on S, at a given confidence level, for the sets of candidate design samples classified by kriging as safe and unsafe. An upper bound on the error of the estimated failure probability is subsequently derived from the probabilistic properties of the Poisson binomial distribution, and this bound is implemented as the stopping criterion in kriging-based reliability analysis. The efficiency and robustness of ESC are investigated here using five benchmark reliability analysis problems. Results indicate that the proposed method achieves the set accuracy target and substantially reduces the computational demand, in some cases by over 50%.
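A rough sketch of the ESC reasoning: treat each candidate sample's misclassification as a Bernoulli event with probability Phi(-|mu_i| / sigma_i), computed from the kriging mean and standard deviation, so the misclassification count S is Poisson binomial, and an upper quantile of S bounds the achievable error in the failure-probability estimate. The kriging predictions below are synthetic stand-ins, and the Monte Carlo quantile is a simplification of the bounds derived in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
N = 5_000
mu = rng.normal(0.5, 1.0, N)                      # synthetic kriging means
sigma = np.abs(rng.normal(0.1, 0.05, N)) + 1e-3   # synthetic kriging std devs

# Probability that kriging assigns the wrong sign to candidate sample i.
p_wrong = stats.norm.cdf(-np.abs(mu) / sigma)

# S = sum of independent Bernoulli(p_i) is Poisson binomial; estimate its
# 95% quantile by Monte Carlo.
draws = (rng.random((1_000, N)) < p_wrong).sum(axis=1)
S_upper = np.quantile(draws, 0.95)

p_fail_hat = np.mean(mu <= 0)      # kriging-based failure-probability estimate
err_bound = S_upper / N            # each flipped sign shifts the estimate by 1/N
print(p_fail_hat, err_bound)
```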