Title: A variational Bayesian inference technique for model updating of structural systems with unknown noise statistics
Dynamic models of structural and mechanical systems can be updated to match measured data through a Bayesian inference process. However, the performance of classical (non-adaptive) Bayesian model updating approaches decreases significantly when the pre-assumed statistical characteristics of the model prediction error are violated. To overcome this issue, this paper presents an adaptive recursive variational Bayesian approach that estimates the statistical characteristics of the prediction error jointly with the unknown model parameters. Accounting for the prediction-error statistics in this way improves the accuracy and robustness of model updating. The performance of the approach is demonstrated using numerically simulated data obtained from a structural frame with material non-linearity under earthquake excitation. Results show that in the presence of non-stationary noise/error, the non-adaptive approach fails to estimate the unknown model parameters, whereas the proposed approach estimates them accurately.
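
To make the idea concrete, here is a minimal, self-contained sketch of joint state/noise estimation in the spirit of the adaptive recursive variational Bayes approach described above. It is not the paper's implementation: it uses a scalar random-walk model, an Inverse-Gamma factor for the unknown measurement-noise variance, and a few fixed-point variational iterations per step, with all values chosen purely for illustration.

```python
# Minimal sketch (not the paper's code): recursive variational Bayes update of a
# scalar state together with an unknown measurement-noise variance. The noise
# variance gets an Inverse-Gamma(alpha, beta) factor, the state gets a Gaussian
# factor, and a few fixed-point iterations couple the two at each time step.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: random-walk state observed with noise whose variance is unknown.
T, Q, true_R = 200, 1e-4, 0.25
x_true = np.cumsum(rng.normal(0.0, np.sqrt(Q), T)) + 1.0
y = x_true + rng.normal(0.0, np.sqrt(true_R), T)

# Priors: Gaussian state, Inverse-Gamma noise variance, mild "forgetting" factor rho.
m, P = 0.0, 1.0
alpha, beta, rho = 1.0, 1.0, 0.98

for k in range(T):
    # Predict step (identity dynamics plus process noise; spread the IG factor).
    m_pred, P_pred = m, P + Q
    alpha, beta = rho * alpha, rho * beta

    # Fixed-point (variational) iterations for the coupled Gaussian and IG factors.
    alpha_post = alpha + 0.5
    beta_post = beta
    for _ in range(5):
        R_hat = beta_post / alpha_post          # current noise-variance estimate
        S = P_pred + R_hat                      # innovation variance
        K = P_pred / S                          # Kalman-type gain
        m = m_pred + K * (y[k] - m_pred)
        P = P_pred - K * S * K
        beta_post = beta + 0.5 * ((y[k] - m) ** 2 + P)
    alpha, beta = alpha_post, beta_post

print(f"estimated noise variance ~ {beta / alpha:.3f} (true {true_R})")
print(f"final state estimate {m:.3f} vs truth {x_true[-1]:.3f}")
```

The point mirrored from the abstract is only that the noise statistics are updated recursively alongside the state rather than fixed a priori; the paper's method applies this idea to nonlinear structural models rather than this toy scalar system.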
Award ID(s):
1903972
NSF-PAR ID:
10435849
Author(s) / Creator(s):
Date Published:
Journal Name:
Frontiers in Built Environment
Volume:
9
ISSN:
2297-3362
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Mechanics-based dynamic models are commonly used in the design and performance assessment of structural systems, and their accuracy can be improved by integrating models with measured data. This paper provides an overview of hierarchical Bayesian model updating, which has recently been developed for probabilistic integration of models with measured data while accounting for different sources of uncertainty and modeling error. The proposed hierarchical Bayesian framework allows one to explicitly account for pertinent sources of variability, such as ambient temperature and/or excitation amplitude, as well as modeling errors, and therefore yields more realistic predictions. The paper reports observations from applications of the hierarchical approach to three full-scale civil structural systems, namely (1) a footbridge, (2) a 10-story reinforced concrete (RC) building, and (3) a damaged 2-story RC building. The first application highlights the capability of accounting for temperature effects within the hierarchical framework, while the second underlines the effect of accounting for prediction-error bias. Finally, the third application considers the effect of excitation amplitude on structural response. The findings underline the importance and capabilities of the hierarchical Bayesian framework for structural identification. Discussions of its advantages and performance over classical deterministic and Bayesian model updating methods are provided.
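
    As a hedged illustration of the hierarchical idea (not the framework or data of the cited studies), the sketch below treats per-dataset identified parameters as draws from a population distribution with unknown mean and spread, and computes a grid-based posterior for those hyperparameters; all numbers are hypothetical.

    ```python
    # Minimal sketch (illustrative only): hierarchical "random effects" treatment of
    # dataset-to-dataset variability in an identified structural parameter. Each test j
    # yields an estimate theta_hat[j] with known identification uncertainty s[j]; the
    # underlying values are modeled as N(mu, tau^2), and the posterior of the
    # population hyperparameters (mu, tau) is evaluated on a grid under flat priors.
    import numpy as np

    theta_hat = np.array([0.93, 0.97, 1.02, 1.05, 0.99, 1.08])  # per-dataset estimates (hypothetical)
    s = np.array([0.02, 0.03, 0.02, 0.04, 0.03, 0.05])          # per-dataset estimation std devs

    mu_grid = np.linspace(0.8, 1.2, 401)
    tau_grid = np.linspace(1e-3, 0.2, 400)
    MU, TAU = np.meshgrid(mu_grid, tau_grid, indexing="ij")

    # Marginal likelihood: theta_hat[j] ~ N(mu, tau^2 + s[j]^2), independent across j.
    var = TAU[..., None] ** 2 + s ** 2                   # shape (n_mu, n_tau, n_data)
    loglik = -0.5 * np.sum(np.log(2 * np.pi * var)
                           + (theta_hat - MU[..., None]) ** 2 / var, axis=-1)

    post = np.exp(loglik - loglik.max())
    post /= post.sum()

    print(f"posterior mean of population mean mu   ~ {np.sum(post * MU):.3f}")
    print(f"posterior mean of population spread tau ~ {np.sum(post * TAU):.3f}")
    ```

    The spread hyperparameter tau is what distinguishes the hierarchical treatment from a single deterministic parameter value: it carries the dataset-to-dataset variability (e.g., due to temperature or amplitude) into predictions.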
  2. Abstract

    Nonlinear response history analysis (NLRHA) is generally considered to be a reliable and robust method to assess the seismic performance of buildings under strong ground motions. While NLRHA is fairly straightforward to apply to individual structures for a select set of ground motions at a specific building site, it becomes less practical when large numbers of analyses are required to evaluate either (1) multiple models of alternative design realizations with a site‐specific set of ground motions, or (2) individual archetype building models at multiple sites with multiple sets of ground motions. In this regard, surrogate models offer an alternative to running repeated NLRHAs for variable design realizations or ground motions. In this paper, a recently developed surrogate modeling technique, called probabilistic learning on manifolds (PLoM), is presented to estimate structural seismic response. Essentially, the PLoM method provides an efficient stochastic model to develop mappings between random variables, which can then be used to efficiently estimate the structural responses for systems with variations in design/modeling parameters or ground motion characteristics. The PLoM algorithm is introduced and then used in two case studies of 12‐story buildings for estimating probability distributions of structural responses. The first example focuses on the mapping between variable design parameters of a multidegree‐of‐freedom analysis model and its peak story drift and acceleration responses. The second example applies the PLoM technique to estimate structural responses for variations in site‐specific ground motion characteristics. In both examples, training data sets are generated for orthogonal input parameter grids, and test data sets are developed for input parameters with prescribed statistical distributions. Validation studies are performed to examine the accuracy and efficiency of the PLoM models. Overall, both examples show good agreement between the PLoM model estimates and verification data sets. Moreover, in contrast to other common surrogate modeling techniques, the PLoM model is able to preserve the correlation structure between peak responses. Parametric studies are conducted to understand the influence of different PLoM tuning parameters on its prediction accuracy.
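
    PLoM itself relies on diffusion-map-based manifold sampling and is not reproduced here. As a rough, hedged stand-in for the kind of joint "mapping between random variables" described above, the sketch below fits a Gaussian kernel density to hypothetical (design parameter, peak response) training samples, generates new joint samples from it, and checks that the correlation between peak responses is carried over; every name and number is illustrative.

    ```python
    # Simplified stand-in (NOT PLoM): learn the joint distribution of design parameters
    # and peak responses from a small training set, then draw new joint samples and
    # check that the correlation between the two response quantities is preserved.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(1)

    # Hypothetical training data: 200 NLRHA runs, 2 design parameters, 2 peak responses.
    n = 200
    design = rng.uniform([0.8, 0.05], [1.2, 0.15], size=(n, 2))   # e.g. stiffness, damping
    drift = 0.02 / design[:, 0] + 0.1 * design[:, 1] + rng.normal(0, 0.002, n)
    accel = 0.8 * design[:, 0] + 5.0 * drift + rng.normal(0, 0.05, n)
    train = np.column_stack([design, drift, accel])               # joint samples, shape (n, 4)

    # Fit a KDE to the joint distribution and generate additional samples from it.
    kde = gaussian_kde(train.T)          # gaussian_kde expects shape (n_dims, n_samples)
    generated = kde.resample(2000).T

    # The point illustrated: correlation between the peak responses is carried over.
    rho_train = np.corrcoef(train[:, 2], train[:, 3])[0, 1]
    rho_gen = np.corrcoef(generated[:, 2], generated[:, 3])[0, 1]
    print(f"drift-acceleration correlation: training {rho_train:.2f}, generated {rho_gen:.2f}")
    ```

    The abstract's comparison point is precisely this preservation of joint structure across response quantities, which many response-by-response surrogates lose.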

  3. Abstract

    Particle filters avoid parametric estimates for Bayesian posterior densities, which alleviates Gaussian assumptions in nonlinear regimes. These methods, however, are more sensitive to sampling errors than Gaussian-based techniques such as ensemble Kalman filters. A recent study by the authors introduced an iterative strategy for particle filters that match posterior moments, where iterations improve the filter's ability to draw samples from non-Gaussian posterior densities. The iterations follow from a factorization of particle weights, providing a natural framework for combining particle filters with alternative filters to mitigate the impact of sampling errors. The current study introduces a novel approach to forming an adaptive hybrid data assimilation methodology that exploits the theoretical strengths of nonparametric and parametric filters. At each data assimilation cycle, the iterative particle filter performs a sequence of updates while the prior sample distribution is non-Gaussian, and an ensemble Kalman filter provides the final adjustment when Gaussian distributions for marginal quantities are detected. The method employs the Shapiro–Wilk test, which has outstanding power for detecting departures from normality, to determine when to transition between filter algorithms. Experiments using low-dimensional models demonstrate that the approach has significant value, especially for nonhomogeneous observation networks and unknown model process errors. Moreover, hybrid factors are extended to consider marginals of more than one collocated variable using a test for multivariate normality. Findings from this study motivate the use of the proposed method for geophysical problems characterized by diverse observation networks and various dynamic instabilities, such as numerical weather prediction models.

    Significance Statement: Data assimilation statistically processes observation errors and model forecast errors to provide optimal initial conditions for the forecast, playing a critical role in numerical weather forecasting. The ensemble Kalman filter, which has been widely adopted and developed in many operational centers, assumes Gaussianity of the prior distribution and solves a linear system of equations, leading to bias in strongly nonlinear regimes. On the other hand, particle filters avoid many of those assumptions but are sensitive to sampling errors and computationally expensive. We propose an adaptive hybrid strategy that combines the advantages and minimizes the disadvantages of the two methods. The hybrid particle filter–ensemble Kalman filter uses the Shapiro–Wilk test to detect the Gaussianity of the ensemble members and determine the timing of the transition between these filter updates. Demonstrations in this study show that the proposed method is advantageous when observations are heterogeneous and when the model has an unknown bias. Furthermore, by extending the statistical hypothesis test to a test for multivariate normality, we consider marginals of more than one collocated variable. These results encourage further testing on real geophysical problems characterized by various dynamic instabilities, such as real numerical weather prediction models.
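
    A minimal sketch of the adaptive switching idea (not the authors' implementation): a scalar prior ensemble is tested for Gaussianity with the Shapiro–Wilk test, and the observation is then assimilated with either a stochastic ensemble Kalman update or a bootstrap particle-filter update. The threshold, models, and numbers below are placeholders.

    ```python
    # Minimal sketch (not the authors' code): choose between an EnKF update and a
    # particle-filter update for a scalar state, based on a Shapiro-Wilk Gaussianity
    # test of the prior ensemble.
    import numpy as np
    from scipy.stats import shapiro

    rng = np.random.default_rng(3)

    def assimilate(ensemble, y_obs, obs_var, alpha=0.05):
        """Update a 1-D prior ensemble with one direct observation of the state."""
        _, p_value = shapiro(ensemble)
        if p_value > alpha:
            # Ensemble looks Gaussian: stochastic ensemble Kalman filter update.
            prior_var = np.var(ensemble, ddof=1)
            gain = prior_var / (prior_var + obs_var)
            perturbed_obs = y_obs + rng.normal(0.0, np.sqrt(obs_var), ensemble.size)
            return ensemble + gain * (perturbed_obs - ensemble), "EnKF"
        # Non-Gaussian prior: bootstrap particle-filter update (weight and resample).
        logw = -0.5 * (y_obs - ensemble) ** 2 / obs_var
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(ensemble.size, size=ensemble.size, p=w)
        return ensemble[idx], "PF"

    # A clearly bimodal (non-Gaussian) prior should trigger the particle-filter branch.
    prior = np.concatenate([rng.normal(-2.0, 0.3, 50), rng.normal(2.0, 0.3, 50)])
    posterior, branch = assimilate(prior, y_obs=1.8, obs_var=0.25)
    print(branch, posterior.mean())

    # A roughly Gaussian prior will usually take the EnKF branch instead.
    prior = rng.normal(0.0, 1.0, 100)
    posterior, branch = assimilate(prior, y_obs=0.5, obs_var=0.25)
    print(branch, posterior.mean())
    ```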
  4. Abstract

    Successful modeling of degradation data is of great importance for both accurate reliability assessment and effective maintenance decision‐making. Many existing degradation performance modeling approaches either assume a homogeneous population of units or characterize a heterogeneous population with restrictive assumptions, such as pre‐specifying the number of sub‐populations. This paper proposes a Bayesian heterogeneous degradation performance modeling framework to relax these conventional modeling assumptions. Specifically, a Bayesian non‐parametric model formulation and learning algorithm are proposed to characterize the historical degradation data of a heterogeneous population of units with an unknown number of homogeneous sub‐populations, allowing joint model estimation and identification of the number of sub‐populations. Based on the off‐line population‐level model, an on‐line individual‐level degradation model with sequential model updating is further developed to improve remaining useful life prediction for individual units with sparse data. A real case study using the heterogeneous degradation data of deteriorating roads is provided to illustrate the proposed approach and demonstrate its validity.
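
    One common Bayesian non-parametric device for letting the data determine the number of homogeneous sub-populations is a (truncated) Dirichlet process mixture. The sketch below is only a generic illustration of that idea, using scikit-learn's BayesianGaussianMixture on synthetic per-unit degradation features; it is not the model or the road-degradation data of the cited paper.

    ```python
    # Generic illustration (not the paper's model): a truncated Dirichlet process
    # mixture fitted to per-unit degradation features, where the effective number of
    # sub-populations is inferred rather than pre-specified.
    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    rng = np.random.default_rng(4)

    # Hypothetical features: (initial condition, degradation rate) for 300 units drawn
    # from three latent sub-populations the algorithm does not know about.
    features = np.vstack([
        rng.normal([1.0, 0.02], [0.05, 0.004], size=(120, 2)),
        rng.normal([0.9, 0.06], [0.05, 0.006], size=(100, 2)),
        rng.normal([1.1, 0.10], [0.05, 0.008], size=(80, 2)),
    ])

    dpgm = BayesianGaussianMixture(
        n_components=10,                                 # truncation level, not the answer
        weight_concentration_prior_type="dirichlet_process",
        weight_concentration_prior=0.1,                  # small value favors fewer clusters
        covariance_type="full",
        max_iter=500,
        random_state=0,
    ).fit(features)

    # Components with non-negligible weight approximate the inferred sub-population count.
    print("inferred sub-populations:", int(np.sum(dpgm.weights_ > 0.02)))
    print("mixture weights:", np.round(dpgm.weights_, 3))
    ```

    The on-line individual-level stage described in the abstract would then condition a unit's sparse measurements on the inferred population-level structure; that step is not sketched here.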

  5.
    Abstract. We consider the problem of inferring the basal sliding coefficient field for an uncertain Stokes ice sheet forward model from synthetic surface velocity measurements. The uncertainty in the forward model stems from unknown (or uncertain) auxiliary parameters (e.g., rheology parameters). This inverse problem is posed within the Bayesian framework, which provides a systematic means of quantifying uncertainty in the solution. To account for the associated model uncertainty (error), we employ the Bayesian approximation error (BAE) approach to approximately premarginalize simultaneously over both the noise in measurements and uncertainty in the forward model. We also carry out approximative posterior uncertainty quantification based on a linearization of the parameter-to-observable map centered at the maximum a posteriori (MAP) basal sliding coefficient estimate, i.e., by taking the Laplace approximation. The MAP estimate is found by minimizing the negative log posterior using an inexact Newton conjugate gradient method. The gradient and Hessian actions to vectors are efficiently computed using adjoints. Sampling from the approximate covariance is made tractable by invoking a low-rank approximation of the data misfit component of the Hessian. We study the performance of the BAE approach in the context of three numerical examples in two and three dimensions. For each example, the basal sliding coefficient field is the parameter of primary interest which we seek to infer, and the rheology parameters (e.g., the flow rate factor or the Glen's flow law exponent coefficient field) represent so-called nuisance (secondary uncertain) parameters. Our results indicate that accounting for model uncertainty stemming from the presence of nuisance parameters is crucial. Namely, our findings suggest that using nominal values for these parameters, as is often done in practice, without taking into account the resulting modeling error, can lead to overconfident and heavily biased results. We also show that the BAE approach can be used to account for the additional model uncertainty at no additional cost at the online stage.
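
    The BAE premarginalization step can be illustrated in a deliberately small linear-Gaussian setting (nothing like the Stokes ice-sheet problem above): the accurate forward model depends on a nuisance parameter, the reduced model fixes it at a nominal value, and the modeling-error statistics estimated offline by Monte Carlo are folded into the noise model before computing the MAP estimate in closed form. All operators and numbers below are made up for illustration.

    ```python
    # Tiny linear illustration of the Bayesian approximation error (BAE) idea:
    # premarginalize over a nuisance parameter by estimating the modeling-error
    # statistics offline and adding them to the noise model before computing the MAP
    # estimate and Gaussian posterior covariance of the reduced model.
    import numpy as np

    rng = np.random.default_rng(5)
    n_obs, n_par, n_mc = 40, 5, 2000

    # "Accurate" forward model F(m, t) = (A0 + t*A1) @ m depends on a nuisance t;
    # the "reduced" model fixes t at its nominal value 0, i.e. uses A0 only.
    A0 = rng.normal(size=(n_obs, n_par))
    A1 = 0.5 * rng.normal(size=(n_obs, n_par))

    prior_cov = np.eye(n_par)                 # prior on the parameter of interest m
    sigma_t, sigma_noise = 0.3, 0.05          # nuisance prior std, observation noise std
    noise_cov = sigma_noise ** 2 * np.eye(n_obs)

    # Offline BAE stage: sample (m, t) from their priors and collect the modeling error
    # eps = F(m, t) - F(m, 0); its mean and covariance premarginalize the nuisance.
    M = rng.multivariate_normal(np.zeros(n_par), prior_cov, size=n_mc)   # (n_mc, n_par)
    T = rng.normal(0.0, sigma_t, size=n_mc)
    eps = T[:, None] * (M @ A1.T)                                        # (n_mc, n_obs)
    eps_mean = eps.mean(axis=0)
    eps_cov = np.cov(eps, rowvar=False)

    # Synthetic data from the accurate model with a "true" m and t.
    m_true = rng.multivariate_normal(np.zeros(n_par), prior_cov)
    t_true = rng.normal(0.0, sigma_t)
    y = (A0 + t_true * A1) @ m_true + rng.normal(0.0, sigma_noise, n_obs)

    def gaussian_map(total_noise_cov, data_shift):
        """Closed-form MAP / Gaussian posterior for the linear reduced model."""
        Ninv = np.linalg.inv(total_noise_cov)
        post_cov = np.linalg.inv(A0.T @ Ninv @ A0 + np.linalg.inv(prior_cov))
        post_mean = post_cov @ (A0.T @ Ninv @ (y - data_shift))
        return post_mean, post_cov

    m_naive, _ = gaussian_map(noise_cov, np.zeros(n_obs))         # nuisance ignored
    m_bae, _ = gaussian_map(noise_cov + eps_cov, eps_mean)        # BAE-corrected noise model

    print("error, nominal nuisance value:", np.linalg.norm(m_naive - m_true))
    print("error, BAE premarginalization:", np.linalg.norm(m_bae - m_true))
    ```

    In the paper's large-scale nonlinear setting the MAP step is an inexact Newton-CG optimization and the posterior covariance comes from a low-rank Laplace approximation; the closed-form Gaussian update above only stands in for those steps.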