Title: An Efficient Multifidelity Model for Assessing Risk Probabilities in Power Systems under Rare Events
Risk assessment of power system failures induced by low-frequency, high-impact rare events is of paramount importance to power system planners and operators. In this paper, we develop a cost-effective multi-surrogate method based on a multifidelity model for assessing risks in probabilistic power-flow analysis under rare events. Specifically, multiple polynomial-chaos-expansion-based surrogate models are constructed to reproduce power system responses to stochastic changes in load and the random occurrence of component outages. These surrogates then propagate a large number of samples at negligible computational cost and thus efficiently screen out the samples associated with high-risk rare events. The results generated by the surrogates, however, may be biased for samples located in the low-probability tail regions that are critical to power system risk assessment. To resolve this issue, the original high-fidelity power system model is used to fine-tune the estimates of the low-fidelity surrogates by reevaluating only a small portion of the samples. This multifidelity approach greatly improves the computational efficiency of the traditional Monte Carlo method for computing risk-event probabilities under rare events without sacrificing accuracy.
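The screen-then-correct workflow the abstract describes can be illustrated with a toy sketch (the response functions, threshold, and screening margin below are hypothetical stand-ins, not the paper's power-flow models):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: the expensive "high-fidelity" response and a
# cheap surrogate (e.g. a truncated polynomial-chaos expansion) that is
# accurate in the bulk but slightly biased everywhere.
def high_fidelity(x):
    return x**2 + 0.1 * np.sin(5 * x)

def surrogate(x):
    return x**2

threshold = 4.0                       # response level defining the risk event
x = rng.normal(size=200_000)          # many cheap Monte Carlo samples

y_lf = surrogate(x)                   # negligible-cost screening pass
candidates = y_lf > 0.8 * threshold   # keep a safety margin below threshold

# Re-evaluate only the screened tail samples with the expensive model.
y_hf = high_fidelity(x[candidates])
p_risk = np.count_nonzero(y_hf > threshold) / x.size

print(f"re-evaluated {candidates.mean():.2%} of samples, "
      f"P(risk) = {p_risk:.4f}")
```

Only the small fraction of samples near the tail is ever pushed through the expensive model; the surrogate handles the bulk.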
Award ID(s):
1917308, 1711191
PAR ID:
10157224
Author(s) / Creator(s):
Date Published:
Journal Name:
53rd Hawaii International Conference on System Sciences 2020
Page Range / eLocation ID:
1-10
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Data assimilation is a Bayesian inference process that obtains an enhanced understanding of a physical system of interest by fusing information from an inexact physics-based model and from noisy, sparse observations of reality. The multifidelity ensemble Kalman filter (MFEnKF) recently developed by the authors combines a full-order physical model and a hierarchy of reduced-order surrogate models in order to increase the computational efficiency of data assimilation. The standard MFEnKF uses linear couplings between models and is statistically optimal in the case of Gaussian probability densities. This work extends the MFEnKF to make use of a broader class of surrogate models, such as those based on machine learning methods like autoencoders, via non-linear couplings between the model hierarchies. We identify the right-invertibility property of autoencoders as a key predictor of the forecasting power of autoencoder-based reduced-order models. We propose a methodology for constructing reduced-order surrogate models that are more accurate than those obtained via conventional linear methods. Numerical experiments with the canonical Lorenz'96 model illustrate that nonlinear surrogates perform better than linear projection-based ones in the context of multifidelity ensemble Kalman filtering. We additionally show a large-scale proof-of-concept result with the quasi-geostrophic equations, demonstrating the competitiveness of the method with a traditional reduced-order-model-based MFEnKF.
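The linear coupling in the standard MFEnKF is, at its core, a control-variate combination of a few expensive full-order runs with many cheap surrogate runs. A minimal sketch of that statistical idea, with hypothetical toy models in place of the physical and reduced-order models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy pair: an expensive "full-order" model and a cheap,
# strongly correlated reduced-order surrogate.
def full_order(x):
    return np.sin(x) + 0.05 * x**2

def reduced_order(x):
    return np.sin(x)                  # misses the quadratic correction

n_hf, n_lf = 50, 50_000
x_hf = rng.normal(size=n_hf)
x_lf = rng.normal(size=n_lf)

u = full_order(x_hf)                  # few expensive evaluations
v = reduced_order(x_hf)               # surrogate at the same inputs
w = reduced_order(x_lf)               # surrogate everywhere (cheap)

# Linear control-variate coupling: gain from the sample covariances.
S = np.cov(u, v)[0, 1] / np.var(v, ddof=1)
mean_mf = u.mean() - S * (v.mean() - w.mean())

print(f"multifidelity mean estimate: {mean_mf:.4f}")
```

For this toy pair the exact mean is 0.05 (E[sin x] = 0, E[x²] = 1 for standard-normal x); the coupled estimator recovers it with far less variance than the 50 full-order runs alone would give.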
  2. Accurate assessment of driver visibility is crucial in automotive design and safety enhancement, particularly where A-pillars obstruct the driver's field of view. To address this challenge, this research develops a multi-fidelity Gaussian Process (MF-GP) modeling framework that enhances visibility prediction by integrating low-fidelity (LF) image-segmentation data with high-fidelity (HF) digital human modeling (DHM) simulations. By leveraging a limited set of high-fidelity samples, the proposed MF-GP framework systematically calibrates the low-fidelity data to improve predictive accuracy while reducing computational costs. Two A-pillar cutout designs (3.75 cm and 5 cm) were analyzed under HF sampling densities of 3%, 7%, and 10%. Results indicate that the 3.75 cm cutout is more sensitive to sparse HF sampling, requiring a denser HF dataset to achieve stable calibration. In contrast, the 5 cm cutout, benefiting from better LF-HF alignment, achieves comparable accuracy with fewer HF samples. Model validation using root mean square error (RMSE) and the coefficient of determination (R²) confirms that increasing HF sampling enhances surrogate-model accuracy, with the effect being more pronounced where model performance is sensitive to the amount of high-fidelity data. The proposed framework provides a computationally efficient methodology for driver-visibility prediction and human-in-the-loop design applications. Future research could explore adaptive HF sampling strategies and ensemble surrogate-modeling techniques to further enhance multi-fidelity learning efficiency.
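The calibration idea — learn the LF-to-HF discrepancy from a few HF samples with a Gaussian process, then add it back to the LF prediction — can be sketched in one dimension (the functions and sample locations below are illustrative assumptions, not the paper's visibility metrics):

```python
import numpy as np

# Hypothetical 1-D stand-ins: a cheap low-fidelity predictor and the
# high-fidelity response it should be calibrated toward.
def lf(x):
    return np.sin(x)

def hf(x):
    return np.sin(x) + 0.3 * x            # LF misses a linear trend

x_hf = np.linspace(0.0, 3.0, 6)           # only a few expensive HF samples
resid = hf(x_hf) - lf(x_hf)               # discrepancy to be learned

def rbf(a, b, ell=1.0):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

# Zero-mean GP regression on the LF/HF discrepancy (small jitter added
# for numerical stability of the solve).
K = rbf(x_hf, x_hf) + 1e-8 * np.eye(x_hf.size)
alpha = np.linalg.solve(K, resid)

def mf_predict(x):
    return lf(x) + rbf(x, x_hf) @ alpha   # calibrated LF prediction

x_test = np.array([1.3, 2.2])
print(mf_predict(x_test) - hf(x_test))    # small calibration error
```

Six HF evaluations are enough here because the discrepancy, not the full response, is what the GP must learn.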
  3. A modern power system is characterized by an increasing penetration of wind power, which introduces large uncertainties in its states. These uncertainties must be quantified properly; otherwise, system security may be threatened. Facing this challenge, we propose a cost-effective, data-driven approach to the probabilistic load-margin assessment problem. Using actual wind data, a kernel density estimator is applied to infer the nonparametric wind-speed distributions, which are then merged into the framework of a vine copula. The latter enables us to simulate complex, multivariate, highly dependent model inputs with a variety of bivariate copulae that precisely represent the tail dependence in the correlated samples. Furthermore, to reduce the prohibitive computational time of traditional Monte Carlo simulations, which process a large number of samples, we propose a nonparametric, Gaussian-process-emulator-based reduced-order model to replace the original, complicated continuation power-flow model through a Bayesian learning framework. To accelerate the convergence of this Bayesian algorithm, a truncated polynomial-chaos surrogate is developed, which serves as a highly efficient, parametric Bayesian prior. This emulator allows us to execute the time-consuming continuation power-flow solver at the sampled values with negligible computational cost. Simulation results demonstrate the impressive performance of the proposed method.
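The marginals-plus-copula sampling step can be sketched with its simplest instance, a Gaussian copula over empirical marginals (the wind-speed data, correlation, and marginal shapes below are hypothetical; the paper's vine copula generalizes this construction with flexible bivariate building blocks that capture tail dependence):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical historical wind-speed records at two correlated sites,
# with Weibull-like marginals.
hist_a = rng.weibull(2.0, 4000) * 8.0
hist_b = rng.weibull(1.8, 4000) * 10.0

# 1) Draw correlated Gaussian scores (the Gaussian-copula dependence).
rho = 0.8
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=20_000)

# 2) Rank-transform each column to uniforms in (0, 1).
u = (np.argsort(np.argsort(z, axis=0), axis=0) + 1) / (z.shape[0] + 1)

# 3) Push the uniforms through the empirical marginal quantile functions,
#    i.e. the nonparametric (KDE-style) marginals inferred from data.
wind_a = np.quantile(hist_a, u[:, 0])
wind_b = np.quantile(hist_b, u[:, 1])

r = np.corrcoef(wind_a, wind_b)[0, 1]
print(f"sample correlation of generated wind speeds: {r:.3f}")
```

The generated samples reproduce both the dependence (step 1) and the non-Gaussian marginals (step 3), which is exactly the separation the copula framework provides.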
  4. Facing stochastic load variations due to the increasing penetration of renewable energy generation, online decision making under uncertainty in modern power systems has captured power researchers' attention in recent years. To address this issue while balancing system security and economic objectives, we propose a surrogate-enhanced scheme under a joint chance-constrained (JCC) optimal power-flow (OPF) framework. Starting from a stochastic-sampling procedure, we first utilize copula theory to simulate the dependence among the multivariate uncertain inputs. Then, to reduce the prohibitive computational time required by the traditional Monte Carlo (MC) method, we propose a polynomial-chaos-based surrogate that allows us to efficiently evaluate the power-system model at non-Gaussian-distributed sample values with negligible computing cost. Learning from the MC-simulated samples, we further propose a hybrid adaptive approach that overcomes the conservativeness of the JCC-OPF by exploiting the correlation of the system states, which is ignored in the traditional Boole's inequality. Simulations conducted on the modified Illinois test system demonstrate the excellent performance of the proposed method.
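The polynomial-chaos surrogate step — fit a low-order Hermite expansion to a handful of model runs, then reuse it for bulk Monte Carlo evaluation — can be sketched as follows (the cubic response function is a hypothetical stand-in for the actual power-system model):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(4)

# Hypothetical stand-in for one power-system output as a function of a
# standardized uncertain input (e.g. a net-load deviation).
def model(x):
    return 1.0 + 0.5 * x + 0.2 * x**2 - 0.05 * x**3

# Fit a degree-3 probabilists'-Hermite PCE by least squares on a few
# expensive model runs.
x_train = rng.normal(size=40)
V = hermevander(x_train, 3)
coef, *_ = np.linalg.lstsq(V, model(x_train), rcond=None)

# Evaluate the surrogate on many samples at negligible cost.
x_mc = rng.normal(size=100_000)
y_mc = hermevander(x_mc, 3) @ coef

# For a Hermite PCE with standard-normal input, coef[0] is the mean.
print(coef[0], y_mc.mean())
```

Because the Hermite basis is orthogonal under the standard-normal measure, moments such as the mean drop out of the coefficients directly, without any sampling at all.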
  5. ABSTRACT We introduce MF-Box, an extended version of MFEmulator, designed as a fast surrogate for power spectra, trained on N-body simulation suites of various box sizes and particle loads. To demonstrate MF-Box's effectiveness, we design simulation suites that include low-fidelity (LF) suites (L1 and L2) at 256 and 100 Mpc h⁻¹, each with 128³ particles, and a high-fidelity (HF) suite with 512³ particles at 256 Mpc h⁻¹, representing a higher particle load than the LF suites. MF-Box acts as a probabilistic resolution-correction function, learning most of the cosmological dependence from the L1 and L2 simulations and rectifying resolution differences with just three HF simulations using a Gaussian process. MF-Box successfully emulates power spectra from our HF testing set with a relative error of < 3 per cent up to k ≃ 7 h Mpc⁻¹ at z ∈ [0, 3], while maintaining a cost similar to our previous multifidelity approach, which was accurate only up to z = 1. The addition of an extra LF node in a smaller box significantly improves the emulation accuracy of MF-Box at k > 2 h Mpc⁻¹, by a factor of 10. We conduct an error analysis of MF-Box based on computational budget, providing guidance for optimizing budget allocation per fidelity node. The proposed MF-Box enables future surveys to efficiently combine simulation suites of varying quality, effectively expanding the range of emulation capabilities while ensuring cost efficiency.
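The resolution-correction idea — learn a multiplicative correction to the LF power spectra from only a few HF runs, then apply it across the LF suite — can be sketched with a toy example (the spectra and damping form below are hypothetical, and a plain average replaces the Gaussian process MF-Box fits over cosmology):

```python
import numpy as np

# Hypothetical toy power spectra: an LF suite whose resolution damps the
# high-k tail, and three "HF" runs used to learn the correction.
k = np.logspace(-1, 1, 50)                 # illustrative h/Mpc grid

def p_hf(k, amp):
    return amp * k**-2

def p_lf(k, amp):
    return amp * k**-2 / (1 + (k / 5.0)**2)    # resolution-damped tail

amps_train = [0.8, 1.0, 1.2]               # three training "cosmologies"
ratios = [p_hf(k, a) / p_lf(k, a) for a in amps_train]

# In this toy the log-ratio is cosmology-independent, so averaging the
# three HF/LF ratios recovers the full correction; MF-Box instead learns
# it probabilistically with a Gaussian process over cosmology.
log_corr = np.mean(np.log(ratios), axis=0)

p_pred = p_lf(k, 0.9) * np.exp(log_corr)   # corrected LF prediction
err = np.max(np.abs(p_pred / p_hf(k, 0.9) - 1))
print(f"max relative error of corrected spectrum: {err:.2e}")
```

The LF runs carry the parameter dependence; the handful of HF runs only need to pin down how resolution distorts the spectrum, which is why three of them suffice.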