Title: An Efficient Multifidelity Model for Assessing Risk Probabilities in Power Systems under Rare Events
Risk assessment of power system failures induced by low-frequency, high-impact rare events is of paramount importance to power system planners and operators. In this paper, we develop a cost-effective multi-surrogate method based on multifidelity modeling for assessing risks in probabilistic power-flow analysis under rare events. Specifically, multiple polynomial-chaos-expansion-based surrogate models are constructed to reproduce power system responses to stochastic load changes and the random occurrence of component outages. These surrogates then propagate a large number of samples at negligible computational cost and thus efficiently screen out the samples associated with high-risk rare events. The results generated by the surrogates, however, may be biased for samples located in the low-probability tail regions that are critical to power system risk assessment. To resolve this issue, the original high-fidelity power system model is adopted to fine-tune the estimates of the low-fidelity surrogates by reevaluating only a small portion of the samples. This multifidelity approach greatly improves the computational efficiency of the traditional Monte Carlo method for computing risk-event probabilities under rare events without sacrificing accuracy.
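The screen-then-correct loop described above can be sketched in a few lines. Everything here is a toy stand-in: the one-dimensional response functions, the risk threshold, and the re-evaluation margin are invented for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the models: a cheap surrogate and the "exact"
# high-fidelity response (e.g., a line loading) as functions of an uncertain input.
def high_fidelity(x):
    return np.sin(3 * x) + 0.1 * x**2

def surrogate(x):
    # low-fidelity approximation with a small oscillatory error
    return np.sin(3 * x) + 0.1 * x**2 + 0.05 * np.cos(10 * x)

threshold = 0.9          # response above this level is a "risk event"
margin = 0.1             # safety band around the threshold for re-evaluation
x = rng.normal(0.0, 1.0, 100_000)   # cheap Monte Carlo input samples

y = surrogate(x)                    # negligible cost per sample
suspect = y > threshold - margin    # samples whose tail classification may be biased
y[suspect] = high_fidelity(x[suspect])   # re-run only these through the full model

p_risk = float(np.mean(y > threshold))
print(f"re-evaluated {suspect.mean():.1%} of samples, P(risk) = {p_risk:.4f}")
```

The point of the sketch is the printed ratio: only samples whose surrogate response lands near or above the risk threshold are re-run through the expensive model, so the tail probability is corrected at a small fraction of the full Monte Carlo cost.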
Award ID(s):
1917308 1711191
PAR ID:
10157224
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
53rd Hawaii International Conference on System Sciences 2020
Page Range / eLocation ID:
1-10
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Data assimilation is a Bayesian inference process that obtains an enhanced understanding of a physical system of interest by fusing information from an inexact physics-based model and from noisy, sparse observations of reality. The multifidelity ensemble Kalman filter (MFEnKF) recently developed by the authors combines a full-order physical model and a hierarchy of reduced-order surrogate models in order to increase the computational efficiency of data assimilation. The standard MFEnKF uses linear couplings between models and is statistically optimal in the case of Gaussian probability densities. This work extends the MFEnKF to make use of a broader class of surrogate models, such as those based on machine-learning methods like autoencoders, which permit nonlinear couplings between the model hierarchies. We identify the right-invertibility property of autoencoders as a key predictor of the forecasting power of autoencoder-based reduced-order models. We propose a methodology that allows us to construct reduced-order surrogate models that are more accurate than those obtained via conventional linear methods. Numerical experiments with the canonical Lorenz'96 model illustrate that nonlinear surrogates perform better than linear projection-based ones in the context of multifidelity ensemble Kalman filtering. We additionally show a large-scale proof-of-concept result with the quasi-geostrophic equations, demonstrating the competitiveness of the method with a traditional reduced-order-model-based MFEnKF.
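The multifidelity coupling at the heart of such schemes is a control-variate construction. A minimal one-dimensional sketch of that structure (the ensemble sizes and the surrogate's +0.3 bias are invented, purely for illustration) shows why a surrogate's systematic error cancels:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: the truth is x ~ N(0, 1); the reduced-order surrogate adds a
# systematic bias of +0.3 (an invented value, purely for illustration).
N_hi, N_lo = 20, 2000
x_hi = rng.normal(0.0, 1.0, N_hi)              # small, expensive full-order ensemble
x_lo_coupled = x_hi + 0.3                      # surrogate run on the SAME members
x_lo_indep = rng.normal(0.0, 1.0, N_lo) + 0.3  # large, independent surrogate ensemble

# Telescoping control-variate mean estimate: the +0.3 bias enters the two
# surrogate terms with opposite signs and cancels, leaving roughly the
# variance of the large cheap ensemble instead of the small expensive one.
mu_mf = x_hi.mean() - x_lo_coupled.mean() + x_lo_indep.mean()
print(f"multifidelity mean {mu_mf:+.3f} vs high-fidelity-only {x_hi.mean():+.3f}")
```

The nonlinear extension described above replaces the linear mapping implicit in the coupled surrogate term with an autoencoder, but the telescoping-sum structure is the same.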
  2. A modern power system is characterized by an increasing penetration of wind power, which results in large uncertainties in its states. These uncertainties must be quantified properly; otherwise, system security may be threatened. Facing this challenge, we propose a cost-effective, data-driven approach for the probabilistic load-margin assessment problem. Using actual wind data, a kernel density estimator is applied to infer the nonparametric wind speed distributions, which are further merged into the framework of a vine copula. The latter enables us to simulate complex multivariate and highly dependent model inputs with a variety of bivariate copulae that precisely represent the tail dependence in the correlated samples. Furthermore, to reduce the prohibitive computational time of traditional Monte Carlo simulations processing a large number of samples, we propose to use a nonparametric, Gaussian-process-emulator-based reduced-order model to replace the original complicated continuation power-flow model through a Bayesian learning framework. To accelerate the convergence rate of this Bayesian algorithm, a truncated polynomial-chaos surrogate is developed, which serves as a highly efficient, parametric Bayesian prior. This emulator allows us to execute the time-consuming continuation power-flow solver at the sampled values with negligible computational cost. Simulation results reveal the impressive performance of the proposed method.
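The first stage of such a pipeline, fitting a nonparametric density to wind data and resampling from it, can be sketched with SciPy. The Weibull draw below stands in for actual wind records, and the vine-copula dependence step is omitted:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)

# Hypothetical wind-speed record in m/s; a Weibull draw stands in for data.
speeds = rng.weibull(2.0, 5000) * 8.0

kde = gaussian_kde(speeds)                       # nonparametric density estimate
samples = kde.resample(10_000, seed=3).ravel()   # inputs for the downstream study

print(f"data mean {speeds.mean():.2f} m/s, resampled mean {samples.mean():.2f} m/s")
```

The resampled inputs preserve the shape of the empirical distribution without committing to a parametric family, which is the appeal of the kernel-density step.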
  3. Facing stochastic variations of the loads due to an increasing penetration of renewable energy generation, online decision making under uncertainty in modern power systems has captured power researchers' attention in recent years. To address this issue while achieving a good balance between system security and economic objectives, we propose a surrogate-enhanced scheme under a joint chance-constrained (JCC) optimal power-flow (OPF) framework. Starting from a stochastic-sampling procedure, we first utilize copula theory to simulate the dependence among multivariate uncertain inputs. Then, to reduce the prohibitive computational time required by the traditional Monte Carlo (MC) method, we propose to use a polynomial-chaos-based surrogate that allows us to efficiently evaluate the power-system model at non-Gaussian distributed sampled values with negligible computing cost. Learning from the MC-simulated samples, we further propose a hybrid adaptive approach to overcome the conservativeness of the JCC-OPF by utilizing the correlation of the system states, which is ignored in the traditional Boole's inequality. The simulations conducted on the modified Illinois test system demonstrate the excellent performance of the proposed method.
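The conservativeness of Boole's inequality mentioned above can be seen directly on surrogate-propagated samples. In this toy sketch the bivariate-normal "line flows", their limits, and the 0.8 correlation are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented surrogate-propagated samples of two correlated "line flows" (p.u.).
mean = [0.80, 0.85]
cov = [[0.0100, 0.0080],
       [0.0080, 0.0100]]                 # correlation 0.8 between the two states
flows = rng.multivariate_normal(mean, cov, 50_000)
limits = np.array([1.0, 1.0])

# Empirical joint chance constraint: P(both limits hold simultaneously).
joint_ok = np.all(flows <= limits, axis=1).mean()
# Boole's inequality sums the marginal violation probabilities, ignoring the
# correlation between the states, and therefore understates joint satisfaction.
boole_ok = 1.0 - (flows > limits).mean(axis=0).sum()

print(f"joint: {joint_ok:.4f}, Boole bound: {boole_ok:.4f}")
```

Because violations of correlated constraints tend to coincide, the union bound double-counts them; exploiting that correlation is exactly what makes a JCC formulation less conservative.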
  4. Wildfires pose an escalating risk to communities and infrastructure, especially in regions undergoing increased fuel dryness and temperature extremes driven by climate change, as well as continued expansion into the wildland-urban interface (WUI). Probabilistic wildfire risk assessment provides a rigorous means of quantifying potential impacts, but its application is often hindered by the high computational cost of working with hundreds of thousands of complex wildfire scenarios. This study introduces a novel scenario reduction framework tailored to the unique characteristics of wildfire hazards, which often lack standard intensity metrics and exhibit highly nonlinear, spatially distributed behavior. The proposed framework selects a subset of scenarios that best represent the spatial and statistical diversity of the full dataset, thereby greatly reducing computational costs while accounting for uncertainties. This is achieved by mapping complex wildfire scenarios into a high-dimensional feature space, enabling similarity assessments based on spatial consequence patterns rather than standard intensity metrics. A k-medoids clustering approach is then used to identify a representative subset of scenarios, while an active-learning-based outlier selection procedure incorporates rare but high-impact events without inflating computational demands. The framework was first demonstrated using a simple illustrative example to show how its performance responds to different data characteristics. To further demonstrate the practicality of the framework, it was used for wildfire risk assessment in Spokane County, Washington, where the full dataset (1000 scenarios) was reduced to 41 representative scenarios while preserving the spatial patterns of burn probability and building damage with high fidelity. The results demonstrated that the framework significantly improves computational efficiency and accuracy compared to traditional scenario reduction methods, offering a scalable and flexible tool for probabilistic wildfire risk assessment.
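The clustering step can be sketched with a minimal PAM-style k-medoids over feature vectors. The two synthetic "scenario" clusters and the plain Euclidean distance below are illustrative only; the study's feature mapping and active-learning outlier selection are omitted:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy "scenarios": each row is a feature vector summarizing one wildfire's
# spatial consequence pattern; two synthetic clusters of mild/severe events.
scenarios = np.vstack([
    rng.normal(0.0, 1.0, (40, 3)),   # mild events
    rng.normal(5.0, 1.0, (40, 3)),   # severe events
])

def k_medoids(X, k, iters=20, seed=0):
    """Minimal PAM-style k-medoids on Euclidean distances (illustration only)."""
    r = np.random.default_rng(seed)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)  # pairwise distances
    medoids = r.choice(len(X), k, replace=False)
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)         # assign to nearest medoid
        new = []
        for j in range(k):
            members = np.flatnonzero(labels == j)
            within = D[np.ix_(members, members)].sum(axis=1)
            new.append(members[within.argmin()])          # best-centered member
        new = np.asarray(new)
        if np.array_equal(new, medoids):
            break
        medoids = new
    return medoids, labels

medoids, labels = k_medoids(scenarios, k=2)
print("representative scenarios:", np.sort(scenarios[medoids][:, 0]).round(1))
```

Because medoids are actual members of the dataset, each cluster representative is a real, simulatable scenario rather than a synthetic average, which is why k-medoids suits scenario reduction better than k-means.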
  5. Abstract: In the design of stellarators, energetic particle confinement is a critical point of concern that remains challenging to study numerically. Standard Monte Carlo (MC) analyses are highly expensive because a large number of particle trajectories must be integrated over long time scales, and small time steps must be taken to accurately capture the features of the wide variety of trajectories. Even when they are based on guiding-center trajectories, as opposed to full-orbit trajectories, these standard MC studies are too expensive to be included in most stellarator optimization codes. We present the first multifidelity Monte Carlo (MFMC) scheme for accelerating the estimation of energetic particle confinement in stellarators. Our approach relies on a two-level hierarchy, in which a guiding-center model serves as the high-fidelity model and a data-driven linear interpolant is leveraged as the low-fidelity surrogate model. We apply MFMC to the study of energetic particle confinement in a four-period quasi-helically symmetric stellarator, assessing various metrics of confinement. Owing to the very high computational efficiency of our surrogate model and its sufficient correlation with the high-fidelity model, we obtain speedups of up to 10 with MFMC compared to standard MC.
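A two-level MFMC estimator of a confinement metric has a simple control-variate form. In this sketch the functions standing in for the guiding-center model and the data-driven interpolant are invented, and the weight `alpha` is fixed at 1 rather than optimized:

```python
import numpy as np

rng = np.random.default_rng(6)

# Invented stand-ins: f_hi plays the expensive guiding-center confinement
# metric, f_lo a cheap, strongly correlated surrogate of the same quantity.
def f_hi(x):
    return np.exp(-x**2) + 0.05 * np.sin(5 * x)

def f_lo(x):
    return np.exp(-x**2)

N_hi, N_lo = 200, 20_000
x_lo = rng.normal(0.0, 1.0, N_lo)
x_hi = x_lo[:N_hi]          # the few expensive runs reuse the first cheap inputs

alpha = 1.0                 # control-variate weight; the variance-optimal value
                            # would be rho * sigma_hi / sigma_lo, fixed at 1 here
mfmc = f_hi(x_hi).mean() + alpha * (f_lo(x_lo).mean() - f_lo(x_hi).mean())
print(f"MFMC estimate {mfmc:.4f} vs plain MC {f_hi(x_hi).mean():.4f}")
```

The estimator stays unbiased because the two surrogate means cancel in expectation, while most of the sampling noise is absorbed by the large cheap ensemble; the achievable speedup depends on how strongly the surrogate correlates with the high-fidelity model, exactly as the abstract notes.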