-
Security concerns have been raised about cascading failure risks in evolving power grids. This paper reveals, for the first time, that the risk of cascading failures can be increased at low network demand levels when considering security-constrained generation dispatch. This occurs because critical transmission corridors become very highly loaded due to the presence of centralized generation dispatch, e.g., large thermal plants far from demand centers. This increased cascading risk is revealed in this work by incorporating security-constrained generation dispatch into the risk assessment and mitigation of cascading failures. A security-constrained AC optimal power flow, which considers economic functions and security constraints (e.g., network constraints, N-1 security, and generation margin), is used to provide a representative day-ahead operational plan. Cascading failures are simulated using two simulators, a quasi-steady-state DC power flow model and a dynamic model incorporating all frequency-related dynamics, to allow for result comparison and verification. The risk assessment procedure is illustrated using synthetic networks of 200 and 2,000 buses. Further, a novel preventive mitigation measure is proposed to first identify critical lines, whose failures are likely to trigger cascading failures, and then to limit power flow through these critical lines during dispatch. Results show that shifting power equivalent to 1% of total demand from critical lines to other lines can reduce cascading risk by up to 80%.
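As an illustration of the quasi-steady-state DC power flow cascade simulation mentioned in this abstract, the following minimal sketch re-solves a DC power flow and trips overloaded lines until no limits are violated. The toy network data, function names, and the lack of islanding handling are assumptions for illustration, not the paper's simulators.

```python
import numpy as np

def dc_flows(lines, injections, n_bus, slack=0):
    """Solve a DC power flow and return per-line flows (per unit)."""
    B = np.zeros((n_bus, n_bus))
    for f, t, x, _ in lines:
        b = 1.0 / x
        B[f, f] += b; B[t, t] += b
        B[f, t] -= b; B[t, f] -= b
    keep = [i for i in range(n_bus) if i != slack]   # drop the slack bus row/column
    theta = np.zeros(n_bus)
    theta[keep] = np.linalg.solve(B[np.ix_(keep, keep)], injections[keep])
    return [(theta[f] - theta[t]) / x for f, t, x, _ in lines]

def cascade(lines, injections, n_bus):
    """Trip overloaded lines and re-solve until no limit violations remain.
    Islanding is not handled; this is only a toy illustration."""
    active = list(lines)
    while True:
        flows = dc_flows(active, injections, n_bus)
        tripped = [ln for ln, fl in zip(active, flows) if abs(fl) > ln[3]]
        if not tripped:
            return active
        active = [ln for ln in active if ln not in tripped]

# Toy 3-bus network: (from bus, to bus, reactance, flow limit), all per unit.
lines = [(0, 1, 0.1, 1.0), (1, 2, 0.1, 0.5), (0, 2, 0.2, 1.0)]
injections = np.array([0.0, 0.8, -0.8])   # bus 0 is the slack bus
print(len(cascade(lines, injections, 3)), "of", len(lines), "lines survive")
```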
-
This paper develops a probabilistic earthquake risk assessment for the electric power transmission system in the City of Los Angeles. Via a DC load flow analysis of a suite of damage scenarios that reflect the seismic risk in Los Angeles, we develop a probabilistic representation for load shed during the restoration process. This suite of damage scenarios and their associated annual probabilities of occurrence are developed from 351 risk-adjusted earthquake scenarios using ground motion that collectively represent the seismic risk in Los Angeles at the census tract level. For each of these 351 earthquake scenarios, 12 damage scenarios are developed that form a probabilistic representation of the consequences of the earthquake scenario on the components of the transmission system. This analysis reveals that substation damage is the key driver of load shed. Damage to generators has a substantial but still secondary impact, and damage to transmission lines has significantly less impact. We identify the census tracts that are substantially more vulnerable to power transmission outages during the restoration process. Further, we explore the impact of forecasted increases in penetration of residential storage paired with rooftop solar. The deployment of storage paired with rooftop solar is represented at the census tract level and is assumed to be able to generate and store power for residential demand during the restoration process. The deployment of storage paired with rooftop solar reduces the load shed during the restoration process, but the distribution of this benefit is correlated with household income and whether the dwelling is owned or rented.
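A minimal sketch of the probability-weighted load-shed representation this abstract describes, assuming each damage scenario carries an annual probability of occurrence and a load-shed trajectory over the restoration process; the data layout and names are illustrative, not the paper's model.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DamageScenario:
    annual_prob: float          # annual probability of occurrence
    load_shed_mw: List[float]   # load shed at each restoration step (MW)

def expected_load_shed(scenarios: List[DamageScenario]) -> List[float]:
    """Probability-weighted load shed at each restoration step."""
    horizon = max(len(s.load_shed_mw) for s in scenarios)
    expected = [0.0] * horizon
    for s in scenarios:
        for t, shed in enumerate(s.load_shed_mw):
            expected[t] += s.annual_prob * shed
    return expected

# Toy example: two damage scenarios with different severities and probabilities.
scenarios = [
    DamageScenario(0.01, [500.0, 300.0, 100.0, 0.0]),
    DamageScenario(0.002, [1200.0, 900.0, 400.0, 100.0]),
]
print(expected_load_shed(scenarios))
```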
-
It is well known that interdependence between electric power systems and other infrastructures can impact energy reliability and resilience, but it is less clear which particular interactions have the most impact. There is a need for methods that can rank the relative importance of these interdependencies. This paper describes a new tool for measuring resilience and ranking interactions. This tool, known as Computing Resilience of Infrastructure Simulation Platform (CRISP), samples from historical utility data to avoid many of the assumptions required for simulation-based approaches to resilience quantification. This paper applies CRISP to rank the relative importance of four types of interdependence (natural gas supply, communication systems, nuclear generation recovery, and a generic restoration delay) in two test cases: the IEEE 39-bus test case and a 6394-bus model of the New England/New York power grid. The results confirm industry studies suggesting that a loss of the natural gas system is the most severe specific interdependence faced by this region.
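A minimal sketch of ranking interdependencies by their impact on a resilience cost such as energy not served (ENS); the simulate callable stands in for a CRISP-style sampled-event simulation and is an assumption, not the tool's actual interface.

```python
from typing import Callable, List, Tuple

def rank_interdependencies(
    simulate: Callable[[List[str]], float],
    interdependencies: List[str],
) -> List[Tuple[str, float]]:
    """Rank each interdependence by the increase in resilience cost it causes."""
    baseline = simulate([])                 # cost with no interdependence modeled
    impacts = []
    for dep in interdependencies:
        cost = simulate([dep])              # cost with this single interdependence active
        impacts.append((dep, cost - baseline))
    return sorted(impacts, key=lambda x: x[1], reverse=True)

# Toy stand-in for a sampled-event simulation returning ENS in MWh.
def toy_simulate(active_deps: List[str]) -> float:
    penalties = {"natural_gas": 420.0, "communications": 150.0,
                 "nuclear_recovery": 90.0, "generic_delay": 60.0}
    return 1000.0 + sum(penalties[d] for d in active_deps)

print(rank_interdependencies(
    toy_simulate,
    ["natural_gas", "communications", "nuclear_recovery", "generic_delay"]))
```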
-
We automatically extract resilience events and novel outage and restore processes from standard transmission utility detailed outage data. This new processing is applied to the outage data collected in NERC's Transmission Availability Data System to introduce and analyze statistics that quantify resilience of the transmission system against extreme weather events. These metrics (such as outage rate and duration, number of elements outaged, rated capacity outaged, restore duration, maximum simultaneous outages, and element-days lost) are calculated for all large weather-related events on the North American transmission system from 2015 to 2020 and then by the type of extreme weather that caused them, such as hurricanes, tornadoes, and winter storms. Finally, we study how the performance of the system with respect to the resilience metrics changed by season and year.
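A minimal sketch of computing a few of the event-level resilience metrics listed above (elements outaged, element-days lost, maximum simultaneous outages) from per-element outage records; the record layout is illustrative, not the TADS schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Outage:
    element_id: str
    start: datetime
    restore: datetime

def event_metrics(outages: List[Outage]) -> dict:
    """Compute a few illustrative resilience metrics for one weather event."""
    element_days_lost = sum((o.restore - o.start).total_seconds() / 86400 for o in outages)
    # Maximum simultaneous outages: sweep over sorted start/restore transitions.
    transitions = sorted([(o.start, 1) for o in outages] + [(o.restore, -1) for o in outages])
    depth, max_depth = 0, 0
    for _, step in transitions:
        depth += step
        max_depth = max(max_depth, depth)
    return {
        "elements_outaged": len({o.element_id for o in outages}),
        "element_days_lost": element_days_lost,
        "max_simultaneous_outages": max_depth,
    }

# Toy example with three overlapping outages during one event.
d = datetime
outs = [Outage("L1", d(2020, 2, 1, 6), d(2020, 2, 2, 6)),
        Outage("L2", d(2020, 2, 1, 8), d(2020, 2, 1, 20)),
        Outage("T1", d(2020, 2, 1, 12), d(2020, 2, 3, 0))]
print(event_metrics(outs))
```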
-
Earthquakes cause outages of power transmission system components due to direct physical damage and also through the initiation of cascading processes. This article explores the optimal capacity investments for increasing the resilience of electric power transmission systems to earthquakes, and how those investments change with respect to two issues: (1) the impact of including cascades in the investment optimization model and (2) the impact of focusing more heavily on the early stages of the outages after the earthquake, in contrast to focusing more evenly on outages across the entire restoration process. A cascading outage model driven by the statistics of sample utility data is developed and used to locate the cascading lines. We compare the investment plans with and without the modeling of the cascades and with different levels of importance attached to outages that occur during different periods of the restoration process. Using a case study of the Eastern Interconnect transmission grid, where the seismic hazard stems mostly from the New Madrid Seismic Zone, we find that the cascades have little effect on the optimal set of capacity enhancement investments. However, the cascades do have a significant impact on the early stages of the restoration process. Also, the cascading lines can be far away from the initial physically damaged lines. More broadly, the early stages of the earthquake restoration process are affected by the extent of the cascading outages and are critical for search and rescue as well as restoring vital services. Also, we show that an investment plan focusing more heavily on outages in the first 3 days after the earthquake yields fewer outages in the first month, but more outages later, in comparison with an investment plan focusing uniformly on outages over an entire 6-month restoration process.
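A minimal sketch of how weighting different restoration periods can change which capacity upgrades are selected under a budget; the candidate data and the simple greedy benefit-to-cost rule are illustrative assumptions, not the paper's optimization model.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Upgrade:
    name: str
    cost: float                       # investment cost
    avoided_outages: List[float]      # avoided element-outages per restoration period

def weighted_benefit(u: Upgrade, weights: List[float]) -> float:
    """Benefit as a weighted sum of avoided outages over restoration periods."""
    return sum(w * a for w, a in zip(weights, u.avoided_outages))

def greedy_plan(candidates: List[Upgrade], weights: List[float], budget: float) -> List[str]:
    """Pick upgrades by benefit-to-cost ratio until the budget is exhausted."""
    ranked = sorted(candidates,
                    key=lambda u: weighted_benefit(u, weights) / u.cost,
                    reverse=True)
    plan, spent = [], 0.0
    for u in ranked:
        if spent + u.cost <= budget:
            plan.append(u.name)
            spent += u.cost
    return plan

# Periods: first 3 days, rest of the first month, months 2-6.
early_focus = [0.7, 0.2, 0.1]      # emphasizes the first 3 days
uniform = [1/3, 1/3, 1/3]          # even focus across the restoration process
candidates = [Upgrade("line_A", 4.0, [30.0, 10.0, 2.0]),
              Upgrade("sub_B", 6.0, [10.0, 40.0, 20.0]),
              Upgrade("line_C", 3.0, [5.0, 15.0, 25.0])]
print(greedy_plan(candidates, early_focus, budget=6.0))
print(greedy_plan(candidates, uniform, budget=6.0))
```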