This paper presents a survey of the literature on strategies to enhance the resilience of power systems while shedding light on remaining research gaps. Using a deductive methodology, we reviewed more than two hundred peer-reviewed articles on power-system resilience spanning the 2010–2019 decade. We find a vacuum at the level of integration that considers the interdependence of local or decentralized decision making in an adaptive power system. This gap is widened by the absence of policies to enhance resilience in power networks. While there is significant coverage of, and convergence in, research on algorithms for solving multi-objective optimization problems, how to incorporate system degradation when designing these self-restoration systems remains uncharted territory. We posit that a shift to a smarter, cleaner, and more resilient power network requires sustained investment rather than disaster-induced responses.
Optimizing microgrid deployment for community resilience
The ability to (re)establish basic community infrastructure and governmental functions, such as medical and communication systems, after a natural disaster rests on a continuous supply of electricity. Traditional energy-generation systems consisting of power plants, transmission lines, and distribution feeders are becoming more vulnerable, given the increasing magnitude and frequency of climate-related natural disasters. We investigate the role that fuel cells, along with other distributed energy resources, play in post-disaster recovery efforts. We present a mixed-integer, non-linear optimization model that takes load and power-technology data as inputs and determines a cost-minimizing design and dispatch strategy while considering operational constraints. For realistic instances encompassing five technologies and a year-long time horizon at hourly fidelity, the model fails to achieve optimality gaps of less than 15%, on average, after two hours. Therefore, we devise a multi-phase methodology to expedite solutions, obtaining the best solution in under two minutes; after two hours, we provide proof of near-optimality, i.e., gaps averaging 5%. Solutions obtained with this methodology yield, on average, an 8% decrease in objective function value and utilize fuel cells three times more often than solutions obtained with a straightforward implementation employing a commercial solver.
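The design-and-dispatch idea above can be illustrated with a toy sketch. This is not the paper's mixed-integer non-linear model or its multi-phase methodology; all numbers, technology names, and the brute-force search are hypothetical stand-ins that only show how capital cost, operating cost, and merit-order dispatch trade off against each other.

```python
from itertools import product

# Toy data (hypothetical, for illustration only): two technologies,
# a 6-hour load profile, and a few candidate capacities in kW.
load = [40, 55, 70, 90, 65, 50]                    # kW demand per hour
techs = {
    "fuel_cell": {"capex_per_kw": 30.0, "opex_per_kwh": 0.10},
    "diesel":    {"capex_per_kw": 10.0, "opex_per_kwh": 0.35},
}
sizes = [0, 50, 100]                               # candidate capacities (kW)

def total_cost(design):
    """Capital cost of a design plus operating cost of a merit-order
    dispatch; designs that leave load unserved return infinity."""
    capex = sum(techs[t]["capex_per_kw"] * kw for t, kw in design.items())
    opex = 0.0
    # Dispatch the cheapest-to-run technology first in every hour.
    order = sorted(design, key=lambda t: techs[t]["opex_per_kwh"])
    for demand in load:
        remaining = demand
        for t in order:
            gen = min(design[t], remaining)
            opex += techs[t]["opex_per_kwh"] * gen
            remaining -= gen
        if remaining > 1e-9:
            return float("inf")                    # load not served
    return capex + opex

# Exhaustive search over the tiny design space (a stand-in for the
# paper's optimization model, which such enumeration cannot scale to).
best = min(
    (dict(zip(techs, combo)) for combo in product(sizes, repeat=len(techs))),
    key=total_cost,
)
print(best, round(total_cost(best), 2))
```

In a realistic instance the design space and horizon make enumeration hopeless, which is precisely why the abstract's multi-phase methodology and commercial solvers are needed.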
- Award ID(s):
- 2053856
- PAR ID:
- 10481150
- Publisher / Repository:
- Optimization and Engineering
- Date Published:
- Journal Name:
- Optimization and Engineering
- ISSN:
- 1389-4420
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Traditional smart meters, which measure energy usage every 15 minutes or more and report it at least a few hours later, lack the granularity needed for real-time decision-making. To address this practical problem, we introduce a new method using generative adversarial networks (GANs) that enforces temporal consistency on its high-resolution outputs via hard inequality constraints using convex optimization. A unique feature of our GAN model is that it is trained solely on slow-timescale aggregated historical energy data obtained from smart meters. The results demonstrate that the model can successfully create minute-by-minute temporally correlated profiles of power usage from 15-minute interval average power consumption information. This innovative approach, emphasizing inter-neuron constraints, offers a promising avenue for improved high-speed state estimation in distribution systems and enhances the applicability of data-driven solutions for monitoring and subsequently controlling such systems.
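The temporal-consistency requirement can be sketched as a projection step. This is not the paper's constraint layer (which uses convex optimization inside the network); it is a minimal post-processing illustration, assuming the simplest consistency condition — that each 15-minute window of the generated minute-level profile must average to the metered reading — for which the Euclidean projection has a closed form: shift each window by the gap between the metered and generated averages.

```python
def project_to_meter_averages(profile, meter_avgs, window=15):
    """profile: generated per-minute power values; meter_avgs: one average
    reading per `window`-minute interval. Returns a profile whose windowed
    averages exactly match the meter readings (closed-form projection)."""
    assert len(profile) == window * len(meter_avgs)
    out = []
    for i, target in enumerate(meter_avgs):
        block = profile[i * window:(i + 1) * window]
        shift = target - sum(block) / window       # constraint violation
        out.extend(v + shift for v in block)       # uniform correction
    return out

# Hypothetical example: two 15-minute meter readings and a rough
# generator output that is inconsistent with them.
meter = [2.0, 3.0]                        # kW averages from the smart meter
guess = [1.5] * 15 + [3.5] * 15           # generated minute-level profile
fixed = project_to_meter_averages(guess, meter)
# Each 15-sample window of `fixed` now averages exactly 2.0 and 3.0 kW.
```

A uniform shift preserves the generator's minute-to-minute shape while guaranteeing consistency with the aggregate data, which is the spirit of the hard-constraint approach described above.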
-
Direct ethanol fuel cells have been widely investigated as nontoxic and low-corrosive energy conversion devices with high energy and power densities. It remains challenging to develop high-activity, durable catalysts for a complete ethanol oxidation reaction on the anode and an accelerated oxygen reduction reaction on the cathode. The materials' physics and chemistry at the catalytic interface play a vital role in determining the overall performance of the catalysts. Herein, we propose a Pd/Co@N-C catalyst that can be used as a model system to study the synergism and engineering at the solid-solid interface. In particular, the transformation of amorphous carbon to highly graphitic carbon, promoted by cobalt nanoparticles, helps achieve a spatial confinement effect that prevents structural degradation of the catalysts. The strong catalyst-support and electronic effects at the interface between palladium and Co@N-C induce an electron-deficient state of palladium, which enhances electron transfer and improves activity and durability. The Pd/Co@N-C delivers a maximum power density of 438 mW cm−2 in direct ethanol fuel cells and can be operated stably for more than 1000 hours. This work presents a strategy for ingenious catalyst structural design that will promote the development of fuel cells and other sustainable energy-related technologies.
-
The Doppler radar located in Cayey, Puerto Rico, is a critical tool in early weather forecasting. During Hurricane Maria in September 2017, the radar was destroyed as a result of the strong winds. An X-band radar was used as a temporary solution, but X-band radars have a more limited range. In June 2018, a new Doppler radar was built and forecasting services were fully restored. This paper uses a five-dimensional project management model (5DPM) and complexity maps to identify and manage the sources of complexity in restoring the radar's functionality and maintaining capacity. When looking at the radar individually, it can be concluded that the radar is fully restored. However, rebuilding the radar is different from providing a resilient and sustainable capacity. To ensure that the radar remains functional during and after an adverse natural event, ensuring that the radar itself suffers no damage is not enough. One has to expand the project's footprint and use a whole-systems approach to look at the project within the framework of supporting critical infrastructure, thus increasing the project's complexity. For example, the radar has a power generator to supply energy in case of an electrical power failure. During a prolonged power failure, the radar may run out of fuel. If the roads and bridges are damaged, access to the site may be blocked, which compromises the radar's functionality. Based on the complexity analysis, it can be concluded that while the reconstruction of the Doppler radar to restore its functionality has finished, ensuring that it maintains the capability to adequately warn Puerto Rico residents of weather events during and after a natural disaster still needs to be addressed. Hurricane Maria increased awareness regarding Puerto Rico's critical infrastructure vulnerabilities.
The lessons learned from the natural disaster can be used to develop and implement a whole-systems approach to designing and building resilient and sustainable infrastructure. This paper contributes to the body of knowledge by demonstrating the concept of applying 5DPM to both individual projects and integrated systems. It moves restoration of services from a project-specific basis to a capacity-maintenance mode, which takes a whole-systems approach and thus expands the complexity footprint. This global focus helps ensure that critical infrastructure is resilient and sustainable.
-
Exascale computing enables unprecedented, detailed, and coupled scientific simulations which generate data on the order of tens of petabytes. Due to large data volumes, lossy compressors become indispensable as they enable better compression ratios and runtime performance than lossless compressors. Moreover, as high-performance computing (HPC) systems grow larger, they draw power on the scale of tens of megawatts. Data motion is expensive in time and energy; therefore, optimizing compressor and data-I/O power usage is an important step in reducing energy consumption to meet sustainable computing goals and stay within limited power budgets. In this paper, we explore efficient power consumption gains for the SZ and ZFP lossy compressors and data writing on a cloud HPC system while varying the CPU frequency, scientific data sets, and system architecture. Using this power consumption data, we construct a power model for lossy compression and present a tuning methodology that reduces the energy overhead of lossy compressors and data writing on HPC systems by 14.3% on average. We apply our model and find 6.5 kJ, or 13%, of savings on average for 512 GB of I/O. Utilizing our model therefore results in more energy-efficient lossy data compression and I/O.
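The core intuition behind frequency tuning can be sketched numerically. The figures below are hypothetical, not the paper's measured data: total energy is E(f) = P(f) · t(f), and because power rises with CPU frequency while compression-plus-write runtime falls, an intermediate frequency can minimize energy rather than the maximum one.

```python
# Hypothetical (power, runtime) measurements for one compression + write
# job at several CPU frequencies; real values would come from profiling.
measurements = {
    # f (GHz): (avg power in W, compress + write runtime in s)
    1.2: (95.0, 80.0),
    1.8: (120.0, 55.0),
    2.4: (160.0, 45.0),
    3.0: (210.0, 40.0),
}

def energy_j(freq):
    """Total energy in joules: average power times runtime."""
    power_w, runtime_s = measurements[freq]
    return power_w * runtime_s

best_f = min(measurements, key=energy_j)       # energy-minimizing frequency
baseline = energy_j(3.0)                       # naive choice: max frequency
savings = baseline - energy_j(best_f)          # joules saved by tuning
print(best_f, energy_j(best_f), savings)
```

With these illustrative numbers the minimum-energy point sits at a mid-range frequency, saving energy relative to running flat-out even though the job takes longer, which mirrors the trade-off the tuning methodology above exploits.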