Title: Bayesian Optimization and Hierarchical Forecasting of Non-Weather-Related Electric Power Outages
Power outage prediction is important for planning electric power system response, restoration, and maintenance efforts. Utility managers need to understand the impact of outages on the local distribution infrastructure in order to develop appropriate maintenance and resilience measures. Power outage prediction models in the literature are often limited in scope, typically tailored to model extreme-weather-related outage events. While these models are sufficient for predicting widespread outages from adverse weather events, they may fail to capture the more frequent, non-weather-related outages (NWO). In this study, we explore time series models of NWO by incorporating state-of-the-art techniques that leverage the Prophet model in Bayesian optimization and hierarchical forecasting. After defining a robust metric for NWO (the non-weather outage count index, NWOCI), time series forecasting models that leverage advanced preprocessing and forecasting techniques in Kats and Prophet, respectively, were built and tested using six years of daily state- and county-level outage data in Massachusetts (MA). We develop a Prophet model with Bayesian Tree-structured Parzen Estimator optimization (Prophet-TPE) using state-level outage data and a hierarchical Prophet-Bottom-Up model using county-level data. We find that these forecasting models outperform other Bayesian and hierarchical combinations of Prophet and Seasonal Autoregressive Integrated Moving Average (SARIMA) models in predicting NWOCI at both the county and state levels. Our time series trend decomposition reveals a concerning growth trend in NWO in MA. We conclude with a discussion of these observations and possible recommendations for mitigating NWO.
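The bottom-up reconciliation strategy used in the hierarchical Prophet-Bottom-Up model can be illustrated with a minimal sketch: forecasts are produced independently at the county level and the state-level forecast is obtained by summing them. The county names and forecast values below are hypothetical, not taken from the paper.

```python
def bottom_up_reconcile(county_forecasts):
    """Aggregate independent county-level forecasts into a
    state-level forecast by summing across counties (bottom-up)."""
    # Pair up the forecast horizons day by day, then sum each day.
    horizons = zip(*county_forecasts.values())
    return [sum(day) for day in horizons]

# Hypothetical 3-day-ahead NWOCI forecasts for three MA counties.
forecasts = {
    "Middlesex": [12.0, 14.0, 11.0],
    "Worcester": [8.0, 9.0, 7.0],
    "Essex":     [5.0, 6.0, 5.0],
}
state = bottom_up_reconcile(forecasts)
print(state)  # day-wise sums across counties: [25.0, 29.0, 23.0]
```

In practice each county series would be forecast by its own tuned Prophet model; summation is the simplest coherent reconciliation, guaranteeing that county forecasts add up exactly to the state forecast.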
Award ID(s):
1940176
NSF-PAR ID:
10392574
Author(s) / Creator(s):
;
Date Published:
Journal Name:
Energies
Volume:
15
Issue:
6
ISSN:
1996-1073
Page Range / eLocation ID:
1958
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract: Aggregated community-scale data could be harnessed to provide insights into the disparate impacts of managed power outages, burst pipes, and food inaccessibility during extreme weather events. During the winter storm that brought historically low temperatures, snow, and ice to the entire state of Texas in February 2021, Texas power-generating plant operators resorted to rolling blackouts to prevent collapse of the power grid when power demand overwhelmed supply. To reveal the disparate impact of managed power outages on vulnerable subpopulations in Harris County, Texas, which encompasses the city of Houston, we collected and analyzed community-scale big data using statistical and trend classification analyses. The results highlight the spatial and temporal patterns of impacts on vulnerable subpopulations in Harris County. The findings show a significant disparity in the extent and duration of power outages experienced by low-income and minority groups, suggesting the existence of inequality in the management and implementation of the power outage. Also, the extent of burst pipes and disrupted food access, as a proxy for storm impact, were more severe for low-income and minority groups. Insights provided by the results could form a basis from which infrastructure operators might enhance social equality during managed service disruptions in such events. The results and findings demonstrate the value of community-scale big data sources for rapid impact assessment in the aftermath of extreme weather events.
  2.
    Recent hurricane events have caused unprecedented amounts of damage on critical infrastructure systems and have severely threatened our public safety and economic health. The most observable (and severe) impact of these hurricanes is the loss of electric power in many regions, which causes breakdowns in essential public services. Understanding power outages and how they evolve during a hurricane provides insights on how to reduce outages in the future, and how to improve the robustness of the underlying critical infrastructure systems. In this article, we propose a novel scalable segmentation with explanations framework to help experts understand such datasets. Our method, CnR (Cut-n-Reveal), first finds a segmentation of the outage sequences based on the temporal variations of the power outage failure process so as to capture major pattern changes. This temporal segmentation procedure is capable of accounting for both the spatial and temporal correlations of the underlying power outage process. We then propose a novel explanation optimization formulation to find an intuitive explanation of the segmentation such that the explanation highlights the culprit time series of the change in each segment. Through extensive experiments, we show that our method consistently outperforms competitors in multiple real datasets with ground truth. We further study real county-level power outage data from several recent hurricanes (Matthew, Harvey, Irma) and show that CnR recovers important, non-trivial, and actionable patterns for domain experts, whereas baselines typically do not give meaningful results. 
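The temporal segmentation step can be illustrated with a deliberately simplified stand-in for CnR. The paper's actual formulation handles multiple spatially correlated series and produces explanations; the sketch below only finds the single best split point of one outage sequence under a within-segment squared-error cost, with toy data.

```python
def best_split(series):
    """Return the index that best splits the series into two
    segments, minimizing total within-segment squared error."""
    def sse(seg):
        # Squared error of a segment around its own mean.
        mean = sum(seg) / len(seg)
        return sum((x - mean) ** 2 for x in seg)

    best_idx, best_cost = None, float("inf")
    for i in range(1, len(series)):
        cost = sse(series[:i]) + sse(series[i:])
        if cost < best_cost:
            best_idx, best_cost = i, cost
    return best_idx

# A toy county outage-count sequence with an obvious regime change.
outages = [2, 3, 2, 3, 20, 22, 21, 23]
print(best_split(outages))  # -> 4, the onset of the high-outage regime
```

Applying such a split recursively (and penalizing the number of segments) yields a full segmentation; CnR additionally optimizes which time series to highlight as the explanation for each change.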
  3. Real-time forecasting of non-stationary time series is a challenging problem, especially when the time series evolves rapidly. For such cases, it has been observed that ensemble models consisting of a diverse set of model classes can perform consistently better than individual models. In order to account for the non-stationarity of the data and the lack of availability of training examples, the models are retrained in real time using the most recently observed data samples. Motivated by the robust performance properties of ensemble models, we developed a Bayesian model averaging ensemble technique consisting of statistical, deep learning, and compartmental models for forecasting epidemiological signals, specifically, COVID-19 signals. We observed the epidemic dynamics go through several phases (waves). In our ensemble model, we observed that different model classes performed differently during the various phases. Armed with this understanding, in this paper, we propose a modification to the ensembling method to employ this phase information and use different weighting schemes for each phase to produce improved forecasts. However, predicting the phases of such time series is a significant challenge, especially when behavioral and immunological adaptations govern the evolution of the time series. We explore multiple datasets that can serve as leading indicators of trend changes and employ transfer entropy techniques to capture the relevant indicator. We propose a phase prediction algorithm to estimate the phases using the leading indicators. Using the knowledge of the estimated phase, we selectively sample the training data from similar phases. We evaluate our proposed methodology on our currently deployed COVID-19 forecasting model and the COVID-19 ForecastHub models. The overall performance of the proposed model is consistent across the pandemic. More importantly, it is ranked second during two critical rapid-growth phases in cases, regimes where the performance of most models from the ForecastHub dropped significantly.
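The phase-dependent weighting idea can be sketched as follows. The forecasts and weights here are hypothetical placeholders; the paper's ensemble derives its weights via Bayesian model averaging over statistical, deep learning, and compartmental models.

```python
def phase_weighted_ensemble(forecasts, phase_weights, phase):
    """Combine per-model forecasts using weights chosen for the
    estimated epidemic phase (e.g., 'growth' vs. 'decline')."""
    weights = phase_weights[phase]
    total = sum(weights.values())
    # Normalized weighted average of the member forecasts.
    return sum(weights[m] * f for m, f in forecasts.items()) / total

# Hypothetical one-step-ahead case forecasts from three model classes.
forecasts = {"statistical": 100.0, "deep": 120.0, "compartmental": 110.0}
# Hypothetical per-phase weights, learned separately for each phase.
phase_weights = {
    "growth":  {"statistical": 0.2, "deep": 0.5, "compartmental": 0.3},
    "decline": {"statistical": 0.5, "deep": 0.2, "compartmental": 0.3},
}
print(phase_weighted_ensemble(forecasts, phase_weights, "growth"))  # 113.0
```

The phase label itself would come from the leading-indicator phase prediction algorithm described above; switching phases simply swaps in a different weight vector.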
  4.
    As the COVID-19 pandemic evolves, reliable prediction plays an important role in policymaking. The classical infectious disease model SEIR (susceptible-exposed-infectious-recovered) is a compact yet simplistic temporal model. The data-driven machine learning models such as RNN (recurrent neural networks) can suffer in case of limited time series data such as COVID-19. In this paper, we combine SEIR and RNN on a graph structure to develop a hybrid spatiotemporal model to achieve both accuracy and efficiency in training and forecasting. We introduce two features on the graph structure: node feature (local temporal infection trend) and edge feature (geographic neighbor effect). For node feature, we derive a discrete recursion (called I-equation) from SEIR so that gradient descend method applies readily to its optimization. For edge feature, we design an RNN model to capture the neighboring effect and regularize the landscape of loss function so that local minima are effective and robust for prediction. The resulting hybrid model (called IeRNN) improves the prediction accuracy on state-level COVID-19 new case data from the US, out-performing standard temporal models (RNN, SEIR, and ARIMA) in 1-day and 7-day ahead forecasting. Our model accommodates various degrees of reopening and provides potential outcomes for policymakers. 
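The discrete SEIR recursion at the core of such hybrids can be sketched directly. The parameter values and initial state below are illustrative, not the paper's; the compartments are population fractions, and the update conserves their sum.

```python
def seir_step(s, e, i, r, beta, sigma, gamma):
    """One discrete-time SEIR update on population fractions.
    beta: transmission rate (S->E via contact with I),
    sigma: incubation rate (E->I), gamma: recovery rate (I->R)."""
    new_exposed = beta * s * i
    new_infectious = sigma * e
    new_recovered = gamma * i
    return (s - new_exposed,
            e + new_exposed - new_infectious,
            i + new_infectious - new_recovered,
            r + new_recovered)

# Illustrative parameters and initial state (not from the paper):
# 1% of the population initially infectious.
state = (0.99, 0.0, 0.01, 0.0)
for _ in range(5):
    state = seir_step(*state, beta=0.3, sigma=0.2, gamma=0.1)
print(state)  # fractions still sum to 1 after the updates
```

In the IeRNN hybrid, a recursion of this kind (the I-equation) supplies the local temporal infection trend as a node feature, while the RNN models the geographic neighbor effect on the graph edges.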
  5.
    Predicting workload behavior during execution is essential for dynamic resource optimization of processor systems. Early studies used simple prediction algorithms such as history tables. More recently, researchers have applied advanced machine learning regression techniques. Workload prediction can be cast as a time series forecasting problem. Time series forecasting is an active research area with recent advances that have not been studied in the context of workload prediction. In this paper, we first perform a comparative study of representative time series forecasting techniques to predict the dynamic workload of applications running on a CPU. We adapt state-of-the-art matrix profile and dynamic linear models (DLMs) not previously applied to workload prediction and compare them against traditional SVM and LSTM models that have been popular for handling non-stationary data. We find that all time series forecasting models struggle to predict abrupt workload changes. These changes occur because workloads go through phases, where prior work has studied workload phase detection, classification, and prediction. We propose a novel approach that combines time series forecasting with phase prediction. We process each phase as a separate time series and train one forecasting model per phase. At runtime, forecasts from phase-specific models are selected and combined based on the predicted phase behavior. We apply our approach to forecasting of SPEC workloads running on a state-of-the-art Intel machine. Our results show that an LSTM-based phase-aware predictor can forecast workload CPI with less than 8% mean absolute error while reducing CPI error by more than 12% on average compared to a non-phase-aware approach.
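The one-model-per-phase scheme described above can be sketched with trivially simple per-phase models. Here per-phase running means stand in for the paper's LSTM forecasters, and the phase labels and CPI values are hypothetical.

```python
class PhaseAwareForecaster:
    """Train one simple model (here, a running mean) per phase;
    at runtime, forecast with the model matching the predicted phase."""
    def __init__(self):
        self.models = {}  # phase label -> list of observed values

    def train(self, samples):
        # samples: iterable of (phase_label, observed_value) pairs.
        for phase, value in samples:
            self.models.setdefault(phase, []).append(value)

    def forecast(self, predicted_phase):
        # Select the phase-specific model and emit its forecast.
        history = self.models[predicted_phase]
        return sum(history) / len(history)

# Hypothetical CPI observations labeled with workload phases.
f = PhaseAwareForecaster()
f.train([("compute", 0.9), ("compute", 1.1),
         ("memory", 3.0), ("memory", 3.2)])
print(f.forecast("memory"))  # mean of the memory-phase history
```

The key design point is that training data never mixes across phases, so each model only has to fit one regime; the hard part, as the paper notes, is predicting which phase comes next.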