Abstract

Purpose: As a challenging but important optimization problem, inverse planning for volumetric modulated arc therapy (VMAT) has attracted much research attention. The column generation (CG) method is so far one of the most effective solution schemes. However, it often relies on simplifications that lead to significant gaps between the output and an actually deliverable plan. This paper presents a novel column generation (NCG) approach that pushes the planning results substantially closer to practice.

Methods: The proposed NCG algorithm is equipped with multiple new quality-enhancing and computation-facilitating modules: (1) flexible constraints are enabled on both dose rates and treatment time, adapting to machine capabilities and the planner's preferences, respectively; (2) a cross-control-point intermediate aperture simulation is incorporated to better conform to the underlying physics; (3) new pricing and pruning subroutines are adopted to achieve better optimization outputs. To evaluate the effectiveness of NCG, five VMAT plans (three prostate cases and two head-and-neck cases) were computed using the proposed NCG, and the planning results were compared with those yielded by a historical benchmark planning scheme.

Results: NCG generated plans of significantly better quality than the benchmark planning algorithm. For the prostate cases, NCG plans satisfied all planning target volume (PTV) criteria, whereas CG plans failed the D10% criteria of the PTVs by 9 Gy or more in all cases. For the head-and-neck cases, NCG plans again satisfied all PTV criteria, while CG plans failed the D10% criteria of the PTVs by 3 Gy or more in all cases, as well as the maximum-dose criteria of both the cord and the brain stem by over 13 Gy in one case. Moreover, the pruning scheme was found to be effective in enhancing optimization quality.

Conclusions: The proposed NCG inherits the computational advantages of traditional CG while capturing a more realistic characterization of the machine capability and underlying physics. Its output solutions are substantially closer to practical implementation.
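The pricing-and-pruning idea mentioned in the abstract above can be sketched as the generic column generation step it builds on. The following is a minimal, hypothetical illustration (the function names, candidate columns, and dual values are invented, not taken from the paper): given dual prices from a restricted master problem, candidate columns are scored by reduced cost, and only improving columns (negative reduced cost) survive pruning.

```python
def reduced_cost(cost, column, duals):
    """Reduced cost c_j - sum_i pi_i * a_ij of a candidate column."""
    return cost - sum(pi * a for pi, a in zip(duals, column))

def price_and_prune(candidates, duals, tol=1e-9):
    """Score candidates, prune non-improving ones, return best first."""
    scored = [(reduced_cost(c, col, duals), c, col) for c, col in candidates]
    improving = [s for s in scored if s[0] < -tol]  # only negative reduced cost
    return sorted(improving, key=lambda s: s[0])

# Toy data: (cost, coefficient column) pairs and dual prices from the master.
duals = [0.5, 1.0, 0.25]
candidates = [(1.0, [1, 0, 1]), (2.0, [1, 1, 1]), (0.5, [0, 1, 0])]
best = price_and_prune(candidates, duals)
# Only the third candidate prices out favorably (reduced cost -0.5).
```

In a full CG loop this step alternates with re-solving the restricted master problem until no candidate has negative reduced cost; the paper's contribution, per the abstract, lies in how the pricing and pruning subroutines are designed, not in this generic skeleton.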
How to Assess Uncertainty-Aware Frameworks for Power System Planning?
Computational advances, along with the profound impact of uncertainty on power system investments, have motivated the creation of power system planning frameworks that handle long-run uncertainty, a large number of alternative plans, and multiple objectives. Planning agencies seek guidance in assessing such frameworks. This article addresses this need in two ways. First, we augment previously proposed criteria for assessing planning frameworks by including new criteria, such as stakeholder acceptance, to make the assessments more comprehensive, while enhancing the practical applicability of the criteria by offering criterion-specific themes and questions. Second, using the proposed criteria, we compare two widely used but fundamentally distinct frameworks: an ‘agree-on-plans’ framework, Robust Decision Making (RDM), and an ‘agree-on-assumptions’ framework centered around Stochastic Programming (SP). Comparing the two distinct frameworks head-to-head for the first time on an electricity supply planning problem under uncertainty in Bangladesh, we conclude that RDM relies on a large number of simulations to provide ample information to decision makers and stakeholders and to facilitate updating of subjective inputs. In contrast, SP is a high-dimensional optimization problem that identifies plans with relatively good probability-weighted performance in a single step, but even with computational advances it remains subject to the curse of dimensionality.
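The ‘agree-on-assumptions’ SP framework described above chooses a single plan that minimizes probability-weighted cost across scenarios. A minimal two-stage sketch, with invented numbers and a brute-force search standing in for a real optimizer (the article's actual model is far larger): pick a first-stage capacity build that minimizes build cost plus expected recourse (shortfall) cost over demand scenarios.

```python
def expected_cost(build, scenarios, build_cost=10.0, shortfall_cost=40.0):
    """First-stage build cost plus probability-weighted shortfall penalty."""
    recourse = 0.0
    for prob, demand in scenarios:
        shortfall = max(0.0, demand - build)  # second-stage recourse
        recourse += prob * shortfall * shortfall_cost
    return build_cost * build + recourse

# Toy scenario set: (probability, demand) pairs summing to probability 1.
scenarios = [(0.5, 8.0), (0.3, 12.0), (0.2, 20.0)]
best_build = min(range(0, 25), key=lambda b: expected_cost(b, scenarios))
# The probability-weighted optimum covers the median scenario (build = 12)
# rather than the extreme one, illustrating SP's single-step trade-off.
```

RDM, by contrast, would evaluate many candidate builds across a large ensemble of scenario realizations and report trade-offs to stakeholders rather than returning one probability-weighted optimum.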
- Award ID(s): 2330450
- PAR ID: 10558323
- Publisher / Repository: IEEE
- Date Published:
- Journal Name: IEEE Transactions on Energy Markets, Policy and Regulation
- ISSN: 2771-9626
- Page Range / eLocation ID: 1 to 13
- Subject(s) / Keyword(s): Robust Decision Making, Stochastic Programming, Power System Planning, Deep Uncertainty, Risk Analysis
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Over the past two decades, there has been explosive growth in the application of robust optimization in operations management (robust OM), fueled by both significant advances in optimization theory and a volatile business environment that has led to rising concerns about model uncertainty. We review some common modeling frameworks in robust OM, including the representation of uncertainty and the decision-making criteria, and sources of model uncertainty that have arisen in the literature, such as demand, supply, and preference. We discuss the successes of robust OM in addressing model uncertainty, enriching decision criteria, generating structural results, and facilitating computation. We also discuss several future research opportunities and challenges.
-
Power grids based on traditional N-1 design criteria are no longer adequate because these designs do not withstand extreme weather events or cascading failures. Microgrid systems can enhance grid resilience through defensive or islanded operation during contingencies. This paper presents a probabilistic framework for planning a resilient distribution system via distributed wind and solar integration. We first define three aspects of a resilient distribution system: prevention, survivability, and recovery. We then review distributed generation planning models that incorporate moment estimation, chance constraints, and bi-directional power flow. We strive to achieve two objectives: (1) enhancing grid survivability when distribution lines are damaged or disconnected in the aftermath of a disaster; and (2) accelerating the recovery of damaged assets through proactive maintenance and repair services. A simple 9-node network is provided to demonstrate the application of the proposed resilience planning framework.
-
The rise of machine learning-driven decision-making has sparked a growing emphasis on algorithmic fairness. Within clustering, the notion of balance is used as a fairness criterion: a clustering mechanism is considered fair when the resulting clusters maintain a consistent proportion of observations from the distinct groups delineated by protected attributes. Building on this idea, the literature has rapidly incorporated a myriad of extensions, devising fair versions of existing frequentist clustering algorithms (e.g., k-means, k-medoids) that aim to minimize specific loss functions. These approaches lack uncertainty quantification for the optimal clustering configuration and provide only clustering boundaries, without quantifying the probability that each observation belongs to the different clusters. In this article, we offer a novel probabilistic formulation of the fair clustering problem that facilitates valid uncertainty quantification even under mild model misspecification, without incurring substantial computational overhead. Mixture model-based fair clustering frameworks facilitate automatic uncertainty quantification but tend to be brittle under model misspecification and involve significant computational challenges. To circumvent such issues, we propose a generalized Bayesian fair clustering framework that inherently enjoys a decision-theoretic interpretation. Moreover, we devise efficient computational algorithms that crucially leverage techniques from the existing literature on optimal transport and loss-function-based clustering. The gains from the proposed methodology are showcased via numerical experiments and real data examples.
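The balance criterion described in this abstract admits a simple computation. A hypothetical sketch with toy data (one common variant, taken as the minimum over clusters of the ratio between the least- and most-represented protected groups; the article's exact definition may differ):

```python
from collections import Counter

def cluster_balance(labels, groups):
    """Minimum over clusters of (smallest group count / largest group count)."""
    per_cluster = {}
    for lab, g in zip(labels, groups):
        per_cluster.setdefault(lab, Counter())[g] += 1
    n_groups = len(set(groups))
    ratios = []
    for counts in per_cluster.values():
        if len(counts) < n_groups:
            ratios.append(0.0)  # a cluster missing a group entirely has balance 0
        else:
            vals = counts.values()
            ratios.append(min(vals) / max(vals))
    return min(ratios)

# Toy clustering of 8 points into 2 clusters, with protected groups 'a'/'b'.
labels = [0, 0, 0, 0, 1, 1, 1, 1]
groups = ['a', 'a', 'b', 'b', 'a', 'b', 'b', 'b']
# Cluster 0 is perfectly balanced (2 vs 2); cluster 1 has 1 'a' vs 3 'b',
# so the overall balance is 1/3.
```

A frequentist fair-clustering method would constrain or penalize this quantity during optimization; the probabilistic formulation above instead attaches uncertainty to cluster memberships, so balance becomes a distributional property rather than a single number.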
-
Abstract: Climate change threatens the resource adequacy of future power systems. Existing research and practice lack frameworks for identifying decarbonization pathways that are robust to climate-related uncertainty. We create such an analytical framework, then use it to assess the robustness of alternative pathways to achieving 60% emissions reductions from 2022 levels by 2040 for the Western U.S. power system. Our framework integrates power system planning and resource adequacy models with 100 climate realizations from a large climate ensemble. Climate realizations drive electricity demand; thermal plant availability; and wind, solar, and hydropower generation. Among five initial decarbonization pathways, all exhibit modest to significant resource adequacy failures under climate realizations in 2040, but certain pathways experience significantly fewer resource adequacy failures at little additional cost relative to the others. By identifying and planning for an extreme climate realization that drives the largest resource adequacy failures across our pathways, we produce a new decarbonization pathway that has no resource adequacy failures under any climate realization. This new pathway is roughly 5% more expensive than the others because of greater capacity investment, and it shifts investment from wind to solar and natural gas generators. Our analysis suggests that modest increases in investment cost can add significant robustness against climate change in decarbonizing power systems. Our framework can help power system planners adapt to climate change by stress-testing future plans against potential climate realizations, and it offers a unique bridge between energy system and climate modeling.