Abstract

Purpose: As a challenging but important optimization problem, inverse planning for volumetric modulated arc therapy (VMAT) has attracted much research attention. The column generation (CG) method is so far one of the most effective solution schemes. However, it often relies on simplifications that leave significant gaps between its output and an actually deliverable plan. This paper presents a novel column generation (NCG) approach that pushes the planning results substantially closer to practice.

Methods: The proposed NCG algorithm is equipped with multiple new quality-enhancing and computation-facilitating modules: (1) flexible constraints are enabled on both dose rates and treatment time to adapt to machine capabilities and planners' preferences, respectively; (2) a cross-control-point intermediate-aperture simulation is incorporated to better conform to the underlying physics; (3) new pricing and pruning subroutines are adopted to achieve better optimization outputs. To evaluate the effectiveness of the NCG, five VMAT plans (three prostate cases and two head-and-neck cases) were computed using the proposed NCG, and the planning results were compared with those yielded by a historical benchmark planning scheme.

Results: The NCG generated plans of significantly better quality than the benchmark planning algorithm. For the prostate cases, NCG plans satisfied all planning target volume (PTV) criteria, whereas CG plans missed the D10% criteria of the PTVs by 9 Gy or more in all cases. For the head-and-neck cases, NCG plans again satisfied all PTV criteria, while CG plans missed the D10% criteria of the PTVs by 3 Gy or more in all cases, as well as the maximum-dose criteria of both the cord and the brain stem by over 13 Gy in one case. Moreover, the pruning scheme was found to be effective in enhancing optimization quality.

Conclusions: The proposed NCG inherits the computational advantages of traditional CG while capturing a more realistic characterization of machine capability and the underlying physics. Its output solutions are substantially closer to practical implementation.
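The pricing-and-pruning loop at the heart of a column generation scheme can be sketched in toy form. Everything below is an illustrative assumption, not the paper's actual NCG formulation: a quadratic dose objective, a made-up candidate-aperture dose matrix `D`, and a crude clipped least-squares solve standing in for a proper restricted master problem.

```python
import numpy as np

def column_generation(D, d_target, n_iters=5):
    """Toy restricted-master / pricing loop in the spirit of CG planning.

    D: (n_voxels, n_candidate_apertures) dose-deposition matrix, one column
       per candidate aperture (hypothetical data).
    d_target: prescribed dose per voxel.
    Returns the indices of selected apertures and their weights.
    """
    selected = []
    weights = np.zeros(0)
    for _ in range(n_iters):
        # Dose delivered by the current restricted master solution.
        d_now = D[:, selected] @ weights if selected else np.zeros_like(d_target)
        grad = d_now - d_target              # gradient of 0.5*||d - d_target||^2
        # Pricing: most negative directional derivative over unselected columns.
        prices = D.T @ grad
        prices[selected] = np.inf
        j = int(np.argmin(prices))
        if prices[j] >= 0:                   # pruning: no improving column left
            break
        selected.append(j)
        # Re-solve the restricted master by least squares, clipping to
        # nonnegative weights (a crude stand-in for a proper NNLS solve).
        sol, *_ = np.linalg.lstsq(D[:, selected], d_target, rcond=None)
        weights = np.clip(sol, 0.0, None)
    return selected, weights
```

The pruning test (`prices[j] >= 0`) is what terminates the loop once no candidate column can further decrease the objective, mirroring the role of the pruning subroutine described in the abstract.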
How to Assess Uncertainty-Aware Frameworks for Power System Planning?
Computational advances, along with the profound impact of uncertainty on power system investments, have motivated the creation of power system planning frameworks that handle long-run uncertainty, large numbers of alternative plans, and multiple objectives. Planning agencies seek guidance in assessing such frameworks. This article addresses that need in two ways. First, we augment previously proposed criteria for assessing planning frameworks with new criteria, such as stakeholder acceptance, to make the assessments more comprehensive, while enhancing the practical applicability of the criteria by offering criterion-specific themes and questions. Second, using the proposed criteria, we compare two widely used but fundamentally distinct frameworks: an 'agree-on-plans' framework, Robust Decision Making (RDM), and an 'agree-on-assumptions' framework centered around Stochastic Programming (SP). By comparing the two distinct frameworks head-to-head for the first time, on an electricity supply planning problem under uncertainty in Bangladesh, we conclude that RDM relies on a large number of simulations to provide ample information to decision makers and stakeholders and to facilitate updating of subjective inputs. In contrast, SP is a high-dimensional optimization problem that identifies plans with relatively good probability-weighted performance in a single step, but even with computational advances it remains subject to the curse of dimensionality.
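The core distinction between the two decision criteria can be illustrated in a few lines. The plan names, scenario probabilities, and costs below are invented for illustration only; SP-style selection weights costs by scenario probabilities, while an RDM-flavored robust rule looks at worst-case performance without probabilities.

```python
# 'Agree-on-assumptions' (SP-style) selection: pick the candidate plan with
# the best probability-weighted (expected) cost over a scenario set.
# All numbers below are hypothetical.

def expected_cost(costs, probs):
    """Probability-weighted cost of one plan over the scenario set."""
    assert abs(sum(probs) - 1.0) < 1e-9
    return sum(c * p for c, p in zip(costs, probs))

def pick_plan(plans, probs):
    """plans: {name: [cost under scenario 1, scenario 2, ...]}."""
    return min(plans, key=lambda name: expected_cost(plans[name], probs))

probs = [0.5, 0.3, 0.2]                     # hypothetical scenario probabilities
plans = {
    "coal-heavy":  [90, 140, 200],          # cheap now, bad under high fuel/CO2
    "gas+solar":   [110, 120, 160],         # good on average, one bad scenario
    "solar-heavy": [150, 130, 110],         # hedged against fuel-price spikes
}
sp_choice = pick_plan(plans, probs)

# An 'agree-on-plans' flavored rule ignores probabilities and guards the
# worst case instead (a simplified stand-in for RDM's robustness analysis).
robust_choice = min(plans, key=lambda name: max(plans[name]))
```

With these made-up numbers the two criteria disagree: the expected-cost rule favors the plan that performs well on average, while the worst-case rule favors the plan with the smallest maximum cost, which is exactly the kind of divergence the head-to-head comparison probes.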
- Award ID(s): 2330450
- PAR ID: 10558323
- Publisher / Repository: IEEE
- Date Published:
- Journal Name: IEEE Transactions on Energy Markets, Policy and Regulation
- ISSN: 2771-9626
- Page Range / eLocation ID: 1 to 13
- Subject(s) / Keyword(s): Robust Decision Making, Stochastic Programming, Power System Planning, Deep Uncertainty, Risk Analysis
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Over the past two decades, there has been explosive growth in the application of robust optimization in operations management (robust OM), fueled by both significant advances in optimization theory and a volatile business environment that has led to rising concerns about model uncertainty. We review some common modeling frameworks in robust OM, including the representation of uncertainty and the decision-making criteria, and sources of model uncertainty that have arisen in the literature, such as demand, supply, and preference. We discuss the successes of robust OM in addressing model uncertainty, enriching decision criteria, generating structural results, and facilitating computation. We also discuss several future research opportunities and challenges.
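The max-min decision criterion common in robust OM can be made concrete with a toy ordering problem. The selling price, unit cost, and interval uncertainty set below are assumptions for illustration; the point is only the structure of the criterion: maximize the worst-case payoff over the uncertainty set.

```python
# Toy robust decision under demand uncertainty, in the max-min spirit of
# robust OM: choose the order quantity whose worst-case profit over an
# uncertainty set of demand realizations is best. Numbers are illustrative.

PRICE, COST = 5.0, 3.0

def profit(q, demand):
    """Sell min(q, demand) units at PRICE after paying COST per unit ordered."""
    return PRICE * min(q, demand) - COST * q

def robust_order(quantities, demand_set):
    """Max-min choice: best worst-case profit over the uncertainty set."""
    return max(quantities, key=lambda q: min(profit(q, d) for d in demand_set))

demands = range(80, 121)            # interval uncertainty set [80, 120]
q_star = robust_order(range(0, 201), demands)
```

Here the robust solution orders exactly the lowest demand in the uncertainty set: ordering less forgoes guaranteed margin, while ordering more is punished by the worst-case (low-demand) scenario.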
- Power grids based on the traditional N-1 design criterion are no longer adequate because such designs do not withstand extreme weather events or cascading failures. Microgrid systems can enhance grid resilience through defensive or islanded operation in contingencies. This paper presents a probabilistic framework for planning resilient distribution systems via distributed wind and solar integration. We first define three aspects of a resilient distribution system, namely prevention, survivability, and recovery. We then review distributed generation planning models that incorporate moment estimation, chance constraints, and bi-directional power flow. We strive to achieve two objectives: (1) enhancing grid survivability when distribution lines are damaged or disconnected in the aftermath of a disaster; and (2) accelerating the recovery of damaged assets through proactive maintenance and repair services. A simple 9-node network demonstrates the application of the proposed resilience planning framework.
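A chance constraint on islanded survivability can be checked by simple Monte Carlo sampling. The output distributions, critical load, and confidence level below are invented stand-ins for the moment-based models mentioned in the abstract, not the paper's actual data.

```python
import random

def survives(p_wind, p_solar, load):
    """Islanded operation survives if local generation covers the load."""
    return p_wind + p_solar >= load

def chance_constraint_ok(n_samples=20000, alpha=0.95, seed=0):
    """Monte Carlo check of P(generation >= critical load) >= alpha.

    Hypothetical output models (in MW): wind ~ triangular(0, 6, mode 3),
    solar ~ uniform(1, 4), critical load fixed at 2 MW.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        wind = rng.triangular(0.0, 6.0, 3.0)      # low, high, mode
        solar = rng.uniform(1.0, 4.0)
        if survives(wind, solar, load=2.0):
            hits += 1
    return hits / n_samples >= alpha

ok = chance_constraint_ok()
```

In a planning model, a check like this sits inside the optimization loop: candidate sizings of distributed wind and solar are accepted only if the sampled survivability probability clears the chosen confidence level.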
- This paper focuses on the motion planning problem for systems exhibiting both continuous and discrete behaviors, which we refer to as hybrid dynamical systems. First, the motion planning problem for hybrid systems is formulated using the hybrid equation framework, which is general enough to capture most hybrid systems. Second, a propagation algorithm template is proposed that describes a general framework for solving the motion planning problem for hybrid systems. Third, a rapidly-exploring random tree (RRT) implementation of the proposed algorithm template is designed. At each iteration, the proposed algorithm, called HyRRT, randomly picks a state sample and extends the search tree by flow or jump, which is also chosen randomly when both regimes are possible. Through a definition of concatenation of functions defined on hybrid time domains, we show that HyRRT is probabilistically complete, namely, that the probability of failing to find a motion plan approaches zero as the number of iterations increases. This property is guaranteed under mild conditions on the data defining the motion plan, including a relaxation of the usual positive-clearance assumption imposed in the literature on classical systems. The motion plan is computed through the solution of two optimization problems, one associated with the flow and the other with the jumps of the system. The proposed algorithm is applied to an actuated bouncing ball system and a walking robot system to highlight its generality and computational features.
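The flow-or-jump tree extension can be sketched for the bouncing-ball example. This is a heavily simplified toy, not the HyRRT algorithm itself: states are (height, velocity), the step size, restitution coefficient, sampling ranges, and goal test are all assumptions, and in this 1-D system the flow and jump regimes never overlap, so the regime is dictated by the state rather than chosen at random.

```python
import math, random

G, LAMBDA = 9.81, 0.8        # gravity, restitution coefficient (assumed)

def flow(state, dt=0.05):
    """Integrate free fall (the flow map) for one small step."""
    h, v = state
    return (h + v * dt, v - G * dt)

def jump(state):
    """Impact (the jump map): reverse and damp the velocity at the ground."""
    h, v = state
    return (0.0, -LAMBDA * v)

def in_jump_set(state):
    h, v = state
    return h <= 0.0 and v < 0.0

def hyrrt_sketch(start, goal_v, iters=4000, seed=1):
    """Grow a tree: sample a state, extend its nearest tree node by the
    applicable regime; succeed when a node is near the ground with speed
    at most goal_v (a toy goal set)."""
    rng = random.Random(seed)
    tree = [start]
    for _ in range(iters):
        sample = (rng.uniform(0.0, 2.0), rng.uniform(-6.0, 6.0))
        near = min(tree, key=lambda s: math.dist(s, sample))
        new = jump(near) if in_jump_set(near) else flow(near)
        tree.append(new)
        if new[0] <= 0.05 and abs(new[1]) <= goal_v:
            return tree, new
    return tree, None

tree, found = hyrrt_sketch(start=(1.0, 0.0), goal_v=4.0)
```

Dropped from 1 m, the ball reaches the ground at roughly 4.9 m/s, so the first post-impact state (speed about 3.9 m/s) lands in the toy goal set once the sampler has extended the falling branch far enough.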
- The rise of machine-learning-driven decision-making has sparked a growing emphasis on algorithmic fairness. Within the realm of clustering, the notion of balance is utilized as a criterion for attaining fairness: a clustering mechanism is fair when the resulting clusters maintain a consistent proportion of observations representing individuals from the distinct groups delineated by protected attributes. Building on this idea, the literature has rapidly incorporated a myriad of extensions, devising fair versions of existing frequentist clustering algorithms (e.g., k-means, k-medoids) that aim at minimizing specific loss functions. These approaches lack uncertainty quantification for the optimal clustering configuration and provide only clustering boundaries, without quantifying the probability of each observation belonging to the different clusters. In this article, we offer a novel probabilistic formulation of the fair clustering problem that facilitates valid uncertainty quantification even under mild model misspecification, without incurring substantial computational overhead. Mixture-model-based fair clustering frameworks facilitate automatic uncertainty quantification but tend to be brittle under model misspecification and involve significant computational challenges. To circumvent these issues, we propose a generalized Bayesian fair clustering framework that inherently enjoys a decision-theoretic interpretation. Moreover, we devise efficient computational algorithms that crucially leverage techniques from the existing literature on optimal transport and loss-function-based clustering. The gains from the proposed methodology are showcased via numerical experiments and real-data examples.
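The balance criterion mentioned above has a standard computable form in the two-group case, which the sketch below implements on made-up labels: each cluster's balance is the smaller of the two group-count ratios, and the clustering's balance is the minimum over clusters (1 is perfectly balanced, 0 means some cluster contains only one group).

```python
# Two-group 'balance' fairness measure for a clustering. Data are invented.

def cluster_balance(labels, groups):
    """labels[i]: cluster of point i; groups[i]: protected group (0 or 1).

    Returns min over clusters of min(n0/n1, n1/n0), where n0 and n1 are the
    per-cluster group counts; 0.0 if any cluster contains only one group.
    """
    counts = {}
    for lab, g in zip(labels, groups):
        c = counts.setdefault(lab, [0, 0])
        c[g] += 1
    balances = []
    for n0, n1 in counts.values():
        if n0 == 0 or n1 == 0:
            return 0.0
        balances.append(min(n0 / n1, n1 / n0))
    return min(balances)

# A perfectly balanced toy clustering: every cluster holds 2 of each group...
fair = cluster_balance([0, 0, 0, 0, 1, 1, 1, 1],
                       [0, 1, 0, 1, 0, 1, 0, 1])
# ...and an unfair one that isolates group 1 entirely in cluster 1.
unfair = cluster_balance([0, 0, 0, 0, 1, 1, 1, 1],
                         [0, 0, 0, 0, 1, 1, 1, 1])
```

Frequentist fair-clustering methods constrain or penalize this quantity directly; the probabilistic framing in the abstract instead attaches uncertainty to cluster memberships, from which a distribution over balance values can be derived.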