We present a novel approach for solving both regular and distributionally robust chance-constrained programs with an empirical reference distribution more efficiently. In general, these programs can be reformulated as mixed-integer programs (MIPs) by introducing one binary variable per scenario, indicating whether that scenario should be satisfied. Whereas existing methods have focused predominantly on either inner or outer approximations, this paper bridges the gap by studying a scheme that combines the two via variable fixing. By checking the restricted outer approximations and comparing them with the inner approximations, we derive optimality cuts that can notably reduce the number of binary variables by fixing them to either one or zero. We also conduct a theoretical analysis of the variable-fixing techniques and derive an asymptotic closed-form expression for the proportion of binary variables that should optimally be fixed to zero. Our empirical results showcase the advantages of our approach in terms of both computational efficiency and solution quality; notably, we solve all tested instances from the literature to optimality, demonstrating the robustness and effectiveness of the proposed approach. History: Accepted by Andrea Lodi, Design & Analysis of Algorithms—Discrete. Funding: This work was supported by the Office of Naval Research [N00014-24-1-2066] and the Division of Civil, Mechanical and Manufacturing Innovation [2246414]. Supplemental Material: The software that supports the findings of this study is available within the paper and its Supplemental Information ( https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2023.0299 ) as well as from the IJOC GitHub software repository ( https://github.com/INFORMSJoC/2023.0299 ). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/ .
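To make the scenario-based reformulation concrete, the sketch below writes a generic chance constraint over $N$ equally likely scenarios as a big-M mixed-integer program with one binary variable per scenario. The notation ($x$, $c$, $T_i$, $h_i$, the constant $M$, and the risk level $\varepsilon$) is an illustrative placeholder, not the paper's own model.

```latex
% Minimal sketch: big-M MIP reformulation of a chance constraint under an
% empirical distribution with N equally likely scenarios (generic notation).
\begin{align*}
\min_{x,\,z}\quad & c^\top x \\
\text{s.t.}\quad  & T_i x \ge h_i - M z_i, \qquad i = 1,\dots,N,
                    && \text{($z_i = 1$ relaxes scenario $i$)} \\
                  & \sum_{i=1}^{N} z_i \le \lfloor \varepsilon N \rfloor,
                    && \text{(at most an $\varepsilon$-fraction of scenarios violated)} \\
                  & x \in X, \quad z \in \{0,1\}^{N}.
\end{align*}
```

Fixing $z_i = 0$ forces scenario $i$ to be satisfied, while fixing $z_i = 1$ discards it; the variable-fixing scheme described above determines such assignments by comparing restricted outer approximations with inner approximations.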
Path-Based Formulations for the Design of On-demand Multimodal Transit Systems with Adoption Awareness
This paper reconsiders the On-Demand Multimodal Transit Systems (ODMTS) Design with Adoptions problem (ODMTS-DA) to capture the latent demand in on-demand multimodal transit systems. The ODMTS-DA is a bilevel optimization problem, for which Basciftci and Van Hentenryck proposed an exact combinatorial Benders decomposition. Unfortunately, their proposed algorithm only finds high-quality solutions for medium-sized cities and is not practical for large metropolitan areas. The main contribution of this paper is to propose a new path-based optimization model, called P-Path, to address these computational difficulties. The key idea underlying P-Path is to enumerate two specific sets of paths which capture the essence of the choice model associated with the adoption behavior of riders. With the help of these path sets, the ODMTS-DA can be formulated as a single-level mixed-integer programming model. In addition, the paper presents preprocessing techniques that can reduce the size of the model significantly. P-Path is evaluated on two comprehensive case studies: the midsize transit system of the Ann Arbor–Ypsilanti region in Michigan (which was studied by Basciftci and Van Hentenryck) and the large-scale transit system for the city of Atlanta. The experimental results show that P-Path solves the Michigan ODMTS-DA instances in a few minutes, an improvement of more than two orders of magnitude over the existing approach. For Atlanta, the results show that P-Path can solve large-scale ODMTS-DA instances (about 17 million variables and 37 million constraints) optimally in a few hours or a few days. These results show the tremendous computational benefits of P-Path, which provides a scalable approach to the design of on-demand multimodal transit systems with latent demand. History: Accepted by Andrea Lodi, Design & Analysis of Algorithms—Discrete. Funding: This work was partially supported by National Science Foundation Leap-HI [Grant 1854684] and the Tier 1 University Transportation Center (UTC): Transit-Serving Communities Optimally, Responsively, and Efficiently (T-SCORE) from the U.S. Department of Transportation [69A3552047141]. Supplemental Material: The software that supports the findings of this study is available within the paper and its Supplemental Information ( https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2023.0014 ) as well as from the IJOC GitHub software repository ( https://github.com/INFORMSJoC/2023.0014 ). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/ .
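As a rough caricature of the path-based idea (not the P-Path model itself: the adoption-path set $P_r^{+}$, path-selection variables $\lambda_{rp}$, leg-design variables $y_a$, and adoption indicator $\delta_r$ are all assumed names), enumerating each rider's candidate paths lets the adoption logic be expressed with linear constraints in a single level:

```latex
% Illustrative single-level linking constraints under assumed notation:
% a path is usable only if every leg it requires is opened, and a rider
% adopts the system only if at least one of their candidate paths is usable.
\begin{align*}
& \lambda_{rp} \le y_a, && \forall\, a \in p,\ p \in P_r^{+}, \\
& \delta_r \le \sum_{p \in P_r^{+}} \lambda_{rp}, && \forall\, r, \\
& \delta_r,\ \lambda_{rp},\ y_a \in \{0,1\}.
\end{align*}
```

The actual P-Path model enumerates two specific path sets and accounts for path costs and the rider choice model; the sketch only conveys why explicit path enumeration makes a single-level formulation possible.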
- Award ID(s):
- 1854684
- PAR ID:
- 10513092
- Publisher / Repository:
- INFORMS
- Date Published:
- Journal Name:
- INFORMS Journal on Computing
- ISSN:
- 1091-9856
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
We consider a risk-averse stochastic capacity planning problem under uncertain demand in each period. Using a scenario tree representation of the uncertainty, we formulate a multistage stochastic integer program to adjust the capacity expansion plan dynamically as more information on the uncertainty is revealed. Specifically, in each stage, a decision maker optimizes capacity acquisition and resource allocation to minimize certain risk measures of maintenance and operational cost. We compare it with a two-stage approach that determines the capacity acquisition for all the periods up front. Using expected conditional risk measures, we derive a tight lower bound and an upper bound for the gaps between the optimal objective values of risk-averse multistage models and their two-stage counterparts. Based on these derived bounds, we present general guidelines on when to solve risk-averse two-stage or multistage models. Furthermore, we propose approximation algorithms to solve the two models more efficiently, which are asymptotically optimal under an expanding market assumption. We conduct numerical studies using randomly generated and real-world instances of diverse sizes to demonstrate the tightness of the analytical bounds and the efficacy of the approximation algorithms. We find that the gaps between risk-averse multistage and two-stage models increase as the variability of the uncertain parameters increases and decrease as the decision maker becomes more risk averse. Moreover, a stagewise-dependent scenario tree attains much higher gaps than a stagewise-independent counterpart, whereas the latter produces tighter analytical bounds. History: Accepted by Andrea Lodi, Area Editor for Design & Analysis of Algorithms–Discrete. Funding: The work of Dr. X. Yu was partially supported by the U.S. National Science Foundation Division of Information and Intelligent Systems [Grant 2331782]. Supplemental Material: The software that supports the findings of this study is available within the paper and its Supplemental Information ( https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2023.0396 ) as well as from the IJOC GitHub software repository ( https://github.com/INFORMSJoC/2023.0396 ). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/ .
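To illustrate how the risk-averse objective can look (an assumed instance, not the paper's exact formulation: $Z_t$ denotes the stage-$t$ cost, $\mathcal{F}_{t-1}$ the information available before stage $t$, and $\beta_t \in [0,1]$, $\alpha_t$ the risk weights and levels), an expected conditional risk measure combines conditional expectation with conditional value-at-risk stage by stage:

```latex
% Assumed illustration of an expected conditional risk measure (ECRM)
% objective built from conditional expectation and conditional CVaR.
\begin{align*}
\min\quad Z_1 \;+\; \sum_{t=2}^{T} \mathbb{E}\Big[\,
(1-\beta_t)\,\mathbb{E}\big[ Z_t \,\big|\, \mathcal{F}_{t-1} \big]
\;+\; \beta_t\,\mathrm{CVaR}_{\alpha_t}\big[ Z_t \,\big|\, \mathcal{F}_{t-1} \big]
\,\Big].
\end{align*}
```

In the multistage model the stage-$t$ capacity and allocation decisions may depend on the scenario path observed so far, whereas the two-stage counterpart commits to all capacity acquisitions up front; the bounds discussed above quantify the gap between the two optimal values.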
-
Chemotherapy drug administration is a complex problem that often requires expensive clinical trials to evaluate potential regimens; one way to alleviate this burden and better inform future trials is to build reliable models for drug administration. This paper presents a mixed-integer program for combination chemotherapy (utilization of multiple drugs) optimization that incorporates various important operational constraints and, besides dose and concentration limits, controls treatment toxicity based on its effect on the white blood cell count. To address the uncertainty of tumor heterogeneity, we also propose chance constraints that guarantee reaching an operable tumor size with high probability in a neoadjuvant setting. We present analytical results pertinent to the accuracy of the model in representing the biological processes of chemotherapy and establish its potential for clinical applications through a numerical study of breast cancer. History: Accepted by Paul Brooks, Area Editor for Applications in Biology, Medicine, & Healthcare. Funding: This work was supported by the National Science Foundation [Grants CMMI-1933369 and CMMI-1933373]. Supplemental Material: The software that supports the findings of this study is available within the paper and its Supplemental Information ( https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2022.0207 ) as well as from the IJOC GitHub software repository ( https://github.com/INFORMSJoC/2022.0207 ). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/ .
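As an illustration of the chance constraint and toxicity control described above (simplified, assumed notation: $d_t$ is the dose in period $t$, $N_T(d)$ the resulting end-of-treatment tumor size, $N^{\mathrm{op}}$ the operable-size threshold, $W_t$ the white blood cell count, and $\varepsilon$ the allowed risk), the key requirements take roughly the following form:

```latex
% Hedged sketch: probabilistic operability requirement plus toxicity and
% dosing limits (placeholder notation, not the paper's exact model).
\begin{align*}
& \mathbb{P}\big( N_T(d_1,\dots,d_T) \le N^{\mathrm{op}} \big) \;\ge\; 1 - \varepsilon, \\
& W_t \ge W^{\min}, \qquad d^{\min} \le d_t \le d^{\max}, \qquad t = 1,\dots,T.
\end{align*}
```

With a finite set of tumor-heterogeneity scenarios, the probabilistic requirement can be enforced, for example, through one binary variable per scenario, in the same spirit as standard chance-constraint reformulations.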
-
Nonconvex quadratically constrained programs (QCPs) are generally NP-hard and challenging problems. In this paper, we propose two novel mixed-integer linear programming (MILP) approximations for a nonconvex QCP. Our method begins by utilizing an eigenvalue-based decomposition to express the nonconvex quadratic function as the difference of two convex functions. We then introduce an additional variable to partition each nonconvex constraint into a second-order cone (SOC) constraint and the complement of an SOC constraint. We employ two polyhedral approximation approaches to approximate the SOC constraint. The complement of an SOC constraint is approximated using a combination of linear and complementarity constraints. As a result, we approximate the nonconvex QCP with two linear programs with complementarity constraints (LPCCs). More importantly, we prove that the optimal values of the LPCCs asymptotically converge to that of the original nonconvex QCP. By proving the boundedness of the LPCCs, we further reformulate the LPCCs as MILPs. We demonstrate the effectiveness of our approaches via numerical experiments by applying our proposed approximations to randomly generated instances and two application problems: the joint decision and estimation problem and the two-trust-region subproblem. The numerical results show significant advantages of our approaches in terms of solution quality and computational time compared with existing benchmark approaches. History: Accepted by Pascal Van Hentenryck, Area Editor for Computational Modeling: Methods and Analysis. Funding: K. Pan was supported in part by the Research Grants Council of Hong Kong [Grant 15503723]. J. Cheng and B. Yang were supported in part by the Office of Naval Research [Grant N00014-20-1-2154]. J. Cheng was supported in part by the National Science Foundation [Grant ECCS-2404412]. B. Yang was supported in part by the Air Force Office of Scientific Research [Grant FA9550-23-1-0508]. Supplemental Material: The software that supports the findings of this study is available within the paper and its Supplemental Information ( https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2024.0719 ) as well as from the IJOC GitHub software repository ( https://github.com/INFORMSJoC/2024.0719 ). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/ .
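The constraint split at the heart of the approach can be written schematically as follows (generic notation: $Q_+$ and $Q_-$ collect the positive and negative eigenvalue parts of $Q$, so both are positive semidefinite):

```latex
% Schematic difference-of-convex split of one nonconvex quadratic constraint:
% the first piece is convex (SOC-representable); the second is the complement
% of an SOC constraint, which the paper handles with linear plus
% complementarity constraints.
\begin{align*}
x^\top Q x + q^\top x \le b
\quad\Longleftrightarrow\quad
\exists\, t:\;
\begin{cases}
x^\top Q_+ x + q^\top x \;\le\; b + t, \\[2pt]
t \;\le\; x^\top Q_- x,
\end{cases}
\qquad Q = Q_+ - Q_-,\ \ Q_\pm \succeq 0.
\end{align*}
```

Polyhedrally approximating the SOC piece and replacing the reverse piece with linear and complementarity constraints then yields the LPCCs, which become MILPs once boundedness is established.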
-
Sparse principal component analysis (SPCA) is designed to enhance the interpretability of traditional principal component analysis by optimally selecting a subset of features that comprise the first principal component. Given the NP-hard nature of SPCA, most current approaches resort to approximate solutions, typically achieved through tractable semidefinite programs or heuristic methods. To solve SPCA to optimality, we propose two exact mixed-integer semidefinite programs (MISDPs) and an arbitrarily equivalent mixed-integer linear program. The MISDPs allow us to design an effective branch-and-cut algorithm with closed-form cuts that do not need to solve dual problems. For the proposed mixed-integer formulations, we further derive the theoretical optimality gaps of their continuous relaxations. In addition, we apply the greedy and local search algorithms to solving SPCA and derive their first-known approximation ratios. Our numerical experiments reveal that the exact methods we developed can efficiently find optimal solutions for data sets containing hundreds of features. Furthermore, our approximation algorithms demonstrate both scalability and near-optimal performance when benchmarked on larger data sets, specifically those with thousands of features. History: Accepted by Andrea Lodi, Area Editor for Design & Analysis of Algorithms—Discrete. Funding: This research was supported in part by the Division of Civil, Mechanical and Manufacturing Innovation [Grant 224614], the Division of Computing and Communication Foundations [Grant 2246417], and the Office of Naval Research [Grant N00014-24-1-2066]. Supplemental Material: The software that supports the findings of this study is available within the paper and its Supplemental Information ( https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2022.0372 ) as well as from the IJOC GitHub software repository ( https://github.com/INFORMSJoC/2022.0372 ). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/ .
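For reference, the sketch below shows one standard way to cast SPCA as an MISDP (an illustrative lifting under assumed notation, not necessarily either of the paper's two formulations): $A$ is the sample covariance matrix, $k$ the sparsity level, $X$ a lifted matrix standing in for $xx^\top$, and $z$ the support-selection variables.

```latex
% Illustrative MISDP lifting of SPCA (max x^T A x over unit vectors x with at
% most k nonzeros): setting z_i = 0 zeroes the i-th diagonal entry of the PSD
% matrix X and hence its entire i-th row and column.
\begin{align*}
\max_{X,\,z}\quad & \langle A,\, X \rangle \\
\text{s.t.}\quad  & \operatorname{tr}(X) = 1, \qquad X \succeq 0, \\
                  & X_{ii} \le z_i, \qquad i = 1,\dots,n, \\
                  & \sum_{i=1}^{n} z_i \le k, \qquad z \in \{0,1\}^{n}.
\end{align*}
```

For any fixed support, the inner semidefinite program returns the largest eigenvalue of the corresponding principal submatrix of $A$, which is exactly the SPCA objective restricted to that support.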