

Title: Chance-Constrained Multiple Bin Packing Problem with an Application to Operating Room Planning
We study the chance-constrained bin packing problem, with an application to hospital operating room planning. The bin packing problem allocates items of random sizes that follow a discrete distribution to a set of bins with limited capacity while minimizing the total cost; the bin capacity constraints must be satisfied with a given probability. We investigate a big-M and a 0-1 bilinear formulation of this problem. We analyze the bilinear structure of the formulation and use lifting techniques to identify cover, clique, and projection inequalities that strengthen it. We show that in certain cases these inequalities are facet-defining for a bilinear knapsack constraint that arises in the reformulation. An extensive computational study is conducted for the operating room planning problem that minimizes the number of open operating rooms, using problem instances generated from real hospital data. A lower-bound improvement heuristic is combined with the proposed cuts in a branch-and-cut framework. The computations illustrate that these techniques can significantly improve the performance of the branch-and-cut method: problems with up to 1,000 scenarios are solved to optimality in less than an hour. A safe approximation based on conditional value-at-risk (CVaR) is also solved; the computations show that the CVaR approximation typically leaves a gap of one operating room (e.g., six instead of five) to satisfy the chance constraint.

Summary of Contribution: This paper investigates a branch-and-cut algorithm for a chance-constrained bin packing problem with multiple bins. Chance-constrained bin packing provides a modeling framework for applied operations research problems in areas such as health care and scheduling. The paper studies alternative computational approaches to this problem and tests the algorithmic ideas on real data from a hospital operating room planning setting; the work therefore lies at the intersection of computing and operations research. Several ideas are developed and studied, including a strengthened big-M reformulation, an analysis of a bilinear reformulation, and the identification of certain facet-defining inequalities for that formulation. The paper also gives a lower-bound generation heuristic for a model that minimizes the number of bins. Computational experiments on an operating room planning model built from hospital data demonstrate the improvement delivered by, and the importance of, the proposed approaches.
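For concreteness, here is a minimal sketch of the kind of model the abstract describes, in generic notation; the symbols below are illustrative and not taken from the paper. Items i with scenario-dependent sizes s_i^k are packed into bins j of capacity T_j via binaries x_{ij}, opening bin j (y_j = 1) costs c_j, and the capacity constraints must hold jointly with probability at least 1 - ε. Under a finite scenario distribution with probabilities p_k, the joint chance constraint admits the standard big-M reformulation with one indicator z_k per scenario:

\begin{aligned}
\min\ & \sum_{j} c_j y_j \\
\text{s.t. }\ & \sum_{j} x_{ij} = 1 \quad \forall i, \qquad x_{ij} \le y_j \quad \forall i,j, \\
& \sum_{i} s_i^k x_{ij} \le T_j + M_j^k z_k \quad \forall j,k, \qquad \sum_{k} p_k z_k \le \varepsilon, \\
& x_{ij},\, y_j,\, z_k \in \{0,1\},
\end{aligned}

where z_k = 1 allows the capacity constraints to be violated in scenario k. Replacing the big-M rows with the products (1 - z_k) \sum_i s_i^k x_{ij} \le T_j gives a 0-1 bilinear alternative; tightening the constants M_j^k and strengthening the bilinear form are the two reformulation routes the abstract refers to.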
Award ID(s):
1763035
NSF-PAR ID:
10311735
Author(s) / Creator(s):
Date Published:
Journal Name:
INFORMS Journal on Computing
ISSN:
1091-9856
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We study the assignment problem with chance constraints (CAP) and its distributionally robust counterpart, DR-CAP. We present a technique for estimating the big-M constants in such a formulation that takes advantage of the ambiguity set. We consider a 0-1 bilinear knapsack set to develop valid inequalities for CAP and DR-CAP, and generalize this to the joint chance constraint problem. A probability-cut framework is also developed to solve DR-CAP. A computational study on problem instances built from real hospital surgery data shows that the developed techniques allow us to solve certain model instances and reduce the computational time for others. Using a Wasserstein ambiguity set in the DR-CAP model improves the out-of-sample performance of satisfying the chance constraints more significantly than is possible by increasing the sample size in the sample average approximation technique. The solution time for DR-CAP model instances is of the same order as that for solving the CAP instances. This finding is important because chance-constrained optimization models are very difficult to solve when the coefficients in the constraints are random.
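As a hedged sketch of the distributionally robust construct mentioned above (generic notation, not the paper's): given the empirical distribution \hat{\mathbb{P}}_N of N samples and a Wasserstein radius \theta, the DR chance constraint asks for feasibility under the worst distribution in the ball,

\inf_{\mathbb{Q}:\, W(\mathbb{Q}, \hat{\mathbb{P}}_N) \le \theta} \ \mathbb{Q}\big( \xi : g(x, \xi) \le 0 \big) \ \ge\ 1 - \varepsilon,

where W denotes the Wasserstein distance and g(x, ξ) collects the capacity constraints of the assignment. Bounds implied by the ambiguity set can then be used to shrink the big-M constants in the mixed-integer reformulation, which is the estimation idea the abstract highlights.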
  2. In a chance-constrained program (CCP), decision makers seek the best decision whose probability of violating the uncertain constraints is within a prespecified risk level. Because a CCP is often nonconvex and difficult to solve to optimality, much effort has been devoted to developing convex inner approximations, among which the conditional value-at-risk (CVaR) approximation has been regarded as the best for more than a decade. This paper studies and generalizes ALSO-X, originally proposed by Ahmed, Luedtke, SOng, and Xie in 2017, for solving a CCP. We first show that ALSO-X resembles a bilevel optimization problem, where the upper-level problem finds the best objective function value and enforces the feasibility of the CCP for a given decision from the lower-level problem, and the lower-level problem minimizes the expectation of the constraint violations subject to the upper bound on the objective value provided by the upper level. This interpretation motivates us to prove that when the uncertain constraints are convex in the decision variables, ALSO-X always outperforms the CVaR approximation. We further show (i) sufficient conditions under which ALSO-X can recover an optimal solution to a CCP; (ii) an equivalent bilinear programming formulation of a CCP, inspiring us to enhance ALSO-X with a convergent alternating minimization method (ALSO-X+); and (iii) an extension of ALSO-X and ALSO-X+ to distributionally robust chance-constrained programs (DRCCPs) under the ∞-Wasserstein ambiguity set. Our numerical study demonstrates the effectiveness of the proposed methods.
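A minimal sketch of the bilevel reading described above, in generic notation (illustrative, not verbatim from the paper): for a CCP \min\{ c^\top x : x \in X,\ \mathbb{P}(g(x, \tilde\xi) \le 0) \ge 1 - \varepsilon \}, ALSO-X searches over an objective budget t; the lower level minimizes the expected constraint violation within that budget, and the upper level returns the smallest t whose lower-level solution is chance-feasible:

\begin{aligned}
\min_{t}\ & t \\
\text{s.t. }\ & \mathbb{P}\big( g(x(t), \tilde\xi) \le 0 \big) \ge 1 - \varepsilon, \\
& x(t) \in \operatorname*{arg\,min}_{x \in X} \ \mathbb{E}\big[ g(x, \tilde\xi)^{+} \big] \ \text{ s.t. } \ c^{\top} x \le t.
\end{aligned}

In practice the budget t can be located by bisection, solving one lower-level problem per trial value.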
  3. The concept of City 2.0, or the smart city, offers new opportunities for waste management practices. Existing studies have addressed waste management problems in smart cities mainly by designing new sensor-based Internet of Things (IoT) technologies and by optimizing the routes of waste collection trucks to minimize operational costs, energy consumption, and transportation pollution emissions. This study highlights the importance of value recovery from trash bins. A stochastic optimization model based on chance-constrained programming is developed to plan waste collection operations; its objective is to minimize the total transportation cost while maximizing the recovery of value still embedded in waste bins. The value of collected waste is modeled as an uncertain parameter to reflect the uncertain condition and quality of the waste in each bin. The application of the proposed model is illustrated with a numerical example. The study opens new avenues for incorporating value recovery into waste collection planning and for developing data acquisition technologies that let municipalities monitor the mix of recyclables in individual trash bins.
  4. Two nodes of a wireless network may not be able to communicate with each other directly, perhaps because of obstacles or insufficient signal strength. This necessitates the use of intermediate nodes to relay information. Often, one designates a (preferably small) subset of them to relay these messages (i.e., to serve as a virtual backbone for the wireless network), which can be seen as a connected dominating set (CDS) of the associated graph. Ideally, these communication paths should be short, leading to the notion of a latency-constrained CDS. In this paper, we point out several shortcomings of a previously studied formalization of a latency-constrained CDS and propose an alternative one. We introduce an integer programming formulation for the problem that has a variable for each node and imposes the latency constraints via an exponential number of cut-like inequalities. Two nice properties of this formulation are that (1) it applies when distances are hop-based and when they are weighted and (2) it easily generalizes to ensure fault tolerance. We provide a branch-and-cut implementation of this formulation and compare it with a new polynomial-size formulation. Computational experiments demonstrate the superiority of the cut-like formulation. We also study related questions from computational complexity, such as approximation hardness, and answer an open problem regarding the fault diameter of graphs.
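To make the "exponential number of cut-like inequalities" concrete, here is a plausible shape for such a formulation; the notation is ours and need not match the authors' exact model. With y_v = 1 selecting node v for the backbone and L the latency bound, one can require that every set of nodes whose removal destroys all short paths between a pair must contain a selected node:

\begin{aligned}
\min\ & \sum_{v \in V} y_v \\
\text{s.t. }\ & \sum_{w \in S} y_w \ge 1 \quad \forall\, u, v \in V,\ \forall\, S \subseteq V \setminus \{u, v\} \text{ intersecting every } u\text{-}v \text{ path of latency at most } L, \\
& y \in \{0,1\}^{V},
\end{aligned}

with the inequalities generated on the fly inside branch-and-cut rather than enumerated. The same template accommodates hop-based or weighted latencies, consistent with property (1) in the abstract.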
  5. We consider global optimization of nonconvex problems whose factorable reformulations contain a collection of multilinear equations; important special cases include multilinear and polynomial optimization problems. The multilinear polytope is the convex hull of the set of binary points z satisfying such a system of multilinear equations. Recently, Del Pia and Khajavirad introduced running intersection inequalities, a family of facet-defining inequalities for the multilinear polytope. In this paper we address the separation problem for this class of inequalities. We first prove that separating flower inequalities, a subclass of running intersection inequalities, is NP-hard. Subsequently, for multilinear polytopes of fixed degree, we devise a polynomial-time algorithm for separating running intersection inequalities and embed the proposed cutting-plane generation scheme at every node of the branch-and-reduce global solver BARON. To evaluate the effectiveness of the proposed method we consider two test sets: randomly generated multilinear and polynomial optimization problems of degree three and four, and computer vision instances from an image restoration problem. Results show that running intersection cuts significantly improve the performance of BARON, leading to an average CPU time reduction of 50% for the random test set and 63% for the image restoration test set.
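For reference, a sketch of the standard objects involved, in the usual notation (see the cited works for the authors' exact definitions): given a hypergraph G = (V, E), the multilinear set and the multilinear polytope are

S(G) = \big\{ z \in \{0,1\}^{V \cup E} : z_e = \prod_{v \in e} z_v \ \ \forall e \in E \big\}, \qquad \mathrm{MP}(G) = \mathrm{conv}\big( S(G) \big),

so each edge variable z_e linearizes the product of its node variables; running intersection and flower inequalities are valid inequalities for MP(G) that exploit how the edges of G overlap.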