Title: Plausible Optima
We propose a framework and specific algorithms for screening a large (perhaps countably infinite) space of feasible solutions to generate a subset containing the optimal solution with high confidence. We attain this goal even when only a small fraction of the feasible solutions are simulated. To accomplish it we exploit structural information about the space of functions within which the true objective function lies, and then assess how compatible optimality is for each feasible solution with respect to the observed simulation outputs and the assumed function space. The result is a set of plausible optima. This approach can be viewed as a way to avoid slow simulation by leveraging fast optimization. Explicit formulations of the general approach are provided when the space of functions is either Lipschitz or convex. We establish both small- and large-sample properties of the approach, and provide two numerical examples.
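To make the Lipschitz case of the abstract concrete, here is a minimal sketch, assuming a minimization problem, a known Lipschitz constant, and noiseless simulation estimates; the paper's statistical confidence machinery is only crudely represented by a slack term, and all function names, data, and parameter values are illustrative rather than the authors' implementation.

```python
import numpy as np

def plausible_optima_lipschitz(X, X_sim, y_sim, L, slack=0.0):
    """Screen feasible solutions X (n x d) using simulation estimates y_sim
    at the simulated subset X_sim (m x d), assuming the objective is
    Lipschitz with constant L (minimization). `slack` crudely stands in for
    a statistical error allowance. Returns a boolean mask of solutions that
    remain plausibly optimal."""
    # Distance from every feasible solution to every simulated solution.
    dists = np.linalg.norm(X[:, None, :] - X_sim[None, :, :], axis=2)  # (n, m)

    # Lipschitz continuity implies, for each simulated point j:
    #   y_sim[j] - L*d(x, x_j) <= f(x) <= y_sim[j] + L*d(x, x_j)
    lower = np.max(y_sim[None, :] - L * dists, axis=1)  # tightest lower bound per solution
    upper = np.min(y_sim[None, :] + L * dists, axis=1)  # tightest upper bound per solution

    # A solution stays in the plausible-optima set unless its lower bound
    # already exceeds the best achievable upper bound over all solutions.
    best_upper = upper.min()
    return lower <= best_upper + slack

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, size=(500, 2))            # feasible solutions
    sim_idx = rng.choice(500, size=40, replace=False)  # only 40 are simulated
    true_f = lambda x: np.sum(x**2, axis=1)          # stand-in objective
    y_sim = true_f(X[sim_idx])                       # "simulated" estimates (noiseless here)
    keep = plausible_optima_lipschitz(X, X[sim_idx], y_sim, L=8.0)
    print(f"{keep.sum()} of {len(X)} solutions remain plausible")
```

Even though only 40 of the 500 solutions are simulated, every unsimulated solution still receives Lipschitz-consistent bounds, so most of the space can be screened out without further simulation.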
Award ID(s):
1634982
PAR ID:
10122983
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the 2018 Winter Simulation Conference
Page Range / eLocation ID:
1981-1992
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We consider optimization problems with uncertain constraints that need to be satisfied probabilistically. When data are available, a common method to obtain feasible solutions for such problems is to impose sampled constraints following the so-called scenario optimization approach. However, when the data size is small, the sampled constraints may not statistically support a feasibility guarantee on the obtained solution. This article studies how to leverage parametric information and the power of Monte Carlo simulation to obtain feasible solutions for small-data situations. Our approach makes use of a distributionally robust optimization (DRO) formulation that translates the data size requirement into a Monte Carlo sample size requirement drawn from what we call a generating distribution. We show that, while the optimal choice of this generating distribution is the one eliciting the data or the baseline distribution in a nonparametric divergence-based DRO, it is not necessarily so in the parametric case. Correspondingly, we develop procedures to obtain generating distributions that improve upon these basic choices. We support our findings with several numerical examples. (A minimal sampled-constraint sketch appears after this list.)
  2. This paper develops a finite approximation approach to find a non-smooth solution of an integral equation of the second kind. Equations with a non-smooth kernel and a non-smooth solution have not been studied before. Such equations arise frequently when modeling stochastic systems. We construct a Banach space of (right-continuous) distribution functions and reformulate the problem into an operator equation. We provide general necessary and sufficient conditions that allow us to show convergence of the approximation approach developed in this paper. We then provide two specific choices of approximation sequences and show that the properties of these sequences are sufficient to generate approximate equation solutions that converge to the true solution, assuming solution uniqueness and some additional mild regularity conditions. Our analysis is performed under the supremum norm, allowing wider applicability of our results. Worst-case error bounds are also available from solving a linear program. We demonstrate the viability and computational performance of our approach on three examples. The solution of the first example can be constructed manually, demonstrating the correctness and convergence of our approach. The second example involves the stationary distribution equations of a stochastic model and demonstrates the dramatic improvement our approach provides over the use of computer simulation. The third example solves a problem involving an everywhere nondifferentiable function for which no closed-form solution is available.
  3. Feng, B.; Pedrielli, G.; Peng, Y.; Shashaani, S.; Song, E.; Corlu, C.; Lee, L.; Chew, E.; Roeder, T.; Lendermann, P. (Ed.)
    Ranking & selection (R&S) procedures are simulation-optimization algorithms for making one-time decisions among a finite set of alternative system designs or feasible solutions with a statistical assurance of a good selection. R&S with covariates (R&S+C) extends the paradigm to allow the optimal selection to depend on contextual information that is obtained just prior to the need for a decision. The dominant approach for solving such problems is to employ offline simulation to create metamodels that predict the performance of each system or feasible solution as a function of the covariate. This paper introduces a fundamentally different approach that solves individual R&S problems offline for various values of the covariate, and then treats the real-time decision as a classification problem: given the covariate information, which system is a good solution? Our approach exploits the availability of efficient R&S procedures, requires milder assumptions than the metamodeling paradigm to provide strong guarantees, and can be more efficient. (A minimal classification sketch appears after this list.)
  4. When working with models that allow for many candidate solutions, simulation practitioners can benefit from screening out unacceptable solutions in a statistically controlled way. However, for large solution spaces, estimating the performance of all solutions through simulation can prove impractical. We propose a statistical framework for screening solutions even when only a relatively small subset of them is simulated. Our framework derives its superiority over exhaustive screening approaches by leveraging available properties of the function that describes the performance of solutions. The framework is designed to work with a wide variety of available functional information and provides guarantees on both the confidence and consistency of the resulting screening inference. We provide explicit formulations for the properties of convexity and Lipschitz continuity and show through numerical examples that our procedures can efficiently screen out many unacceptable solutions. 
  5. Singh, M. (Ed.)
    Discrete optimization problems arise in many biological contexts and, in many cases, we seek to make inferences from the optimal solutions. However, the number of optimal solutions is frequently very large and making inferences from any single solution may result in conclusions that are not supported by other optimal solutions. We describe a general approach for efficiently (polynomial time) and exactly (without sampling) computing statistics on the space of optimal solutions. These statistics provide insights into the space of optimal solutions that can be used to support the use of a single optimum (e.g., when the optimal solutions are similar) or justify the need for selecting multiple optima (e.g., when the solution space is large and diverse) from which to make inferences. We demonstrate this approach on two well-known problems and identify the properties of these problems that make them amenable to this method. 
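For item 1, the basic sampled-constraint (scenario) step can be sketched as below; the distributionally robust calibration of the generating distribution, which is that article's actual contribution, is not attempted here. The linear objective, the Gaussian generating distribution, and every parameter value are assumptions made only for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def scenario_solution(c, b, mean, cov, n_samples, rng):
    """Solve min c^T x subject to a^T x <= b for every Monte Carlo draw of a.
    The draws come from a hypothetical Gaussian 'generating distribution';
    choosing and calibrating that distribution is the article's subject and
    is not addressed in this sketch."""
    A = rng.multivariate_normal(mean, cov, size=n_samples)  # sampled constraint vectors
    res = linprog(c, A_ub=A, b_ub=np.full(n_samples, b),
                  bounds=[(0, 10)] * len(c), method="highs")
    return res.x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    c = np.array([-1.0, -2.0])  # maximize x1 + 2*x2, written as a minimization
    x = scenario_solution(c, b=5.0, mean=np.array([1.0, 1.0]),
                          cov=0.1 * np.eye(2), n_samples=200, rng=rng)
    print("scenario solution:", x)
```

More sampled constraints shrink the feasible region and strengthen the probabilistic feasibility guarantee; the cited article concerns how large that Monte Carlo sample must be and from which distribution it should be drawn.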
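For item 3, here is a minimal sketch of the offline-R&S-then-classify idea, assuming a scalar covariate, two systems with made-up response curves, a naive sample-mean selection standing in for a proper R&S procedure, and a 1-nearest-neighbor classifier; none of this reflects the authors' specific procedures.

```python
import numpy as np

def offline_best_system(covariate, rng, n_reps=200):
    """Stand-in for an offline R&S run at a fixed covariate value: select the
    system with the best estimated mean. The two quadratic response surfaces
    below are illustrative only."""
    means = np.array([(covariate - 1.0) ** 2, (covariate + 1.0) ** 2])
    noisy = means[:, None] + rng.normal(0, 0.5, size=(2, n_reps))
    return int(np.argmin(noisy.mean(axis=1)))

def classify(covariate, grid, labels):
    """Real-time step: 1-nearest-neighbor classification of the incoming covariate."""
    return labels[int(np.argmin(np.abs(grid - covariate)))]

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    grid = np.linspace(-3, 3, 25)  # offline covariate grid
    labels = np.array([offline_best_system(c, rng) for c in grid])
    print(classify(-2.2, grid, labels), classify(1.7, grid, labels))
```

The expensive simulation all happens offline on the covariate grid; at decision time only the cheap classification step runs, which is the efficiency argument made in that abstract.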