Inference-based optimization via simulation, which substitutes Gaussian process (GP) learning for the structural properties exploited in mathematical programming, is a powerful paradigm that has been shown to be remarkably effective in problems of modest feasible-region size and decision-variable dimension. The limitation to “modest” problems is a result of the computational overhead and numerical challenges encountered in computing the GP conditional (posterior) distribution on each iteration. In this paper, we substantially expand the size of discrete-decision-variable optimization-via-simulation problems that can be attacked in this way by exploiting a particular GP, the discrete Gaussian Markov random field, and carefully tailored computational methods. The result is the rapid Gaussian Markov Improvement Algorithm (rGMIA), an algorithm that delivers both a global convergence guarantee and finite-sample optimality-gap inference for significantly larger problems. Between infrequent evaluations of the global conditional distribution, rGMIA applies the full power of GP learning to rapidly search smaller sets of promising feasible solutions that need not be spatially close. We carefully document the computational savings via complexity analysis and an extensive empirical study. Summary of Contribution: The broad topic of the paper is optimization via simulation, which means optimizing some performance measure of a system that may only be estimated by executing a stochastic, discrete-event simulation. Stochastic simulation is a core topic and method of operations research. The focus of this paper is on significantly speeding up the computations underlying an existing method that is based on Gaussian process learning, where the underlying Gaussian process is a discrete Gaussian Markov random field. This speed-up is accomplished by employing smart computational linear algebra, state-of-the-art algorithms, and a careful divide-and-conquer evaluation strategy. As illustrations, we solve problems of significantly greater size than any other existing algorithm with similar guarantees can handle.
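A key reason a Gaussian Markov random field scales where a generic GP does not is that its precision (inverse covariance) matrix is sparse, so the conditional distribution reduces to sparse linear solves rather than dense Cholesky factorizations. The sketch below illustrates only that mechanism; it is not the authors' rGMIA implementation, and the chain-graph precision matrix and function names are hypothetical.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def gmrf_conditional_mean(Q, mu, obs_idx, y_obs):
    """Conditional mean of the unobserved nodes of a GMRF.

    For x ~ N(mu, Q^{-1}) partitioned into unobserved (A) and observed (B)
    nodes, x_A | x_B = y ~ N(mu_A - Q_AA^{-1} Q_AB (y - mu_B), Q_AA^{-1}),
    so only a sparse solve with the block Q_AA is needed. Sparsity of Q is
    what makes this cheap compared with a dense GP posterior update.
    """
    n = Q.shape[0]
    mask = np.zeros(n, dtype=bool)
    mask[obs_idx] = True
    A = np.flatnonzero(~mask)          # unobserved nodes
    B = np.flatnonzero(mask)           # observed (simulated) nodes
    Qc = Q.tocsr()
    Q_AA = Qc[A][:, A].tocsc()         # conditional precision of block A
    Q_AB = Qc[A][:, B]
    lu = spla.splu(Q_AA)               # sparse LU factorization
    return A, mu[A] - lu.solve(Q_AB @ (y_obs - mu[B]))

if __name__ == "__main__":
    n = 2000
    # Hypothetical chain-graph precision: sparse, banded, positive definite.
    Q = sp.diags([-1.0, 2.5, -1.0], offsets=[-1, 0, 1], shape=(n, n))
    mu = np.zeros(n)
    rng = np.random.default_rng(0)
    obs_idx = rng.choice(n, size=50, replace=False)
    y_obs = rng.normal(size=50)
    A, m = gmrf_conditional_mean(Q, mu, obs_idx, y_obs)
    print("conditional mean at first 5 unobserved nodes:", m[:5])
```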
Uniformly efficient simulation for extremes of Gaussian random fields
Abstract: In this paper we consider the problem of simultaneously estimating rare-event probabilities for a class of Gaussian random fields. A conventional rare-event simulation method is usually tailored to a specific rare event and consequently loses estimation efficiency for different events of interest, which often results in additional computational cost in such simultaneous estimation problems. To overcome this issue, we propose a uniformly efficient estimator for a general family of Hölder continuous Gaussian random fields. We establish the asymptotic and uniform efficiency of the proposed method and also conduct simulation studies to illustrate its effectiveness.
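To give a flavor of what uniform efficiency means on a discretized field, the sketch below implements the classical mixture importance sampler for P(max_i X_i > b) when X is a centered, unit-variance Gaussian vector: a single proposal mixture serves a whole range of thresholds b. This is a standard construction from the rare-event literature, not necessarily the paper's exact estimator.

```python
import numpy as np

def mixture_is_prob(Sigma, b, n_samples=10_000, seed=None):
    """Estimate P(max_i X_i > b) for X ~ N(0, Sigma) with unit variances.

    Mixture proposal: pick a location T uniformly, then exponentially tilt
    the field so X_T is centered at b (mean shift b * Sigma[:, T]). The
    likelihood ratio against the mixture is
        1 / ( (1/n) * sum_i exp(b * x_i - b^2 / 2) ).
    """
    rng = np.random.default_rng(seed)
    n = Sigma.shape[0]
    L = np.linalg.cholesky(Sigma)
    est = np.zeros(n_samples)
    for k in range(n_samples):
        T = rng.integers(n)
        x = b * Sigma[:, T] + L @ rng.standard_normal(n)
        if x.max() > b:
            lw = b * x - 0.5 * b * b
            c = lw.max()                      # log-sum-exp for stability
            est[k] = np.exp(-(c + np.log(np.mean(np.exp(lw - c)))))
    return est.mean(), est.std() / np.sqrt(n_samples)

if __name__ == "__main__":
    n = 50
    t = np.linspace(0.0, 1.0, n)
    Sigma = np.exp(-np.abs(t[:, None] - t[None, :]))   # unit-variance OU field
    for b in (3.0, 4.0, 5.0):                          # one sampler, many events
        p, se = mixture_is_prob(Sigma, b, seed=0)
        print(f"P(max > {b}) ~= {p:.3e} +/- {se:.1e}")
```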
- PAR ID: 10057907
- Journal Name: Journal of Applied Probability
- Volume: 55
- Issue: 01
- ISSN: 0021-9002
- Page Range / eLocation ID: 157 to 178
- Sponsoring Org: National Science Foundation
More Like this
We study rare-event simulation for a class of problems where the target hitting sets of interest are defined via modern machine learning tools such as neural networks and random forests. This problem is motivated by fast-emerging studies on the safety evaluation of intelligent systems, robustness quantification of learning models, and other potential applications to large-scale simulation in which machine learning tools can be used to approximate complex rare-event set boundaries. We investigate an importance sampling scheme that integrates the dominating-point machinery in large deviations and sequential mixed-integer programming to locate the underlying dominating points. Our approach works for a range of neural network architectures, including fully connected layers, rectified linear units, normalization, pooling and convolutional layers, and random forests built from standard decision trees. We provide efficiency guarantees and a numerical demonstration of our approach using a classification model in the UCI Machine Learning Repository.
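The sketch below shows only the importance-sampling structure built around a dominating point. Everything in it is a simplified stand-in: a small hypothetical two-layer ReLU network defines the rare set, and the dominating point is found with a generic local optimizer (scipy's SLSQP, which may struggle with the nonsmooth constraint) rather than the paper's sequential mixed-integer programming.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
d, h = 5, 8
# Hypothetical fixed ReLU network g defining the rare set {x : g(x) >= gamma}.
W1, b1 = rng.normal(size=(h, d)), rng.normal(size=h)
w2 = rng.normal(size=h)
gamma = 6.0

def g(x):
    return np.maximum(x @ W1.T + b1, 0.0) @ w2    # works for a vector or a batch

# Dominating point: minimum-norm point of the rare set (the large-deviations
# rate minimizer). SLSQP is a rough local substitute for the paper's exact
# mixed-integer-programming search, for illustration only.
res = minimize(lambda x: 0.5 * x @ x, x0=np.ones(d), method="SLSQP",
               constraints=[{"type": "ineq", "fun": lambda x: g(x) - gamma}])
a = res.x

# Importance sampling with proposal N(a, I); for X ~ N(0, I) the likelihood
# ratio is dP/dQ(x) = exp(-a.x + ||a||^2 / 2).
m = 200_000
xs = a + rng.standard_normal((m, d))
hits = g(xs) >= gamma
lr = np.exp(-(xs @ a) + 0.5 * (a @ a))
print(f"g(a) = {g(a):.2f}, rare-event probability estimate = {(hits * lr).mean():.3e}")
```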
An Efficient Multifidelity Model for Assessing Risk Probabilities in Power Systems under Rare Events
Risk assessment of power system failures induced by low-frequency, high-impact rare events is of paramount importance to power system planners and operators. In this paper, we develop a cost-effective multi-surrogate method based on a multifidelity model for assessing risks in probabilistic power-flow analysis under rare events. Specifically, multiple polynomial-chaos-expansion-based surrogate models are constructed to reproduce power system responses to stochastic changes in the load and the random occurrence of component outages. These surrogates then propagate a large number of samples at negligible computational cost and thus efficiently screen out the samples associated with high-risk rare events. The results generated by the surrogates, however, may be biased for the samples located in the low-probability tail regions that are critical to power system risk assessment. To resolve this issue, the original high-fidelity power system model is adopted to fine-tune the estimation results of the low-fidelity surrogates by reevaluating only a small portion of the samples. This multifidelity approach greatly improves the computational efficiency of the traditional Monte Carlo method used in computing risk-event probabilities under rare events without sacrificing accuracy.
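The two-stage pattern the abstract describes can be sketched with a one-dimensional stand-in: fit a cheap polynomial-chaos surrogate, screen a large Monte Carlo sample with it, and re-evaluate only the flagged tail samples with the expensive model. The response function and screening band here are hypothetical simplifications, not the paper's power-flow model.

```python
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(2)

def high_fidelity(x):
    """Stand-in for the expensive high-fidelity solve (hypothetical response)."""
    return np.exp(0.4 * x) + 0.1 * x**3

# Stage 1: fit a polynomial-chaos surrogate (probabilists' Hermite basis,
# matched to the Gaussian input) on a small training design.
x_train = rng.standard_normal(200)
coef = He.hermefit(x_train, high_fidelity(x_train), deg=5)
surrogate = lambda x: He.hermeval(x, coef)

# Stage 2: screen a large sample cheaply, then correct the tail with the
# high-fidelity model. The 0.8 factor is a conservative screening band; any
# tail sample the surrogate misses below that band is the bias the abstract
# warns about.
threshold = 5.0
x_mc = rng.standard_normal(1_000_000)
flagged = surrogate(x_mc) > 0.8 * threshold
y_exact = high_fidelity(x_mc[flagged])           # only a small portion re-run
p_hat = (y_exact > threshold).sum() / x_mc.size
print(f"{flagged.sum()} of {x_mc.size} samples re-evaluated; P ~= {p_hat:.2e}")
```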
This paper discusses algorithms for phase retrieval where the measurements follow independent Poisson distributions. We develop an optimization problem based on maximum likelihood estimation (MLE) for the Poisson model and apply the Wirtinger flow algorithm to solve it. Simulation results with a random Gaussian sensing matrix and Poisson measurement noise demonstrate that the Wirtinger flow algorithm based on the Poisson model produces higher-quality reconstructions than algorithms derived from Gaussian noise models (Wirtinger flow, Gerchberg–Saxton) applied to such data, with significantly improved computational efficiency.
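A minimal sketch of Wirtinger flow on the Poisson negative log-likelihood, assuming complex Gaussian sensing vectors and a spectral initialization; the step size and iteration count are ad hoc choices for illustration, not tuned values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 64, 8 * 64
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
z_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = rng.poisson(np.abs(A @ z_true) ** 2)       # Poisson intensity measurements

# Spectral initialization: leading eigenvector of (1/m) * A^H diag(y) A,
# scaled to the estimated signal norm sqrt(mean(y)).
Y = (A.conj().T * y) @ A / m
_, V = np.linalg.eigh(Y)
z = V[:, -1] * np.sqrt(y.mean())

# Wirtinger flow on the Poisson negative log-likelihood
#   f(z) = (1/m) * sum_i [ |a_i^H z|^2 - y_i * log |a_i^H z|^2 ],
# whose Wirtinger gradient is (1/m) * A^H ((1 - y / |Az|^2) * Az).
step = 0.1
for _ in range(2000):
    Az = A @ z
    w = 1.0 - y / np.maximum(np.abs(Az) ** 2, 1e-12)   # clamp avoids 0-division
    z = z - step * (A.conj().T @ (w * Az)) / m

# z is recoverable only up to a global phase; align before measuring error.
c = np.vdot(z, z_true)
z *= c / abs(c)
print("relative error:", np.linalg.norm(z - z_true) / np.linalg.norm(z_true))
```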
Inherent vulnerabilities in a cyber network’s constituent machine services can be exploited by malicious agents. As a result, the machines on any network are at risk. Security specialists seek to mitigate the risk of intrusion events through network reconfiguration and defense. When dealing with rare cyber events, high-quality risk estimates using standard simulation approaches may be unattainable, or may have significant attached uncertainty, even with a large computational simulation budget. To address this issue, an efficient rare-event simulation modeling and analysis technique, namely importance sampling for cyber networks, is developed. The importance sampling method parametrically amplifies certain aspects of the network in order to cause a rare event to happen more frequently. Output collected under these amplified conditions is then scaled back into the context of the original network to provide meaningful statistical inferences. The importance sampling methodology is tailored to cyber network attacks and takes the attacker’s successes and failures as well as the attacker’s targeting choices into account. The methodology is shown to produce estimates of higher quality than standard simulation with greater computational efficiency.
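The amplify-then-rescale idea can be sketched with a toy chain-topology attack model (entirely hypothetical): attempts succeed with an amplified probability q instead of the true p, and each simulated campaign is weighted by the Bernoulli likelihood ratio to recover an unbiased estimate under the original network.

```python
import numpy as np

rng = np.random.default_rng(4)
p, q = 0.01, 0.25        # true and amplified per-attempt success probabilities
k, budget = 4, 20        # machines to compromise in sequence, attempt budget

def simulate_attack(prob):
    """One attack campaign under success probability `prob` (toy model).

    The attacker repeatedly attempts the next machine in a chain of k;
    returns (#successes, #failures, whether the full chain was compromised).
    """
    s = f = depth = 0
    for _ in range(budget):
        if rng.random() < prob:
            s += 1
            depth += 1
            if depth == k:
                return s, f, True
        else:
            f += 1
    return s, f, False

m = 50_000
est = np.zeros(m)
for i in range(m):
    s, f, hit = simulate_attack(q)                  # run on the amplified network
    if hit:
        # Scale back to the original network: per-attempt Bernoulli likelihood
        # ratio over the realized successes and failures.
        est[i] = (p / q) ** s * ((1 - p) / (1 - q)) ** f
print(f"P(full compromise) ~= {est.mean():.3e} +/- {est.std() / np.sqrt(m):.1e}")
```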