

Title: Adaptive importance sampling for extreme quantile estimation with stochastic black box computer models
Abstract: The quantile is an important quantity in reliability analysis because it is related to the resistance level that defines failure events. This study develops a computationally efficient sampling method for estimating extreme quantiles with stochastic black box computer models. Importance sampling is widely employed as a powerful variance reduction technique to reduce estimation uncertainty and improve computational efficiency in reliability studies. When applied to quantile estimation, however, importance sampling faces a challenge: a good choice of the importance sampling density relies on information about the unknown quantile itself. We propose an adaptive method that refines the importance sampling density parameter toward the unknown target quantile value over the iterations. The adaptive scheme uses the simulation outcomes obtained in previous iterations to steer the simulation process toward important input regions. We prove convergence properties of the proposed method and show that our approach achieves variance reduction over crude Monte Carlo sampling. We demonstrate its estimation efficiency through numerical examples and a wind turbine case study.
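The record provides only the abstract, so the following is a minimal illustrative sketch of the general idea it describes, not the authors' algorithm: estimate an extreme quantile of a stochastic black-box model by importance sampling, refining the proposal parameter toward the current quantile estimate at each iteration. The toy simulator `model`, the defensive-mixture proposal, and the update rule `mu = q` are all assumptions made for illustration.

```python
import math
import random

def model(x, rng):
    # Hypothetical stochastic black-box simulator: the output follows
    # the input with a small amount of simulation noise.
    return x + 0.1 * rng.gauss(0.0, 1.0)

def sample_proposal(mu, rng):
    # Defensive mixture proposal: half nominal N(0, 1), half shifted
    # N(mu, 1), which keeps the likelihood ratios bounded by 2.
    if rng.random() < 0.5:
        return rng.gauss(0.0, 1.0)
    return rng.gauss(mu, 1.0)

def likelihood_ratio(x, mu):
    # Nominal N(0, 1) density over the mixture density; the shared
    # 1/sqrt(2*pi) normalizing constants cancel.
    phi0 = math.exp(-0.5 * x * x)
    phi_mu = math.exp(-0.5 * (x - mu) ** 2)
    return phi0 / (0.5 * phi0 + 0.5 * phi_mu)

def weighted_quantile(values, weights, p):
    # p-quantile of the self-normalized weighted empirical distribution.
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    acc = 0.0
    for v, w in pairs:
        acc += w
        if acc >= p * total:
            return v
    return pairs[-1][0]

def adaptive_is_quantile(p=0.999, n_per_iter=4000, n_iters=4, seed=1):
    rng = random.Random(seed)
    mu = 0.0  # proposal shift, refined toward the quantile each iteration
    q = 0.0
    for _ in range(n_iters):
        xs = [sample_proposal(mu, rng) for _ in range(n_per_iter)]
        ys = [model(x, rng) for x in xs]
        ws = [likelihood_ratio(x, mu) for x in xs]
        q = weighted_quantile(ys, ws, p)
        # Refine the proposal parameter toward the current quantile
        # estimate; in this toy model the input scale tracks the output
        # scale, so q itself is a reasonable shift. Real problems need a
        # model-specific mapping from the output quantile to an input region.
        mu = q
    return q
```

With a standard normal input and the noise level above, the true 0.999-quantile of the output is roughly 3.1; after the first iteration the proposal concentrates half its draws near that tail, so far fewer simulator runs land in the uninformative bulk of the input distribution than under crude Monte Carlo.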
Award ID(s):
1741166
PAR ID:
10456444
Author(s) / Creator(s):
 ;  ;  ;  
Publisher / Repository:
Wiley Blackwell (John Wiley & Sons)
Date Published:
Journal Name:
Naval Research Logistics (NRL)
Volume:
67
Issue:
7
ISSN:
0894-069X
Page Range / eLocation ID:
p. 524-547
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Adaptive random search approaches have been shown to be effective for global optimization problems, where under certain conditions the expected performance time increases only linearly with dimension. However, previous analyses assume that the objective function can be observed directly. We consider the case where the objective function must be estimated, often from noisy observations, as in simulation. We present a finite-time analysis of algorithm performance that combines estimation with a sampling distribution. We introduce a framework called Hesitant Adaptive Search with Estimation and derive an upper bound on function evaluations that is cubic in dimension, under certain conditions. We extend the framework to Quantile Adaptive Search with Estimation, which focuses sampling on points from a series of nested quantile level sets. The analyses suggest that computational effort is better spent sampling improving points than refining estimates of objective function values during the progress of an adaptive search algorithm.
  2. Abstract In this paper we provide a thorough investigation of the cluster sampling scheme for Morris' elementary effects method (MM), a popular model‐free factor screening method that originated in the design and analysis of computational experiments. We first study the sampling mechanism underpinning the two sampling schemes of MM (i.e., cluster sampling and noncluster sampling) and unveil its nature as a two‐level nested sampling process. This in‐depth understanding sets up a foundation for tackling two important aspects of cluster sampling: budget allocation and sampling plan. On the one hand, we study the budget allocation problem for cluster sampling under the analysis of variance framework and derive optimal budget allocations for efficient estimation of the importance measures. On the other hand, we devise an efficient cluster sampling algorithm with two variants to achieve enhanced statistical properties. The numerical evaluations demonstrate the superiority of the proposed cluster sampling algorithm and the derived budget allocations (used both separately and in conjunction) over existing cluster and noncluster sampling schemes.
  3. Abstract This paper proposes two sequential metamodel‐based methods for level‐set estimation (LSE) that leverage the uniform bound built on stochastic kriging: predictive variance reduction (PVR) and expected classification improvement (ECI). We show that PVR and ECI possess desirable theoretical performance guarantees and provide closed‐form expressions for their respective sequential sampling criteria to seek the next design point for performing simulation runs, allowing computationally efficient one‐iteration look‐ahead updates. To enhance understanding, we reveal the connection between PVR and ECI's sequential sampling criteria. Additionally, we propose integrating a budget allocation feature with PVR and ECI, which improves computational efficiency and potentially enhances robustness to the impacts of heteroscedasticity. Numerical studies demonstrate the superior performance of the proposed methods compared to state‐of‐the‐art benchmarking approaches when given a fixed simulation budget, highlighting their effectiveness in addressing LSE problems. 
  4. Abstract We address the problem of adaptive minimax density estimation on $$\mathbb{R}^{d}$$ with $$L_{p}$$ loss functions under Huber’s contamination model. To investigate the contamination effect on the optimal estimation of the density, we first establish the minimax rate with the assumption that the density is in an anisotropic Nikol’skii class. We then develop a data-driven bandwidth selection procedure for kernel estimators, which can be viewed as a robust generalization of the Goldenshluger-Lepski method. We show that the proposed bandwidth selection rule can lead to the estimator being minimax adaptive to either the smoothness parameter or the contamination proportion. When both of them are unknown, we prove that finding any minimax-rate adaptive method is impossible. Extensions to smooth contamination cases are also discussed. 
  5. Summary Quantile regression has become a widely used tool for analysing competing risk data. However, quantile regression methods for competing risk data with a continuous mark remain scarce. The mark variable is an extension of the cause of failure in a classical competing risk model, where the cause of failure is replaced by a continuous mark observed only at uncensored failure times. An example of a continuous mark variable is the genetic distance that measures dissimilarity between the infecting virus and the virus contained in the vaccine construct. In this article, we propose a novel mark-specific quantile regression model. The proposed estimation method borrows strength from data in a neighbourhood of a mark and is based on an induced smoothed estimating equation, which is very different from the existing methods for competing risk data with discrete causes. The asymptotic properties of the resulting estimators are established across the mark and quantile continuums. In addition, a mark-specific quantile-type vaccine efficacy is proposed and its statistical inference procedures are developed. Simulation studies are conducted to evaluate the finite sample performance of the proposed estimation and hypothesis testing procedures. An application to the first HIV vaccine efficacy trial is provided.