

Title: Bayesian planning of step‐stress accelerated degradation tests under various optimality criteria
Abstract

Step‐stress accelerated degradation testing (SSADT) has become a common approach to predicting the lifetime of highly reliable products that are unlikely to fail in a reasonable time under use conditions or even elevated stress conditions. In the literature, the planning of SSADT has been widely investigated for stochastic degradation processes, such as Wiener processes and gamma processes. In this paper, we model the optimal SSADT planning problem from a Bayesian perspective and optimize test plans by determining both the stress levels and the allocation of inspections. A large‐sample approximation is used to derive the asymptotic Bayesian utility functions under three planning criteria. A revisited LED lamp example is presented to illustrate our method. The comparison with optimal plans from previous studies demonstrates the necessity of considering the stress levels and inspection allocations simultaneously.
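As a rough illustration of the planning idea only (not the paper's exact formulation), the sketch below optimizes a two-level SSADT plan for a Wiener degradation process with an assumed exponential drift-stress link and known diffusion. It evaluates a large-sample Bayesian D-type utility (the prior-averaged log-determinant of the Fisher information) by Monte Carlo over prior draws and searches over candidate lower stress levels and inspection allocations; the model form, prior, and all constants are illustrative assumptions.

```python
# A minimal sketch: Bayesian D-type planning of a two-level SSADT for a Wiener degradation
# process with a log-linear drift-stress link. Model choices, prior, and constants are
# illustrative assumptions, not the paper's formulation.
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
N_TOTAL, DT, SIGMA = 30, 1.0, 0.5                         # total inspections, interval, diffusion (assumed known)
prior = rng.normal([[-1.0], [2.0]], 0.3, size=(2, 200))   # prior draws of the drift parameters (a, b)

def fisher_info(a, b, x_levels, n_alloc):
    """Fisher information of (a, b) from independent Wiener increments
    N(mu_k*DT, SIGMA^2*DT) with mu_k = exp(a + b*x_k), n_k inspections at stress x_k."""
    I = np.zeros((2, 2))
    for x, n in zip(x_levels, n_alloc):
        mu = np.exp(a + b * x)
        g = np.array([mu, mu * x])                        # gradient of mu_k w.r.t. (a, b)
        I += n * DT / SIGMA**2 * np.outer(g, g)
    return I

def expected_utility(x_levels, n_alloc):
    """Large-sample Bayesian D-type utility: prior mean of log det I(theta; plan)."""
    return np.mean([np.linalg.slogdet(fisher_info(a, b, x_levels, n_alloc))[1]
                    for a, b in prior.T])

# Grid search over the lower stress level and the inspection allocation;
# the higher level is fixed at the maximum standardized stress x = 1.
best = max(((x1, n1) for x1, n1 in product(np.linspace(0.2, 0.9, 8), range(5, N_TOTAL - 4))),
           key=lambda p: expected_utility([p[0], 1.0], [p[1], N_TOTAL - p[1]]))
print("lower stress level, inspections at lower level:", best)
```

Because stress levels and the allocation enter the utility jointly, the search illustrates why optimizing them simultaneously can differ from fixing one and tuning the other.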

 
Award ID(s): 1726445
NSF-PAR ID: 10067263
Publisher / Repository: Wiley Blackwell (John Wiley & Sons)
Journal Name: Applied Stochastic Models in Business and Industry
Volume: 35
Issue: 3
ISSN: 1524-1904
Page Range / eLocation ID: p. 537-551
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

    The accelerated degradation test (ADT) is an efficient tool for assessing the lifetime information of highly reliable products. However, conducting an ADT is very expensive, so designing a cost‐constrained ADT plan is a challenging issue for reliability analysts. Taking the experimental cost into consideration, this paper proposes a semi‐analytical procedure to globally determine the total sample size, the testing stress levels, the measurement frequencies, and the number of measurements (within a degradation path) under a class of exponential dispersion degradation models. The proposed method is also extended to the global planning of a three‐level compromise plan. Compared with conventional optimal plans obtained by grid‐search algorithms, the proposed method not only provides better design insights for conducting an ADT plan but also offers an efficient algorithm for obtaining a cost‐constrained ADT plan.
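    The following sketch shows only the cost-constrained idea, not the paper's semi-analytical procedure under exponential dispersion models: it enumerates plans (sample size, measurements per unit, measurement interval) within an assumed linear cost budget and keeps the one minimizing a stand-in variance proxy. The cost coefficients, budget, and proxy are illustrative assumptions.

```python
# A minimal sketch of cost-constrained ADT plan selection; the cost model and the
# variance proxy are illustrative stand-ins, not the paper's criterion.
from itertools import product

C_ITEM, C_MEAS, C_OP, BUDGET = 50.0, 2.0, 1.0, 2000.0   # assumed unit costs and budget

def total_cost(n, m, f):
    # items + measurements + operating time (each unit is observed for m*f time units)
    return C_ITEM * n + C_MEAS * n * m + C_OP * n * m * f

def variance_proxy(n, m, f):
    # stand-in for the asymptotic variance of a lifetime quantile estimate:
    # shrinks with more units, more measurements, and a longer observation window
    return 1.0 / (n * m * f)

feasible = [(n, m, f) for n, m, f in product(range(5, 41), range(5, 31), (1, 2, 4, 8))
            if total_cost(n, m, f) <= BUDGET]
best = min(feasible, key=lambda p: variance_proxy(*p))
print("n, m, f =", best, "cost =", total_cost(*best))
```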

     
  2. Abstract

    Water supply infrastructure planning in groundwater-dependent regions is often challenged by uncertainty in future groundwater resource availability. Many major aquifer systems face long-term water table decline due to unsustainable withdrawals. However, many regions, especially those in the developing world, have a scarcity of groundwater data. This creates large uncertainties in groundwater resource predictions and decisions about whether to develop alternative supply sources. Developing infrastructure too soon can lead to unnecessary and expensive irreversible investments, but waiting too long can threaten water supply reliability. This study develops an adaptive infrastructure planning framework that applies Bayesian learning on groundwater observations to assess opportunities to learn about groundwater availability in the future and adapt infrastructure plans. This approach allows planners in data-scarce regions to assess under what conditions a flexible infrastructure planning approach, in which initial plans are made but infrastructure development is deferred, can mitigate the risk of overbuilding infrastructure while maintaining water supply reliability in the face of uncertainty. This framework connects engineering options analysis from infrastructure planning to groundwater resources modeling. We demonstrate a proof-of-concept on a desalination planning case for the city of Riyadh, Saudi Arabia, where poor characterization of a fossil aquifer creates uncertainty in how long current groundwater resources can reliably supply demand. We find that a flexible planning approach reduces the risk of overbuilding infrastructure by 40% compared with a traditional static planning approach, with minimal reliability risk (<1%). This striking result may be explained by the slow-evolving nature of groundwater decline, which provides time for planners to react, in contrast to more sudden risks such as flooding where tradeoffs between cost and reliability risk are heightened. This Bayesian approach shows promise for many civil infrastructure domains by providing a method to quantify learning in environmental modeling and assess the effectiveness of adaptive planning.
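    A minimal sketch of the flexible-planning mechanism, not the study's groundwater model: an uncertain linear water-table decline rate is learned from noisy annual observations via a conjugate normal update, and the deferred infrastructure is triggered only when a pessimistic posterior estimate implies the supply threshold will be reached within the construction lead time. The prior, noise level, threshold, and lead time are all illustrative assumptions.

```python
# A minimal sketch: Bayesian learning of a water-table decline rate driving a deferred
# build decision. All numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
mu, var = 0.5, 0.3**2                    # prior on decline rate (m/yr)
obs_var = 0.2**2                         # measurement noise on the annual head drop
true_rate, head, threshold, lead_time = 0.8, 40.0, 20.0, 5.0

for year in range(1, 31):
    drop = rng.normal(true_rate, np.sqrt(obs_var))   # observed head drop this year
    head -= drop
    # conjugate normal-normal update of the decline-rate belief
    post_var = 1.0 / (1.0 / var + 1.0 / obs_var)
    mu = post_var * (mu / var + drop / obs_var)
    var = post_var
    # a pessimistic (95th-percentile) decline rate drives the build trigger
    rate_hi = mu + 1.645 * np.sqrt(var)
    years_left = (head - threshold) / rate_hi
    if years_left <= lead_time:
        print(f"year {year}: trigger deferred build ({years_left:.1f} yr of margin left)")
        break
```

The deferral is valuable precisely because the decline evolves slowly: each year of observations tightens the posterior before the irreversible investment must be committed.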

     
  3. Deep neural networks (DNNs) have started to find their role in the modern healthcare system. DNNs are being developed for diagnosis, prognosis, treatment planning, and outcome prediction for various diseases. With the increasing number of applications of DNNs in modern healthcare, their trustworthiness and reliability are becoming increasingly important. An essential aspect of trustworthiness is detecting the performance degradation and failure of deployed DNNs in medical settings. The softmax output values produced by DNNs are not a calibrated measure of model confidence. Softmax probabilities are generally higher than the actual model confidence. The model confidence-accuracy gap further increases for wrong predictions and noisy inputs. We employ recently proposed Bayesian deep neural networks (BDNNs) to learn uncertainty in the model parameters. These models simultaneously output the predictions and a measure of confidence in the predictions. By testing these models under various noisy conditions, we show that the (learned) predictive confidence is well calibrated. We use these reliable confidence values for monitoring performance degradation and failure detection in DNNs. We propose two different failure detection methods. In the first method, we define a fixed threshold value based on the behavior of the predictive confidence with changing signal-to-noise ratio (SNR) of the test dataset. The second method learns the threshold value with a neural network. The proposed failure detection mechanisms seamlessly abstain from making decisions when the confidence of the BDNN is below the defined threshold and hold the decision for manual review. As a result, the accuracy of the models improves on the unseen test samples. We tested our proposed approach on three medical imaging datasets: PathMNIST, DermaMNIST, and OrganAMNIST, under different levels and types of noise. An increase in the noise of the test images increases the number of abstained samples. BDNNs are inherently robust and show more than 10% accuracy improvement with the proposed failure detection methods. An increased number of abstained samples or an abrupt increase in predictive variance indicates model performance degradation or possible failure. Our work has the potential to improve the trustworthiness of DNNs and enhance user confidence in the model predictions.
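    A minimal sketch of the fixed-threshold abstention rule, not the paper's trained BDNNs or datasets: given Monte Carlo predictive samples (simulated here as random softmax outputs), each input is predicted only when the mean predictive confidence clears a threshold and is otherwise held for manual review. The simulated outputs and the threshold value are illustrative assumptions.

```python
# A minimal sketch of confidence-thresholded abstention from Bayesian predictive samples.
# The "forward passes" are simulated; in practice they would come from a BDNN.
import numpy as np

rng = np.random.default_rng(2)
T, N, K = 30, 5, 3                       # MC samples, test inputs, classes
logits = rng.normal(size=(T, N, K)) + np.array([2.0, 0.0, 0.0])   # fake MC forward passes
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)

mean_probs = probs.mean(axis=0)          # predictive distribution per input
confidence = mean_probs.max(axis=1)      # predictive confidence
prediction = mean_probs.argmax(axis=1)

THRESHOLD = 0.7                          # assumed; the paper ties this to the test-set SNR
for i, (c, y) in enumerate(zip(confidence, prediction)):
    decision = f"predict class {y}" if c >= THRESHOLD else "abstain -> manual review"
    print(f"input {i}: confidence {c:.2f} -> {decision}")
```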
  4. SUMMARY

    Plants respond to low temperatures by altering the mRNA abundance of thousands of genes contributing to numerous physiological and metabolic processes that allow them to adapt. At the post‐transcriptional level, these cold stress‐responsive transcripts undergo alternative splicing, microRNA‐mediated regulation and alternative polyadenylation, amongst others. Recently, m6A, m5C and other mRNA modifications that can affect the regulation and stability of RNA were discovered, thus revealing another layer of post‐transcriptional regulation that plays an important role in modulating gene expression. The importance of m6A in plant growth and development has been appreciated, although its significance under stress conditions is still underexplored. To assess the role of m6A modifications during cold stress responses, methylated RNA immunoprecipitation sequencing was performed in Arabidopsis seedlings exposed to low temperature stress (4°C) for 24 h. This transcriptome‐wide m6A analysis revealed large‐scale shifts in this modification in response to low temperature stress. Because m6A is known to affect transcript stability/degradation and translation, we investigated these possibilities. Interestingly, we found that cold‐enriched m6A‐containing transcripts demonstrated the largest increases in transcript abundance coupled with increased ribosome occupancy under cold stress. The significance of the m6A epitranscriptome for plant cold tolerance was further assessed using the mta mutant, in which the major m6A methyltransferase gene is mutated. Compared with the wild type, along with differences in CBF and COR gene expression levels, the mta mutant exhibited hypersensitivity to cold treatment as determined by primary root growth, biomass, and reactive oxygen species accumulation. Furthermore, and most importantly, both non‐acclimated and cold‐acclimated mta mutants demonstrated hypersensitivity to freezing. Taken together, these findings suggest a critical role for the epitranscriptome in the cold tolerance of Arabidopsis.

     
  5. Summary

    A Bayesian framework for group testing under dilution effects has been developed, using lattice-based models. This work has particular relevance given the pressing public health need to enhance testing capacity for coronavirus disease 2019 and future pandemics, and the need for wide-scale and repeated testing for surveillance under constantly varying conditions. The proposed Bayesian approach allows for dilution effects in group testing and for general test response distributions beyond just binary outcomes. It is shown that even under strong dilution effects, an intuitive group testing selection rule that relies on the model order structure, referred to as the Bayesian halving algorithm, has attractive optimal convergence properties. Analogous look-ahead rules that can reduce the number of stages in classification by selecting several pooled tests at a time are proposed and evaluated as well. Group testing is demonstrated to provide great savings over individual testing in the number of tests needed, even for moderately high prevalence levels. However, there is a trade-off: a higher number of testing stages and increased variability. A web-based calculator is introduced to assist in weighing these factors and to guide decisions on when and how to pool under various conditions. High-performance distributed computing methods have also been implemented for considering larger pool sizes, when the savings from group testing can be even more dramatic.
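    A minimal sketch of halving-style group testing under a dilution effect, not the paper's lattice-based Bayesian halving algorithm: pooled test sensitivity decays with pool size, positive pools are recursively split in half and retested, and the resulting test count is compared with individual testing. The prevalence, dilution model, specificity, and pool sizes are illustrative assumptions.

```python
# A minimal sketch of recursive halving of positive pools with a dilution effect.
# The dilution model and all constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
PREVALENCE, SPECIFICITY = 0.05, 0.99

def pooled_test(pool_status):
    """Noisy pooled test result; sensitivity shrinks as positives are diluted in the pool."""
    k = len(pool_status)
    if pool_status.any():
        sensitivity = 0.99 * (pool_status.sum() / k) ** 0.2   # assumed dilution model
        return rng.random() < sensitivity
    return rng.random() > SPECIFICITY

def classify(pool_status, tests=0):
    """Recursively halve positive pools; return (classifications, number of tests used)."""
    result = pooled_test(pool_status)
    tests += 1
    if not result:
        return np.zeros(len(pool_status), dtype=bool), tests   # whole pool cleared
    if len(pool_status) == 1:
        return np.ones(1, dtype=bool), tests                   # individual flagged
    mid = len(pool_status) // 2
    left, tests = classify(pool_status[:mid], tests)
    right, tests = classify(pool_status[mid:], tests)
    return np.concatenate([left, right]), tests

status = rng.random(64) < PREVALENCE
calls, n_tests = [], 0
for start in range(0, 64, 16):                                  # screen in initial pools of 16
    c, n_tests = classify(status[start:start + 16], n_tests)
    calls.append(c)
calls = np.concatenate(calls)
print(f"{n_tests} pooled tests vs 64 individual tests; "
      f"{(calls == status).mean():.0%} agreement with true status")
```

The dilution term makes large negative-looking pools riskier, which is the trade-off between test savings, extra stages, and classification variability described above.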

     