

Title: Navigating Uncertainty: Ambiguity Quantification in Fingerprinting-based Indoor Localization
In this paper, we present a conformal prediction (CP) based method to evaluate the performance of a fingerprinting localization system through uncertainty quantification. The proposed method is a standalone module that is compatible with any well-trained fingerprint classifier without incurring extra training costs. It provides rigorous statistical guarantees for revealing true labels in fingerprinting multi-class classification problems with high efficiency. Uncertainty quantification of the predictions is accomplished by leveraging a small calibration dataset and a given error tolerance level. Three metrics are introduced to quantify the uncertainty of the CP-based method from the perspectives of efficiency, adaptivity, and accuracy, respectively. The proposed method allows developers to track the model state with minimal effort and to evaluate the reliability of their model and measurements, for example in a dynamic environment. The technique therefore avoids the intrinsic label inaccuracy and the additional labor cost of ground-truth collection. We evaluate the proposed method and metrics in two representative indoor environments using vanilla fingerprint-based localization models with extensive experiments. Our experimental results show that the proposed method can successfully quantify the uncertainty of predictions.
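The split-conformal recipe the abstract describes (a small calibration set plus an error tolerance level) can be sketched in a few lines. The function name and the use of softmax probabilities as the nonconformity score are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def conformal_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction for multi-class fingerprint classification.

    cal_probs  : (n, K) classifier probabilities on a held-out calibration set
    cal_labels : (n,)   true location labels for the calibration set
    test_probs : (m, K) classifier probabilities on test points
    alpha      : error tolerance; each set covers the true label w.p. >= 1 - alpha
    """
    n = len(cal_labels)
    # Nonconformity score: 1 - probability assigned to the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile of the calibration scores.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, level, method="higher")
    # Prediction set: every class whose nonconformity does not exceed q.
    return [np.where(1.0 - p <= q)[0] for p in test_probs]
```

Average prediction-set size then serves as an efficiency metric: a confident, well-calibrated model yields small sets, while ambiguous fingerprints (e.g., after an environment change) produce larger ones.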
Award ID(s):
2245607
PAR ID:
10598003
Author(s) / Creator(s):
; ; ; ; ;
Publisher / Repository:
IEEE
Date Published:
ISBN:
979-8-3503-9229-6
Page Range / eLocation ID:
123 to 128
Subject(s) / Keyword(s):
Channel State Information (CSI), Conformal Prediction (CP), Indoor localization, Uncertainty measurement
Format(s):
Medium: X
Location:
Melbourne, Australia
Sponsoring Org:
National Science Foundation
More Like this
  1. The uncertainty quantification of prediction models (e.g., neural networks) is crucial for their adoption in many robotics applications. This is arguably as important as making accurate predictions, especially for safety-critical applications such as self-driving cars. This paper proposes our approach to uncertainty quantification in the context of visual localization for autonomous driving, where we predict locations from images. Our proposed framework estimates probabilistic uncertainty by creating a sensor error model that maps an internal output of the prediction model to the uncertainty. The sensor error model is created using multiple image databases of visual localization, each with ground-truth locations. We demonstrate the accuracy of our uncertainty prediction framework using the Ithaca365 dataset, which includes variations in lighting, weather (sunny, snowy, night), and alignment errors between databases. We analyze both the predicted uncertainty and its incorporation into a Kalman-based localization filter. Our results show that prediction error variations increase with poor weather and lighting conditions, leading to greater uncertainty and more outliers, both of which can be predicted by our proposed uncertainty model. Additionally, our probabilistic error model enables the filter to remove ad hoc sensor gating, as the uncertainty automatically adjusts the model to the input data.
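A minimal sketch of the idea above: feed a per-measurement noise covariance, produced by a learned sensor error model, into a standard Kalman measurement update. The `error_model` mapping below is a hypothetical stand-in for the paper's learned model, assuming an internal confidence score as its input.

```python
import numpy as np

def kalman_update(x, P, z, R):
    """Standard Kalman measurement update for a position measurement z.

    x, P : state estimate and covariance
    R    : measurement noise covariance, supplied per measurement by the
           sensor error model rather than a fixed, hand-tuned constant
    """
    H = np.eye(len(z), len(x))           # direct observation of position
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

def error_model(confidence, a=4.0, b=0.5):
    """Hypothetical mapping from an internal network confidence score to
    measurement variance: lower confidence -> larger predicted noise."""
    var = a * np.exp(-b * confidence) + 0.1
    return var * np.eye(2)
```

Because R now tracks the predicted uncertainty, low-confidence measurements are automatically down-weighted, which is what makes ad hoc sensor gating unnecessary.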
  2. Quantifying uncertainty in machine learning models is a critical step toward reducing human verification effort by detecting predictions with low confidence. This paper proposes a method for uncertainty quantification (UQ) in table structure recognition (TSR). The proposed UQ method is built upon a mixture-of-experts approach termed Test-Time Augmentation (TTA). Our key idea is to enrich and diversify the table representations in order to spotlight the cells with high recognition uncertainty. To evaluate effectiveness, we propose two heuristics to differentiate highly uncertain cells from normal cells: masking and cell complexity quantification. Masking varies the pixel intensity to probe the detection uncertainty. Cell complexity quantification gauges the uncertainty of each cell by its topological relation with neighboring cells. Evaluation results on standard benchmark datasets demonstrate that the proposed method is effective at quantifying uncertainty in TSR models. To the best of our knowledge, this study is the first of its kind to enable UQ in TSR tasks.
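The TTA mechanism can be sketched in a few lines: run the recognizer on several augmented views of the table image and treat the per-cell spread of the predictions as the uncertainty signal. `model` and the augmentation callables are placeholders for a real TSR pipeline.

```python
import numpy as np

def tta_uncertainty(model, image, augment_fns):
    """Test-Time Augmentation uncertainty: predictions that stay stable
    across augmentations (e.g., pixel-intensity masking) are trusted;
    cells whose predictions vary get a high uncertainty score.

    model       : callable, image -> (num_cells,) per-cell scores
    augment_fns : list of callables, image -> augmented image
    """
    preds = np.stack([model(fn(image)) for fn in augment_fns])
    return preds.mean(axis=0), preds.std(axis=0)  # score, uncertainty
```

Cells whose standard deviation exceeds a chosen threshold would then be routed to human verification.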
  3. Conformal prediction (CP) is an important tool for distribution-free predictive uncertainty quantification. Yet a major challenge is balancing computational efficiency and prediction accuracy, particularly for multiple predictions. We propose Leave-One-Out Stable Conformal Prediction (LOO-StabCP), a novel method that speeds up full conformal prediction using algorithmic stability without sample splitting. By leveraging leave-one-out stability, our method is much faster at handling a large number of prediction requests than the existing RO-StabCP method, which is based on replace-one stability. We derive stability bounds for several popular machine learning tools: regularized loss minimization (RLM) and stochastic gradient descent (SGD), as well as kernel methods, neural networks, and bagging. Our method is theoretically justified and demonstrates superior numerical performance on synthetic and real-world data. We apply our method to a screening problem, where its effective exploitation of training data leads to improved test power compared to a state-of-the-art method based on split conformal prediction.
  4. This paper presents a novel framework for training convolutional neural networks (CNNs) to quantify the impact of gradual and abrupt uncertainties in the form of adversarial attacks. Uncertainty quantification is achieved by combining the CNN with a Gaussian process (GP) classifier algorithm. The variance of the GP quantifies the impact of the uncertainties, especially their effect on object classification tasks. Learning from uncertainty provides the proposed CNN-GP framework with flexibility, reliability, and robustness to adversarial attacks. The proposed approach includes training the network under noisy conditions. This is accomplished by comparing predictions with classification labels via the Kullback-Leibler divergence, Wasserstein distance, and maximum correntropy. The network performance is tested on the classical MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100 datasets. Further tests on robustness to both black-box and white-box attacks are also carried out for MNIST. The results show that testing accuracy improves for networks that backpropagate uncertainty compared to methods that do not quantify the impact of uncertainties. A comparison with a state-of-the-art Monte Carlo dropout method is also presented, and the CNN-GP framework is shown to outperform it in reliability and computational efficiency.
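The core mechanism, GP posterior variance as the uncertainty signal on top of network outputs, can be sketched with plain NumPy. A GP regression head over hypothetical CNN feature vectors stands in here for the paper's CNN-GP classifier; the point is only that variance grows for inputs far from the training data, e.g. under adversarial perturbation.

```python
import numpy as np

def rbf(a, b, length_scale=1.0):
    """Squared-exponential kernel between two sets of feature vectors."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(train_feats, train_y, test_feats, noise=1e-2):
    """GP posterior mean and variance over (hypothetical) CNN features.

    The predictive variance quantifies uncertainty: it is small near the
    training data and approaches the prior variance far from it.
    """
    K = rbf(train_feats, train_feats) + noise * np.eye(len(train_feats))
    Ks = rbf(test_feats, train_feats)
    Kss = rbf(test_feats, test_feats)
    Kinv = np.linalg.inv(K)
    mean = Ks @ Kinv @ train_y
    var = np.diag(Kss - Ks @ Kinv @ Ks.T)
    return mean, var
```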
  5. Uncertainty is a common feature in first-principles models that are widely used in various engineering problems. Uncertainty quantification (UQ) has become an essential procedure to improve the accuracy and reliability of model predictions. Polynomial chaos expansion (PCE) has been used as an efficient approach for UQ by approximating uncertainty with orthogonal polynomial basis functions of standard distributions (e.g., normal) chosen from the Askey scheme. However, uncertainty in practice may not be represented well by standard distributions. In this case, the convergence rate and accuracy of PCE-based UQ cannot be guaranteed. Further, when models involve non-polynomial forms, PCE-based UQ can be computationally impractical in the presence of many parametric uncertainties. To address these issues, Gram–Schmidt (GS) orthogonalization and the generalized dimension reduction method (gDRM) are integrated with the PCE in this work to deal with many parametric uncertainties that follow arbitrary distributions. The performance of the proposed method is demonstrated on three benchmark cases, including two chemical engineering problems, in terms of UQ accuracy and computational efficiency by comparison with available algorithms (e.g., non-intrusive PCE).
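For a single standard-normal input, the PCE machinery reduces to a short sketch: project the model output onto probabilists' Hermite polynomials by Gauss–Hermite quadrature and read the mean and variance directly off the coefficients. This is a minimal illustration of plain non-intrusive PCE, not the paper's GS/gDRM method, which targets arbitrary distributions and many uncertain parameters.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_mean_var(f, order=6):
    """Non-intrusive PCE of y = f(x) with x ~ N(0, 1).

    Coefficients c_k = E[f(x) He_k(x)] / k!  (since E[He_k(x)^2] = k!),
    so mean = c_0 and var = sum_{k>=1} c_k^2 * k!.
    """
    nodes, weights = hermegauss(order + 1)
    weights = weights / np.sqrt(2 * np.pi)       # normalize to the N(0,1) measure
    fx = f(nodes)
    coeffs = []
    for k in range(order + 1):
        basis = np.zeros(k + 1); basis[k] = 1.0  # select He_k
        phi_k = hermeval(nodes, basis)
        coeffs.append(np.sum(weights * fx * phi_k) / factorial(k))
    mean = coeffs[0]
    var = sum(c**2 * factorial(k) for k, c in enumerate(coeffs[1:], start=1))
    return mean, var
```

For f(x) = x² + 1 this recovers mean 2 and variance 2 exactly, since x² + 1 = He₂(x) + 2·He₀(x).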