

Title: Entropic Uncertainty for Biased Measurements
Entropic uncertainty relations are powerful tools, especially in quantum cryptography. They typically bound the amount of uncertainty a third-party adversary may hold on a measurement outcome in terms of the overlap between the two measurement bases. However, when those bases are biased towards one another, standard entropic uncertainty relations do not always provide optimal lower bounds on the entropy. Here, we derive a new entropic uncertainty relation, for certain quantum states, which can provide a significantly higher bound even when the two measurement bases are no longer mutually unbiased. We evaluate our bound on two different quantum cryptographic protocols, including BB84 with faulty/biased measurement devices, and show that our new bound can produce substantially higher key rates in several scenarios when compared with prior work using standard entropic uncertainty relations.
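For context, the "standard" entropic uncertainty relation the abstract compares against is the Maassen–Uffink bound and its smooth tripartite generalization; the notation below is the textbook form, not necessarily the paper's:

```latex
% Maassen-Uffink: for measurements in bases {|phi_x>} and {|psi_z>},
%   H(X) + H(Z) >= q,  with  q = -log2 max_{x,z} |<phi_x|psi_z>|^2.
% Tripartite (QKD-relevant) form, with adversary E and receiver B:
H(Z \mid E) + H(X \mid B) \;\ge\; q,
\qquad
q \;=\; -\log_2 \max_{x,z} \bigl|\langle \phi_x \mid \psi_z \rangle\bigr|^2 .
```

For mutually unbiased bases on a d-dimensional system every overlap is 1/d, so q = log2 d is maximal; as the bases become biased towards one another the maximum overlap grows and q shrinks, which is exactly the regime where this bound can become loose and where the paper's new relation targets improvements.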
Award ID(s):
2143644
PAR ID:
10492848
Author(s) / Creator(s):
Publisher / Repository:
IEEE
Date Published:
Journal Name:
IEEE QCE
ISBN:
979-8-3503-4323-6
Page Range / eLocation ID:
1220 to 1230
Format(s):
Medium: X
Location:
Bellevue, WA, USA
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract Two‐way quantum key distribution (QKD) protocols utilize bi‐directional quantum communication to establish a shared secret key. Due to the increased attack surface, security analyses remain challenging. Here a high‐dimensional variant of the Ping Pong protocol is investigated and an information-theoretic security analysis in the finite‐key setting is performed. The main contribution of this work is a new proof methodology for two‐way quantum key distribution protocols, based on the quantum sampling framework of Bouman and Fehr, introduced in 2010, and on sampling‐based entropic uncertainty relations introduced by the authors in 2019. Only the Ping Pong protocol is investigated here, but these methods may be broadly applicable to other QKD protocols, especially those relying on two‐way channels. Along the way, some fascinating benefits of high‐dimensional quantum states applied to two‐way quantum communication are also shown.
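The quantum sampling framework mentioned above rests on a classical observation: the error rate seen on a small random test sample tightly constrains the error rate of the unsampled remainder. A minimal classical sketch of that idea (the function name and parameters are illustrative, not from the paper):

```python
import random

def sample_estimate(population, sample_size, seed=0):
    """Estimate the error rate of the *unsampled* part of a bit string
    from a small random sample, as in classical sampling arguments.
    Returns (observed rate on the sample, actual rate on the rest)."""
    rng = random.Random(seed)
    idx = set(rng.sample(range(len(population)), sample_size))
    sample = [population[i] for i in idx]
    rest = [population[i] for i in range(len(population)) if i not in idx]
    observed = sum(sample) / len(sample)
    actual = sum(rest) / len(rest)
    return observed, actual
```

With, say, a 10,000-bit string containing ~5% errors, a 1,000-bit sample typically pins down the remaining error rate to within a few percent; the Bouman–Fehr framework lifts exactly this kind of concentration statement to superpositions of error patterns on quantum states.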
  2. Measurement-based quantum computing (MBQC) is an alternative model of quantum computation that is equivalent to the standard gate-based model and is the preferred approach for several optical quantum computing architectures. In MBQC, a quantum computation is executed by preparing an entangled cluster state and then selectively measuring qubits. MBQC can be made fault-tolerant by creating an MBQC computation that executes the standard surface code, an approach known as "foliation." Recent results on gate-based quantum computing have demonstrated that in the presence of biased noise, a modified version of the surface code known as the XZZX code has much higher thresholds than the standard surface code. However, naively foliating the XZZX code does not result in a high-threshold fault-tolerant MBQC, because the foliation procedure does not preserve the noise bias of the physical qubits. To create a high-threshold fault-tolerant MBQC, we introduce a modified cluster state that preserves the bias, and use our modified cluster state to construct an MBQC computation that executes the XZZX code. Using full circuit-level noise simulations, we show that the threshold of our modified MBQC is higher than either the standard fault-tolerant MBQC or the naively foliated XZZX code in the presence of biased noise, demonstrating the advantage of our approach.
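The cluster states this abstract builds on are stabilizer states: the cluster state on a graph is the unique state fixed by K_a = X_a ∏_{b∼a} Z_b for every vertex a. A small numerical check of this textbook fact for a three-qubit linear cluster (this is the standard cluster state, not the modified one the paper introduces):

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.diag([1., -1.])
plus = np.ones(2) / np.sqrt(2)   # |+> state

def cz(n, a, b):
    """Controlled-Z between qubits a and b of an n-qubit register."""
    U = np.eye(2 ** n)
    for k in range(2 ** n):
        if (k >> (n - 1 - a)) & 1 and (k >> (n - 1 - b)) & 1:
            U[k, k] = -1.0
    return U

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Linear cluster state on 3 qubits: |+>^3, then CZ on each graph edge.
psi = cz(3, 0, 1) @ cz(3, 1, 2) @ kron(plus, plus, plus).ravel()

# Stabilizers K_a = X on vertex a, Z on each neighbor of a.
K1 = kron(X, Z, I2)
K2 = kron(Z, X, Z)
K3 = kron(I2, Z, X)
```

Each K_a leaves psi invariant (eigenvalue +1), which is what makes measurement outcomes on the cluster checkable and, after foliation, yields the spacetime syndrome of the underlying surface code.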
  3. Abstract The inconsistency between experiments in the measurements of the local Universe expansion rate, the Hubble constant, suggests unknown systematics in the existing experiments or new physics. Gravitational-wave standard sirens, a method to independently provide direct measurements of the Hubble constant, have the potential to address this tension. Before that, it is critical to ensure there are no substantial systematics in the standard siren method. A significant systematic has been identified when the viewing angle of the gravitational-wave sources, the compact binary coalescences, was inferred inaccurately from electromagnetic observations of the sources. Such a systematic has led to a more than 10% discrepancy in the standard siren Hubble constant measurements with the observations of binary neutron star merger, GW170817. In this Letter, we develop a new formalism to infer and mitigate this systematic. We demonstrate that the systematic uncertainty of the Hubble constant measurements can be reduced to a level smaller than their statistical uncertainty with 5, 10, and 20 binary neutron star merger observations. We show that our formalism successfully reduces the systematics even if the shape of the biased viewing angle distribution does not follow precisely the model we choose. Our formalism ensures unbiased standard siren Hubble constant measurements when the binary viewing angles are inferred from electromagnetic observations. 
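At lowest order the standard siren idea is just Hubble's law: the gravitational waveform yields the luminosity distance directly, the electromagnetic counterpart yields the recession velocity, and H0 ≈ cz/d_L. A toy estimate using round numbers in the vicinity of the GW170817 measurements (the exact values here are illustrative, not the published ones):

```python
# Toy standard-siren Hubble constant estimate via Hubble's law.
C_KM_S = 299_792.458  # speed of light in km/s

def hubble_constant(z, d_l_mpc):
    """H0 in km/s/Mpc from redshift z and luminosity distance in Mpc.
    Valid only at low redshift, where recession velocity v ~ c*z."""
    return C_KM_S * z / d_l_mpc

# Round numbers near GW170817 (z ~ 0.01, d_L ~ 40 Mpc; illustrative):
h0 = hubble_constant(0.01, 40.0)   # ~75 km/s/Mpc
```

The systematic discussed in the abstract enters through d_L: the waveform amplitude depends on both distance and viewing angle, so a biased inclination inferred from electromagnetic data shifts the recovered d_L and hence H0.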
  4. Although many fairness criteria have been proposed to ensure that machine learning algorithms do not exhibit or amplify our existing social biases, these algorithms are trained on datasets that can themselves be statistically biased. In this paper, we investigate the robustness of existing (demographic) fairness criteria when the algorithm is trained on biased data. We consider two forms of dataset bias: errors by prior decision makers in the labeling process, and errors in the measurement of the features of disadvantaged individuals. We analytically show that some constraints (such as Demographic Parity) can remain robust when facing certain statistical biases, while others (such as Equalized Odds) are significantly violated if trained on biased data. We provide numerical experiments based on three real-world datasets (the FICO, Adult, and German credit score datasets) supporting our analytical findings. While fairness criteria are primarily chosen under normative considerations in practice, our results show that naively applying a fairness constraint can lead to not only a loss in utility for the decision maker, but more severe unfairness when data bias exists. Thus, understanding how fairness criteria react to different forms of data bias presents a critical guideline for choosing among existing fairness criteria, or for proposing new criteria, when available datasets may be biased. 
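The two criteria contrasted in this abstract can be made concrete: Demographic Parity compares positive-prediction rates across groups, while Equalized Odds compares them within each true-label class. A minimal sketch (function names are ours, not from the paper):

```python
import numpy as np

def demographic_parity_gap(y_pred, group):
    """| P(yhat=1 | A=0) - P(yhat=1 | A=1) |"""
    return abs(y_pred[group == 0].mean() - y_pred[group == 1].mean())

def equalized_odds_gap(y_true, y_pred, group):
    """max over y in {0,1} of
    | P(yhat=1 | A=0, Y=y) - P(yhat=1 | A=1, Y=y) |"""
    gaps = []
    for y in (0, 1):
        g0 = y_pred[(group == 0) & (y_true == y)].mean()
        g1 = y_pred[(group == 1) & (y_true == y)].mean()
        gaps.append(abs(g0 - g1))
    return max(gaps)
```

Note the structural difference that drives the paper's robustness results: Equalized Odds conditions on the true label Y, so labeling errors by prior decision makers directly corrupt the quantity being equalized, whereas Demographic Parity does not depend on Y at all.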
  5. Abstract Fault-tolerant cluster states form the basis for scalable measurement-based quantum computation. Recently, new stabilizer codes for scalable circuit-based quantum computation have been introduced that have very high thresholds under biased noise, where the qubit predominantly suffers from one type of error, e.g. dephasing. However, extending these advances in stabilizer codes to generate high-threshold cluster states for biased noise has been a challenge, as the standard method for foliating stabilizer codes to generate fault-tolerant cluster states does not preserve the noise bias. In this work, we overcome this barrier by introducing a generalization of the cluster state that allows us to foliate stabilizer codes in a bias-preserving way. As an example of our approach, we construct a foliated version of the XZZX code, which we call the XZZX cluster state. We demonstrate that under a circuit-level noise model, our XZZX cluster state has a threshold more than double that of the usual cluster state when dephasing errors are more likely than errors that cause bit flips by a factor of order ~100 or more.
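The stabilizer-code machinery underlying both this abstract and item 2 reduces to Pauli (anti)commutation: two stabilizers are compatible iff their Pauli strings commute, which happens exactly when they anticommute on an even number of qubits. A generic check, applied to a schematic pair of overlapping XZZX-type stabilizers (the qubit layout is illustrative, not the paper's lattice):

```python
def paulis_commute(p, q):
    """Two Pauli strings commute iff they anticommute on an even
    number of positions (any position where both are non-identity
    and different, e.g. X vs Z)."""
    anti = sum(1 for a, b in zip(p, q)
               if a != 'I' and b != 'I' and a != b)
    return anti % 2 == 0

# Two XZZX-type stabilizers overlapping on two qubits, where one acts
# Z and the other X on each shared qubit: two anticommuting sites, an
# even number, so the stabilizers commute and can be measured jointly.
s1 = 'XZZXII'
s2 = 'IIXZZX'
```

The mixed X/Z pattern of XZZX stabilizers is what lets a single stabilizer detect both error types, and preserving that pattern through foliation is the crux of the bias-preserving construction described above.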