

Title: Variance-Based Sensitivity Analysis of Λ-type Quantum Memory

We examine the sensitivity of Λ-type optical quantum memories to experimental fluctuations using a variance-based analysis. The results agree with physical interpretations of quantum memory protocols and inform practical implementations.
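Since the abstract does not spell out the model or the estimator, the sketch below shows what a variance-based (Sobol-type) sensitivity analysis of a memory figure of merit can look like. Everything in it is an illustrative assumption rather than the paper's model: the toy efficiency function `eta`, the parameter names, and the fluctuation ranges are placeholders.

```python
import numpy as np

# Purely illustrative: a variance-based (Sobol-type) sensitivity sketch for a
# *toy* memory-efficiency model. The function eta(), the parameter names, and
# the fluctuation ranges are placeholders, not the model used in the paper.

def eta(x):
    """Toy storage efficiency vs. (detuning, control Rabi frequency, optical depth)."""
    detuning, rabi, od = x[..., 0], x[..., 1], x[..., 2]
    return (1 - np.exp(-od)) * rabi**2 / (rabi**2 + 1.0) * np.exp(-detuning**2)

def sobol_first_order(f, bounds, n=100_000, seed=0):
    """First-order Sobol indices via the Saltelli pick-freeze estimator."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    A = lo + (hi - lo) * rng.random((n, len(bounds)))
    B = lo + (hi - lo) * rng.random((n, len(bounds)))
    f_A, f_B = f(A), f(B)
    var = np.var(np.concatenate([f_A, f_B]))
    indices = []
    for i in range(len(bounds)):
        AB_i = A.copy()
        AB_i[:, i] = B[:, i]  # resample only parameter i from the second matrix
        indices.append(np.mean(f_B * (f(AB_i) - f_A)) / var)
    return np.array(indices)

# Assumed fluctuation ranges for (detuning, Rabi frequency, optical depth).
bounds = [(-0.2, 0.2), (0.8, 1.2), (2.0, 4.0)]
print(sobol_first_order(eta, bounds))  # fraction of output variance per parameter
```

The first-order index of each parameter estimates the fraction of the output variance explained by that parameter alone, so ranking the indices identifies which experimental fluctuation the memory is most sensitive to.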

 
Award ID(s): 1806572, 1839177, 1640968, 1936321
NSF-PAR ID: 10381518
Journal Name: Conference on Lasers and Electro-Optics
Page Range / eLocation ID: JTu3A.7
Sponsoring Org: National Science Foundation
More Like this
  1.
    Quantum computational supremacy arguments, which describe a way for a quantum computer to perform a task that cannot also be done by a classical computer, typically require some sort of computational assumption related to the limitations of classical computation. One common assumption is that the polynomial hierarchy (PH) does not collapse, a stronger version of the statement that P ≠ NP, which leads to the conclusion that any classical simulation of certain families of quantum circuits requires time scaling worse than any polynomial in the size of the circuits. However, the asymptotic nature of this conclusion prevents us from calculating exactly how many qubits these quantum circuits must have for their classical simulation to be intractable on modern classical supercomputers. We refine these quantum computational supremacy arguments and perform such a calculation by imposing fine-grained versions of the non-collapse conjecture. Our first two conjectures, poly3-NSETH(a) and per-int-NSETH(b), take specific classical counting problems related to the number of zeros of a degree-3 polynomial in n variables over F_2 or the permanent of an n × n integer-valued matrix, and assert that any non-deterministic algorithm that solves them requires 2^(cn) time steps, where c ∈ {a, b}. A third conjecture, poly3-ave-SBSETH(a′), asserts a similar statement about average-case algorithms living in the exponential-time version of the complexity class SBP. We analyze evidence for these conjectures and argue that they are plausible when a = 1/2, b = 0.999, and a′ = 1/2. Imposing poly3-NSETH(1/2) and per-int-NSETH(0.999), and assuming that the runtime of a hypothetical quantum circuit simulation algorithm would scale linearly with the number of gates/constraints/optical elements, we conclude that Instantaneous Quantum Polynomial-Time (IQP) circuits with 208 qubits and 500 gates, Quantum Approximate Optimization Algorithm (QAOA) circuits with 420 qubits and 500 constraints, and boson sampling circuits (i.e. linear optical networks) with 98 photons and 500 optical elements are large enough for the task of producing samples from their output distributions up to constant multiplicative error to be intractable on current technology. Imposing poly3-ave-SBSETH(1/2), we additionally rule out simulations with constant additive error for IQP and QAOA circuits of the same size. Without the assumption of linearly increasing simulation time, we can make analogous statements for circuits with slightly fewer qubits but requiring 10^4 to 10^7 gates.
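    To get a feel for the scale these conjectures imply, here is a back-of-envelope check (our arithmetic, not the paper's detailed accounting) of a 2^(an) lower bound with the conjectured a = 1/2, treating the 208 qubits as standing in for the n variables of the underlying counting problem, which is a simplification on our part:

```python
# Back-of-envelope only: scale of 2^(a*n) for the conjectured a = 1/2 and
# n = 208, versus an assumed ~10^18 ops/s classical supercomputer run for a year.
a, n = 0.5, 208
steps = 2.0 ** (a * n)                      # ~2e31 time steps
ops_per_year = 1e18 * 3600 * 24 * 365       # ~3e25 operations per machine-year
print(f"2^(a*n)              = {steps:.2e}")
print(f"one machine-year     = {ops_per_year:.2e}")
print(f"machine-years needed = {steps / ops_per_year:.1e}")   # ~6e5 years
```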
  2. Abstract

    To achieve universal quantum computation via general fault-tolerant schemes, stabilizer operations must be supplemented with other non-stabilizer quantum resources. Motivated by this necessity, we develop a resource theory for magic quantum channels to characterize and quantify the quantum ‘magic’ or non-stabilizerness of noisy quantum circuits. For qudit quantum computing with odd dimension d, it is known that quantum states with non-negative Wigner function can be efficiently simulated classically. First, inspired by this observation, we introduce a resource theory based on completely positive-Wigner-preserving quantum operations as free operations, and we show that they can be efficiently simulated via a classical algorithm. Second, we introduce two efficiently computable magic measures for quantum channels, called the mana and thauma of a quantum channel. As applications, we show that these measures not only provide fundamental limits on the distillable magic of quantum channels, but they also lead to lower bounds for the task of synthesizing non-Clifford gates. Third, we propose a classical algorithm for simulating noisy quantum circuits, whose sample complexity can be quantified by the mana of a quantum channel. We further show that this algorithm can outperform another approach for simulating noisy quantum circuits, based on channel robustness. Finally, we explore the threshold of non-stabilizerness for basic quantum circuits under depolarizing noise.
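    For orientation, the channel-level mana studied here generalizes a state-level quantity. As background (stated under the usual conventions for odd dimension d, not as the paper's channel definition), the state-level mana is the logarithm of the ℓ1-norm of the discrete Wigner function:

```latex
% Background sketch: state-level mana (the paper's channel mana generalizes this).
% W_\rho(u) = \tfrac{1}{d}\,\mathrm{Tr}[A_u \rho] is the discrete Wigner function
% of a state \rho on a qudit of odd dimension d, with phase-point operators A_u.
\mathcal{M}(\rho) \;=\; \log \sum_{u \in \mathbb{Z}_d \times \mathbb{Z}_d} \bigl| W_\rho(u) \bigr|
```

    Since the Wigner function of a normalized state sums to one, M(ρ) = 0 exactly when W_ρ is non-negative, matching the simulability statement quoted above: states with non-negative Wigner function carry no mana.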

     
  3. Optical photons are powerful carriers of quantum information, which can be delivered in free space by satellites or in fibers on the ground over long distances. Entanglement of quantum states over long distances can empower quantum computing, quantum communications, and quantum sensing. Quantum optical memories are devices designed to store quantum information in the form of stationary excitations, such as atomic coherence, and are capable of coherently mapping these excitations to flying qubits. Quantum memories can effectively store and manipulate quantum states, making them indispensable elements in future long-distance quantum networks. Over the past two decades, quantum optical memories with high fidelities, high efficiencies, long storage times, and promising multiplexing capabilities have been developed, especially at the single-photon level. In this review, we introduce the working principles of commonly used quantum memory protocols and summarize the recent advances in quantum memory demonstrations. We also offer a vision for future quantum optical memory devices that may enable entanglement distribution over long distances.

     
  4. Abstract

    Visualizations have played a crucial role in helping quantum computing users explore quantum states in various quantum computing applications. Among them, the Bloch sphere is a widely used visualization for showing quantum states; it leverages angles to represent quantum amplitudes. However, it cannot support the visualization of quantum entanglement and superposition, two essential properties of quantum computing. To address this issue, we propose VENUS, a novel visualization for quantum state representation. By explicitly correlating 2D geometric shapes with the mathematical foundations of quantum computing, VENUS effectively represents the quantum amplitudes of both a single qubit and two entangled qubits. We also use multiple coordinated semicircles to naturally encode the probability distribution, making quantum superposition intuitive to analyze. We conducted two well-designed case studies and an in-depth expert interview to evaluate the usefulness and effectiveness of VENUS. The results show that VENUS can effectively facilitate the exploration of quantum states for a single qubit and for two qubits.
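    For context on the angle-based encoding this abstract refers to (the standard Bloch-sphere parameterization, not VENUS itself), a minimal sketch of mapping a single-qubit state to Bloch angles:

```python
import numpy as np

def bloch_angles(alpha: complex, beta: complex) -> tuple[float, float]:
    """Map a single-qubit state alpha|0> + beta|1> to Bloch angles (theta, phi),
    using |psi> = cos(theta/2)|0> + e^{i*phi} sin(theta/2)|1>."""
    norm = np.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    alpha, beta = alpha / norm, beta / norm          # normalize the amplitudes
    theta = 2 * np.arccos(np.clip(abs(alpha), 0.0, 1.0))
    phi = (np.angle(beta) - np.angle(alpha)) % (2 * np.pi)
    return theta, phi

# |+> = (|0> + |1>)/sqrt(2) sits on the equator: theta = pi/2, phi = 0.
print(bloch_angles(1 / np.sqrt(2), 1 / np.sqrt(2)))
```

    Because this parameterization describes one qubit in isolation, it has no way to express correlations between qubits, which is the entanglement-visualization gap VENUS is designed to fill.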

     
  5. We present an algorithmic framework for quantum-inspired classical algorithms on close-to-low-rank matrices, generalizing the series of results started by Tang’s breakthrough quantum-inspired algorithm for recommendation systems [STOC’19]. Motivated by quantum linear algebra algorithms and the quantum singular value transformation (SVT) framework of Gilyén et al. [STOC’19], we develop classical algorithms for SVT that run in time independent of input dimension, under suitable quantum-inspired sampling assumptions. Our results give compelling evidence that in the corresponding QRAM data structure input model, quantum SVT does not yield exponential quantum speedups. Since the quantum SVT framework generalizes essentially all known techniques for quantum linear algebra, our results, combined with sampling lemmas from previous work, suffice to generalize all prior results about dequantizing quantum machine learning algorithms. In particular, our classical SVT framework recovers and often improves the dequantization results on recommendation systems, principal component analysis, supervised clustering, support vector machines, low-rank regression, and semidefinite program solving. We also give additional dequantization results on low-rank Hamiltonian simulation and discriminant analysis. Our improvements come from identifying the key feature of the quantum-inspired input model that is at the core of all prior quantum-inspired results: ℓ2-norm sampling can approximate matrix products in time independent of their dimension. We reduce all our main results to this fact, making our exposition concise, self-contained, and intuitive.
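    The core primitive highlighted at the end, ℓ2-norm (length-squared) sampling for approximate matrix products, can be sketched in a few lines. The function below is an illustrative numpy rendition of that standard idea, not code from the paper:

```python
import numpy as np

def sampled_matmul(A: np.ndarray, B: np.ndarray, s: int, seed: int = 0) -> np.ndarray:
    """Approximate A @ B from s sampled column/row pairs, drawn with probability
    proportional to ||A[:, i]|| * ||B[i, :]||; the work scales with s rather than
    with the shared inner dimension."""
    rng = np.random.default_rng(seed)
    weights = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
    p = weights / weights.sum()
    idx = rng.choice(A.shape[1], size=s, p=p)
    # Unbiased estimator: average of rescaled rank-one terms A[:, i] B[i, :] / p_i.
    return sum(np.outer(A[:, i], B[i, :]) / (s * p[i]) for i in idx)

# The Frobenius error, measured relative to ||A||_F * ||B||_F, stays around
# 1/sqrt(s) in expectation, independent of the inner dimension (here 5000).
rng = np.random.default_rng(1)
A, B = rng.standard_normal((50, 5000)), rng.standard_normal((5000, 50))
err = np.linalg.norm(sampled_matmul(A, B, s=500) - A @ B, "fro")
print(err / (np.linalg.norm(A, "fro") * np.linalg.norm(B, "fro")))
```

    Because the sample size s, not the inner dimension, controls the cost and the error, this is the dimension-independent primitive the quantum-inspired algorithms repeatedly reduce to.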

     