Title: Precision Bounds on Continuous-Variable State Tomography Using Classical Shadows
Shadow tomography is a framework for constructing succinct descriptions of quantum states using randomized measurement bases, called "classical shadows," with powerful methods to bound the estimators used. We recast existing experimental protocols for continuous-variable quantum state tomography in the classical-shadow framework, obtaining rigorous bounds on the number of independent measurements needed for estimating density matrices from these protocols. We analyze the efficiency of homodyne, heterodyne, photon-number-resolving, and photon-parity protocols. To reach a desired precision on the classical shadow of an N-photon density matrix with high probability, we show that homodyne detection requires O(N^(4+1/3)) measurements in the worst case, whereas photon-number-resolving and photon-parity detection require O(N^4) measurements in the worst case (both up to logarithmic corrections). We benchmark these results against numerical simulation as well as experimental data from optical homodyne experiments. We find that numerical and experimental analyses of homodyne tomography match closely with our theoretical predictions. We extend our single-mode results to an efficient construction of multimode shadows based on local measurements.
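The photon-parity protocol above can be illustrated with a minimal simulation (a hedged sketch, not the paper's estimator; the choice of a coherent-state input, the mean photon number, and the batch counts are all assumptions for illustration). For a coherent state with mean photon number |alpha|^2, photon counts are Poisson-distributed and the exact parity expectation is exp(-2|alpha|^2), so we can check a median-of-means parity estimate, the standard aggregation used to obtain high-probability classical-shadow bounds, against the known value:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: photon-number-resolving measurements of a coherent
# state |alpha>, whose counts are Poisson(|alpha|^2) and whose exact
# parity expectation <(-1)^n> equals exp(-2*|alpha|^2).
alpha_sq = 1.0                      # mean photon number |alpha|^2
n_shots = 200_000                   # independent measurements
n_batches = 20                      # median-of-means batches

counts = rng.poisson(alpha_sq, size=n_shots)
parities = 1.0 - 2.0 * (counts % 2)         # (-1)^n as +/-1 values

# Median-of-means: average within batches, then take the median across
# batches; this suppresses outlier batches and underlies the
# high-probability guarantees quoted for classical shadows.
batch_means = parities.reshape(n_batches, -1).mean(axis=1)
estimate = np.median(batch_means)

exact = np.exp(-2.0 * alpha_sq)
print(estimate, exact)
```

With these shot counts the estimate lands within a few parts in a thousand of the exact parity, illustrating how a single fixed measurement setting yields a concentration-bounded estimator.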
Award ID(s):
2120757
PAR ID:
10505826
Publisher / Repository:
American Physical Society
Date Published:
Journal Name:
PRX Quantum
Volume:
5
Issue:
1
ISSN:
2691-3399
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Estimating expectation values is a key subroutine in quantum algorithms. Near-term implementations face two major challenges: a limited number of samples required to learn a large collection of observables, and the accumulation of errors in devices without quantum error correction. To address these challenges simultaneously, we develop a quantum error-mitigation strategy called symmetry-adjusted classical shadows, by adjusting classical-shadow tomography according to how symmetries are corrupted by device errors. As a concrete example, we highlight global U(1) symmetry, which manifests in fermions as particle number and in spins as total magnetization, and illustrate their group-theoretic unification with respective classical-shadow protocols. We establish rigorous sampling bounds under readout errors obeying minimal assumptions, and perform numerical experiments with a more comprehensive model of gate-level errors derived from existing quantum processors. Our results reveal symmetry-adjusted classical shadows as a low-cost strategy to mitigate errors from noisy quantum experiments in the ubiquitous presence of symmetry.
  2. The prototype quantum random number (random bit) generator (QRNG) consists of one photon at a time falling on a 50:50 beam splitter followed by random detection in one or the other output beams due to the irreducible probabilistic nature of quantum mechanics. Due to the difficulties in producing single photons on demand, in practice, pulses of weak coherent (laser) light are used. In this paper, we take a different approach, one that uses moderate coherent light. It is shown that a QRNG can be implemented by performing photon-number parity measurements. For moderate coherent light, the probabilities of obtaining even or odd parity in photon counts are 0.5 each. Photon counting with single-photon resolution can be performed through use of a cascade of beam splitters and single-photon detectors, as was done recently in a photon-number parity-based interferometry experiment involving coherent light. We highlight the point that unlike most quantum-based random number generators, our proposal does not require the use of classical de-biasing algorithms or post-processing of the generated bit sequence. 
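The parity-based QRNG idea in the abstract above can be sketched numerically (assumed parameters, not the experimental setup): photon counts from moderate coherent light are Poisson-distributed with mean |alpha|^2, and each bit is the count parity. The even-parity probability is (1 + exp(-2|alpha|^2))/2, which is exponentially close to 1/2 once |alpha|^2 is moderately large, which is why no de-biasing is needed:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed illustration: Poisson photon statistics of moderate coherent
# light; the random bit is the parity of the photon count.
mean_photons = 10.0                      # "moderate" coherent intensity
counts = rng.poisson(mean_photons, size=1_000_000)
bits = counts % 2                        # 0 = even parity, 1 = odd parity

# Exact bias: P(even) = (1 + exp(-2*|alpha|^2)) / 2, here ~0.5 + 1e-9.
p_even_exact = 0.5 * (1.0 + np.exp(-2.0 * mean_photons))
print(bits.mean(), p_even_exact)
```

At a mean photon number of 10 the intrinsic bias is of order exp(-20), far below any statistical resolution, so the raw parity bits are already balanced to within sampling noise.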
  3. Classical shadows (CS) offer a resource-efficient means to estimate quantum observables, circumventing the need for exhaustive state tomography. Here, we clarify and explore the connection between CS techniques and the least squares (LS) and regularized least squares (RLS) methods commonly used in machine learning and data analysis. By formally identifying LS and RLS "shadows" completely analogous to those in CS, namely point estimators calculated from the empirical frequencies of single measurements, we show that both RLS and CS can be viewed as regularizers for the underdetermined regime, replacing the pseudoinverse with invertible alternatives. Through numerical simulations, we evaluate RLS and CS from three distinct angles: the tradeoff between bias and variance, mismatch between the expected and actual measurement distributions, and the interplay between the number of measurements and the number of shots per measurement. Compared to CS, RLS attains lower variance at the expense of bias, is robust to distribution mismatch, and is more sensitive to the number of shots for a fixed number of state copies; these differences can be understood from the distinct approaches taken to regularization. Conceptually, our integration of LS, RLS, and CS under a unifying "shadow" umbrella advances the overall picture of CS techniques, while practically our results highlight the tradeoffs intrinsic to these measurement approaches, illuminating the circumstances under which either RLS or CS is preferred, such as unverified randomness for the former or unbiased estimation for the latter.
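The pseudoinverse-versus-regularization contrast drawn above can be seen in a toy underdetermined linear inverse problem (generic linear algebra, not the paper's measurement model; the problem sizes, noise level, and ridge strength are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy underdetermined problem: recover x from noisy b = A x + noise
# with fewer equations (measurements) than unknowns (parameters).
n_meas, n_params = 20, 50
A = rng.normal(size=(n_meas, n_params))
x_true = np.zeros(n_params)
x_true[:5] = 1.0                          # simple sparse ground truth
b = A @ x_true + 0.1 * rng.normal(size=n_meas)

# LS "shadow": minimum-norm least-squares solution via the pseudoinverse.
x_ls = np.linalg.pinv(A) @ b

# RLS "shadow": ridge regularization replaces the pseudoinverse with an
# invertible alternative, trading some bias for reduced variance.
lam = 1.0
x_rls = np.linalg.solve(A.T @ A + lam * np.eye(n_params), A.T @ b)

print(np.linalg.norm(x_ls - x_true), np.linalg.norm(x_rls - x_true))
```

The pseudoinverse solution fits the noisy data exactly (zero residual, high variance across noise realizations), while the ridge solution shrinks toward zero (biased, lower variance), mirroring the bias-variance tradeoff the abstract describes.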
  4. The authors provide the first tight sample complexity bounds for shadow tomography and classical shadows in the regime where the target error is below some sufficiently small inverse polynomial in the dimension of the Hilbert space. Specifically, they present a protocol that, given any m ∈ N and ε ≤ O(d^(-1/2)), measures O(log(m)/ε^2) copies of an unknown mixed state ρ ∈ C^(d×d) and outputs a classical description of ρ. This description can then be used to estimate any collection of m observables to within additive accuracy ε. Previously, even for the simpler case of shadow tomography, where the observables are known in advance, the best known rates either scaled benignly but suboptimally in all of m, d, ε, or scaled optimally in ε and m but included additional polynomial factors in d. Interestingly, the authors also show via dimensionality reduction that one can rescale ε and d to reduce to the regime where ε ≤ O(d^(-1/2)). Their algorithm draws on representation-theoretic tools developed in the context of full state tomography.
  5. Homodyne detection is a common self-referenced technique to extract optical quadratures. Due to ubiquitous fluctuations, experiments measuring optical quadratures require homodyne angle control. Current homodyne angle locking techniques provide high-quality error signals only in a span significantly smaller than π radians, the span required for full state tomography, leading to inevitable discontinuities during full tomography. Here, we present and demonstrate a locking technique using a universally tunable modulator, which produces high-quality error signals at an arbitrary homodyne angle. Our work enables continuous full-state tomography and paves the way to backaction-evasion protocols based on a time-varying homodyne angle.