Title: Two entanglement conditions and their connection to negativity
We examine two conditions that can be used to detect bipartite entanglement and show that they yield lower bounds on the negativity of states. We begin with two-qubit states and then show how the approach extends to more general states. The resulting bounds are then studied by means of a number of examples. We also show that better bounds can be found if one has some knowledge of the Schmidt vectors of a state.
Award ID(s):
2106449
PAR ID:
10523369
Author(s) / Creator(s):
Publisher / Repository:
American Physical Society
Date Published:
Journal Name:
Physical Review A
ISSN:
1094-1622
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
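
As an illustrative aside to the abstract above: the negativity that the paper's conditions lower-bound has a standard computable definition via the partial transpose. The minimal NumPy sketch below computes it for a two-qubit state; the Werner-like test state and the mixing parameter p are illustrative choices, not taken from the paper.

    # A minimal, illustrative sketch (not the paper's bounds): computing the
    # negativity of a two-qubit density matrix via the partial transpose.
    import numpy as np

    def partial_transpose(rho, dims=(2, 2)):
        """Partial transpose over the second subsystem of a bipartite state."""
        dA, dB = dims
        r = rho.reshape(dA, dB, dA, dB)               # indices: (a, b, a', b')
        return r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)  # swap b <-> b'

    def negativity(rho, dims=(2, 2)):
        """N(rho) = sum of |negative eigenvalues| of the partially transposed state."""
        evals = np.linalg.eigvalsh(partial_transpose(rho, dims))
        return float(-evals[evals < 0].sum())

    # Example: a Werner-like mixture of the Bell state |Phi+> with white noise.
    phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
    bell = np.outer(phi_plus, phi_plus)
    p = 0.8                                           # mixing parameter (illustrative)
    rho = p * bell + (1 - p) * np.eye(4) / 4
    print(f"negativity = {negativity(rho):.4f}")      # > 0 signals entanglement

For two qubits, any strictly positive negativity certifies entanglement, since the PPT criterion is necessary and sufficient in that dimension.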
More Like this
  1. Motivated by the increasing need to understand the distributed algorithmic foundations of large-scale graph computations, we study some fundamental graph problems in a message-passing model for distributed computing where k ≥ 2 machines jointly perform computations on graphs with n nodes (typically, n >> k). The input graph is assumed to be initially randomly partitioned among the k machines, a common implementation in many real-world systems. Communication is point-to-point, and the goal is to minimize the number of communication rounds of the computation. Our main contribution is the General Lower Bound Theorem, a theorem that can be used to show non-trivial lower bounds on the round complexity of distributed large-scale data computations. This result is established via an information-theoretic approach that relates the round complexity to the minimal amount of information required by machines to solve the problem. Our approach is generic, and this theorem can be used in a “cookbook” fashion to show distributed lower bounds for several problems, including non-graph problems. We present two applications by showing (almost) tight lower bounds on the round complexity of two fundamental graph problems, namely, PageRank computation and triangle enumeration. These applications show that our approach can yield lower bounds for problems where the application of communication complexity techniques seems not obvious or gives weak bounds, including and especially under a stochastic partition of the input. We then present distributed algorithms for PageRank and triangle enumeration with a round complexity that (almost) matches the respective lower bounds; these algorithms exhibit a round complexity that scales superlinearly in k, improving significantly over previous results [Klauck et al., SODA 2015]. Specifically, we show the following results: PageRank: We show a lower bound of Ω̃(n/k^2) rounds and present a distributed algorithm that computes an approximation of the PageRank of all the nodes of a graph in Õ(n/k^2) rounds. Triangle enumeration: We show that there exist graphs with m edges where any distributed algorithm requires Ω̃(m/k^{5/3}) rounds. This result also implies the first non-trivial lower bound of Ω̃(n^{1/3}) rounds for the congested clique model, which is tight up to logarithmic factors. We then present a distributed algorithm that enumerates all the triangles of a graph in Õ(m/k^{5/3} + n/k^{4/3}) rounds.
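    As a companion to entry 1, the sketch below illustrates only the input setup of the k-machine model described there: vertices are assigned to machines uniformly at random, and edges crossing machine boundaries are counted, since those drive point-to-point communication. The cycle graph and the values of n and k are illustrative assumptions, not the paper's experiments or its lower-bound argument.

        # A minimal sketch of the k-machine model's random vertex partition.
        import random

        def random_vertex_partition(n, k, edges, seed=0):
            rng = random.Random(seed)
            home = {v: rng.randrange(k) for v in range(n)}   # machine hosting each vertex
            cross = sum(1 for u, v in edges if home[u] != home[v])
            return home, cross

        # Example: a cycle on n vertices split across k machines.
        n, k = 1000, 8
        edges = [(i, (i + 1) % n) for i in range(n)]
        home, cross = random_vertex_partition(n, k, edges)
        # In expectation a (1 - 1/k) fraction of edges cross machine boundaries.
        print(f"{cross} of {len(edges)} edges cross machines "
              f"(expected ~{len(edges) * (1 - 1/k):.0f})")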
  2. Practical error analysis is essential for the design, optimization, and evaluation of Noisy Intermediate-Scale Quantum (NISQ) computing. However, bounding errors in quantum programs is a grand challenge, because the effects of quantum errors depend on exponentially large quantum states. In this work, we present Gleipnir, a novel methodology toward practically computing verified error bounds in quantum programs. Gleipnir introduces the (ρ̂, δ)-diamond norm, an error metric constrained by a quantum predicate consisting of the approximate state ρ̂ and its distance δ to the ideal state ρ. This predicate (ρ̂, δ) can be computed adaptively using tensor networks based on matrix product states. Gleipnir features a lightweight logic for reasoning about error bounds in noisy quantum programs, based on the (ρ̂, δ)-diamond norm metric. Our experimental results show that Gleipnir is able to efficiently generate tight error bounds for real-world quantum programs with 10 to 100 qubits, and can be used to evaluate the error mitigation performance of quantum compiler transformations.
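    As a companion to entry 2: Gleipnir's (ρ̂, δ)-diamond norm is beyond a short sketch, so the snippet below illustrates a simpler, standard stand-in, the trace distance between an ideal program state and its noisy counterpart, which is the kind of state-level error δ the predicate tracks. The Hadamard circuit, the depolarizing channel, and the error rate p are illustrative assumptions, not Gleipnir's method.

        # A minimal NumPy sketch: quantify how far a noisy program state drifts
        # from the ideal one using trace distance (a stand-in error metric).
        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)        # Hadamard gate

        def depolarize(rho, p):
            """Single-qubit depolarizing channel with error probability p."""
            return (1 - p) * rho + p * np.eye(2) / 2

        def trace_distance(rho, sigma):
            """T(rho, sigma) = 0.5 * sum of |eigenvalues| of (rho - sigma)."""
            return 0.5 * np.linalg.norm(np.linalg.eigvalsh(rho - sigma), ord=1)

        rho0 = np.array([[1, 0], [0, 0]], dtype=complex)    # |0><0|
        ideal = H @ rho0 @ H.conj().T                       # ideal program output
        noisy = depolarize(ideal, p=0.05)                   # output under noise
        print(f"trace distance = {trace_distance(ideal, noisy):.4f}")  # the 'delta' being bounded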
  3. A double occurrence word (DOW) is a word in which every symbol appears exactly twice. We define the symbol separation of a DOW w to be the number of letters between the two copies of a symbol, and the separation of w to be the sum of the separations over all symbols in w. We then analyze the relationship among the size, reducibility, and separation of DOWs. Specifically, we provide tight bounds on the separations of DOWs of a given size and characterize the words that attain those bounds. We show that all separation numbers within the bounds can be realized. We present recursive formulas for counting the numbers of DOWs with a given separation under various restrictions, such as the number of irreducible factors. These formulas can be obtained by inductive construction of all DOWs with the given separation.
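    As a companion to entry 3, the sketch below computes symbol separations and the total separation of a DOW directly from the definitions quoted above; the example words are illustrative.

        # A minimal sketch: separation of a double occurrence word (DOW).
        def separations(word):
            """Map each symbol of a DOW to the number of letters between its two copies."""
            first = {}
            sep = {}
            for i, ch in enumerate(word):
                if ch in first:
                    sep[ch] = i - first[ch] - 1   # letters strictly between the two copies
                else:
                    first[ch] = i
            assert len(sep) * 2 == len(word), "not a double occurrence word"
            return sep

        def separation(word):
            """Total separation: sum of symbol separations over all symbols."""
            return sum(separations(word).values())

        print(separations("abab"), separation("abab"))   # {'a': 1, 'b': 1} 2
        print(separations("abba"), separation("abba"))   # {'a': 2, 'b': 0} 2
        print(separations("aabb"), separation("aabb"))   # {'a': 0, 'b': 0} 0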
  4. We present progress on three old conjectures about longest paths and cycles in graphs. The first pair of conjectures, due to Lovász from 1969 and Thomassen from 1978, respectively, states that all connected vertex-transitive graphs contain a Hamiltonian path, and that all sufficiently large such graphs even contain a Hamiltonian cycle. The third conjecture, due to Smith from 1984, states that for k ≥ 2, in every k-connected graph any two longest cycles intersect in at least k vertices. In this paper, we prove a new lemma about the intersection of longest cycles in a graph, which can be used to improve the best known bounds toward all the aforementioned conjectures: First, we show that every connected vertex-transitive graph on n vertices contains a cycle (and hence a path) whose length improves on the bound of DeVos [arXiv:2302.04255, 2023]. Second, we show that in every k-connected graph, any two longest cycles meet in a number of vertices that improves on the bound of Chen, Faudree, and Gould [J. Combin. Theory Ser. B 72 (1998), no. 1, 143–149]. Our proof combines combinatorial arguments, computer search, and linear programming.
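    As a companion to entry 4, the brute-force sketch below checks the statement of Smith's conjecture on one small instance, the Petersen graph: it enumerates all cycles, keeps the longest ones, and reports the minimum number of vertices shared by any two of them. This is only a toy verification on a single 3-connected graph; it is not the paper's method, which combines combinatorial arguments, computer search, and linear programming.

        # A minimal brute-force check of Smith's conjecture on the Petersen graph.
        from itertools import combinations

        def petersen():
            """Adjacency sets of the Petersen graph (10 vertices, 3-connected, non-Hamiltonian)."""
            edges = [(i, (i + 1) % 5) for i in range(5)]           # outer 5-cycle
            edges += [(5 + i, 5 + (i + 2) % 5) for i in range(5)]  # inner pentagram
            edges += [(i, i + 5) for i in range(5)]                # spokes
            adj = {v: set() for v in range(10)}
            for u, v in edges:
                adj[u].add(v)
                adj[v].add(u)
            return adj

        def all_cycles(adj):
            """Enumerate every cycle as a frozenset of edges (brute force, small graphs only)."""
            cycles = set()

            def extend(start, path, visited):
                u = path[-1]
                for w in adj[u]:
                    if w == start and len(path) >= 3:
                        cycles.add(frozenset(frozenset(e) for e in zip(path, path[1:] + [start])))
                    elif w > start and w not in visited:
                        extend(start, path + [w], visited | {w})

            for s in adj:                     # each cycle is found from its minimum vertex
                extend(s, [s], {s})
            return cycles

        adj = petersen()
        cycles = all_cycles(adj)
        longest_len = max(len(c) for c in cycles)
        longest = [set(v for e in c for v in e) for c in cycles if len(c) == longest_len]
        meet = (min(len(a & b) for a, b in combinations(longest, 2))
                if len(longest) > 1 else longest_len)
        print(f"{len(longest)} longest cycles of length {longest_len}")
        print(f"any two of them share at least {meet} vertices (the graph is 3-connected)")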