Award ID(s): 1818914
NSF-PAR ID: 10339345
Date Published:
Journal Name: ArXiv.org
ISSN: 2331-8422
Format(s): Medium: X
Sponsoring Org: National Science Foundation

More Like this

Abstract: We prove that $\mathrm{poly}(t)\cdot n^{1/D}$-depth local random quantum circuits with two-qudit nearest-neighbor gates on a $D$-dimensional lattice with $n$ qudits are approximate $t$-designs in various measures. These include the “monomial” measure, meaning that the monomials of a random circuit from this family have expectation close to the value that would result from the Haar measure. Previously, the best bound was $\mathrm{poly}(t)\cdot n$, due to Brandão–Harrow–Horodecki (Commun Math Phys 346(2):397–434, 2016) for $D=1$. We also improve the “scrambling” and “decoupling” bounds for spatially local random circuits due to Brown and Fawzi (Scrambling speed of random quantum circuits, 2012). One consequence of our result is that assuming the polynomial hierarchy ($\mathsf{PH}$) is infinite and that certain counting problems are $\#\mathsf{P}$-hard “on average”, sampling within total variation distance from these circuits is hard for classical computers. Previously, exact sampling from the outputs of even constant-depth quantum circuits was known to be hard for classical computers under these assumptions. However, the standard strategy for extending this hardness result to approximate sampling requires the quantum circuits to have a property called “anti-concentration”, meaning roughly that the output has near-maximal entropy. Unitary 2-designs have the desired anti-concentration property. Our result improves the required depth for this level of anti-concentration from linear depth to a sub-linear value, depending on the geometry of the interactions.
This is relevant to a recent experiment by the Google Quantum AI group to perform such a sampling task with 53 qubits on a two-dimensional lattice (Arute et al. in Nature 574(7779):505–510, 2019; Boixo et al. in Nat Phys 14(6):595–600, 2018) (and related experiments by USTC), and confirms their conjecture that $O(\sqrt{n})$ depth suffices for anti-concentration. The proof is based on a previous construction of $t$-designs by Brandão et al. (2016), an analysis of how approximate designs behave under composition, and an extension of the quasi-orthogonality of permutation operators developed by Brandão et al. (2016). Different versions of the approximate design condition correspond to different norms, and part of our contribution is to introduce the norm corresponding to anti-concentration and to establish equivalence between these various norms for low-depth circuits. For random circuits with long-range gates, we use different methods to show that anti-concentration happens at circuit size $O(n \ln^2 n)$, corresponding to depth $O(\ln^3 n)$. We also show a lower bound of $\Omega(n \ln n)$ for the size of such circuits in this case. We also prove that anti-concentration is possible in depth $O(\ln n \ln \ln n)$ (size $O(n \ln n \ln \ln n)$) using a different model.
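To make the anti-concentration property above concrete, here is a small self-contained NumPy sketch (an illustrative toy, not the paper's construction): it builds brickwork circuits of Haar-random two-qubit gates on a line of qubits and estimates the collision probability $Z = 2^n \sum_x p(x)^2$. For an anti-concentrated (Porter-Thomas-like) output, $Z$ approaches 2, while a depth-1 circuit stays well above it.

```python
import numpy as np

def haar_unitary(dim, rng):
    # Haar-random unitary via QR decomposition of a complex Gaussian matrix
    z = rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))

def apply_gate(state, gate, i, n):
    # apply a two-qubit gate on neighbouring qubits (i, i+1) of an n-qubit state
    psi = state.reshape([2] * n)
    psi = np.moveaxis(psi, [i, i + 1], [0, 1])
    shp = psi.shape
    psi = (gate @ psi.reshape(4, -1)).reshape((2, 2) + shp[2:])
    return np.moveaxis(psi, [0, 1], [i, i + 1]).reshape(-1)

def collision_z(n, depth, rng):
    # Z = 2^n * sum_x p(x)^2; a 2-design / Porter-Thomas output gives Z ~ 2
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    for layer in range(depth):
        for i in range(layer % 2, n - 1, 2):   # brickwork layout
            state = apply_gate(state, haar_unitary(4, rng), i, n)
    p = np.abs(state) ** 2
    return float(2 ** n * np.sum(p ** 2))

rng = np.random.default_rng(0)
n = 6
z_shallow = np.mean([collision_z(n, 1, rng) for _ in range(20)])
z_deep = np.mean([collision_z(n, 20, rng) for _ in range(20)])
print(f"Z at depth 1:  {z_shallow:.2f}")   # far above 2: not anti-concentrated
print(f"Z at depth 20: {z_deep:.2f}")      # close to 2: anti-concentrated
```

At depth 1 the output is a product of three independent two-qubit states, so the collision probability is multiplicatively larger than the Haar value; once the depth is a few multiples of the system size, $Z$ settles near 2, the 2-design value.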
We prove concentration bounds for the following classes of quantum states: (i) output states of shallow quantum circuits, answering an open question from \cite{DMRF22}; (ii) injective matrix product states; (iii) output states of dense Hamiltonian evolution, i.e. states of the form $e^{\iota H^{(p)}} \cdots e^{\iota H^{(1)}} |\psi_0\rangle$ for any $n$-qubit product state $|\psi_0\rangle$, where each $H^{(i)}$ can be any local commuting Hamiltonian satisfying a norm constraint, including dense Hamiltonians with interactions between any qubits. Our proofs use polynomial approximations to show that these states are close to local operators. This implies that the distribution of the Hamming weight of a computational basis measurement (and of other related observables) concentrates. An example of (iii) is the family of states produced by the quantum approximate optimisation algorithm (QAOA). Using our concentration results for these states, we show that for a random spin model, the QAOA can only succeed with negligible probability even at super-constant level $p = o(\log \log n)$, assuming a strengthened version of the so-called overlap gap property. This gives the first limitations on the QAOA on dense instances at super-constant level, improving upon the recent result [BGMZ22].
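A minimal illustration of the Hamming-weight concentration phenomenon, using only elementary probability (this is a toy for intuition, not the paper's proof technique): for a product state, the Hamming weight of a computational-basis measurement is a sum of independent Bernoulli variables, so its standard deviation is at most $\sqrt{n}/2$, an $O(\sqrt{n})$ window inside the full range $[0, n]$. Shallow circuits behave similarly because each output bit depends only on a constant-size lightcone.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
# random product state: qubit i is cos(theta_i)|0> + sin(theta_i)|1>
theta = rng.uniform(0.0, np.pi, size=n)
p = np.sin(theta) ** 2                  # Pr[bit i measures 1]

mean_w = p.sum()                        # E[Hamming weight W]
std_w = np.sqrt((p * (1.0 - p)).sum())  # independent bits: variances add

print(f"E[W] = {mean_w:.1f}, std(W) = {std_w:.2f}, bound sqrt(n)/2 = {np.sqrt(n)/2:.2f}")
# std(W) <= sqrt(n)/2 always holds here, so W concentrates in an O(sqrt(n))
# window -- vanishingly narrow relative to the range [0, n] as n grows.
```

The abstract's point is that the same qualitative behaviour survives far beyond product states: shallow circuits, injective MPS, and even dense commuting Hamiltonian evolutions.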

Motivated by the increasing need to understand the distributed algorithmic foundations of large-scale graph computations, we study some fundamental graph problems in a message-passing model for distributed computing where k ≥ 2 machines jointly perform computations on graphs with n nodes (typically, n >> k). The input graph is assumed to be initially randomly partitioned among the k machines, a common implementation in many real-world systems. Communication is point-to-point, and the goal is to minimize the number of communication rounds of the computation. Our main contribution is the General Lower Bound Theorem, a theorem that can be used to show non-trivial lower bounds on the round complexity of distributed large-scale data computations. This result is established via an information-theoretic approach that relates the round complexity to the minimal amount of information required by machines to solve the problem. Our approach is generic, and this theorem can be used in a “cookbook” fashion to show distributed lower bounds for several problems, including non-graph problems. We present two applications by showing (almost) tight lower bounds on the round complexity of two fundamental graph problems, namely, PageRank computation and triangle enumeration. These applications show that our approach can yield lower bounds for problems where the application of communication complexity techniques seems not obvious or gives weak bounds, including and especially under a stochastic partition of the input. We then present distributed algorithms for PageRank and triangle enumeration with a round complexity that (almost) matches the respective lower bounds; these algorithms exhibit a round complexity that scales superlinearly in k, improving significantly over previous results [Klauck et al., SODA 2015].
Specifically, we show the following results: PageRank: We show a lower bound of Ω̃(n/k^2) rounds and present a distributed algorithm that computes an approximation of the PageRank of all the nodes of a graph in Õ(n/k^2) rounds. Triangle enumeration: We show that there exist graphs with m edges where any distributed algorithm requires Ω̃(m/k^{5/3}) rounds. This result also implies the first non-trivial lower bound of Ω̃(n^{1/3}) rounds for the congested clique model, which is tight up to logarithmic factors. We then present a distributed algorithm that enumerates all the triangles of a graph in Õ(m/k^{5/3} + n/k^{4/3}) rounds.
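Round-efficient distributed PageRank algorithms of this kind are typically built on random-walk simulation, which parallelizes naturally across machines. The following single-machine sketch (a hypothetical toy, not the paper's algorithm) shows the core estimator: PageRank with reset probability ε equals the normalized visit distribution of walks that restart with probability ε at every step.

```python
import numpy as np

def pagerank_power(adj, eps=0.15, iters=100):
    # reference answer: PageRank by power iteration on a small digraph,
    # given as an adjacency list; assumes every node has an out-neighbour
    n = len(adj)
    pi = np.full(n, 1.0 / n)
    for _ in range(iters):
        new = np.full(n, eps / n)
        for u in range(n):
            for v in adj[u]:
                new[v] += (1.0 - eps) * pi[u] / len(adj[u])
        pi = new
    return pi

def pagerank_walks(adj, eps=0.15, walks_per_node=4000, rng=None):
    # Monte Carlo estimator: from each node, run walks that stop with
    # probability eps per step; visit frequencies converge to PageRank.
    # In a k-machine model, each machine runs the walks for its own nodes.
    rng = rng if rng is not None else np.random.default_rng(0)
    visits = np.zeros(len(adj))
    for s in range(len(adj)):
        for _ in range(walks_per_node):
            u = s
            while True:
                visits[u] += 1
                if rng.random() < eps or not adj[u]:
                    break
                u = adj[u][rng.integers(len(adj[u]))]
    return visits / visits.sum()

adj = [[1, 2], [2], [0, 3], [4], [5], [0]]   # small strongly connected digraph
exact = pagerank_power(adj)
approx = pagerank_walks(adj, rng=np.random.default_rng(42))
print("L1 error of walk-based estimate:", float(np.abs(exact - approx).sum()))
```

The distributed benefit comes from the walks being independent: the work splits evenly across machines, and only walk endpoints crossing machine boundaries cost communication rounds.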

Abstract: The Quantum Approximate Optimization Algorithm (QAOA) aims to search for approximate solutions to discrete optimization problems with near-term quantum computers. As no algorithmic guarantee is possible for QAOA to outperform classical computers without a proof that bounded-error quantum polynomial time (BQP) ≠ nondeterministic polynomial time (NP), it is necessary to investigate the empirical advantages of QAOA. We identify a computational phase transition of QAOA when solving hard problems such as SAT: random instances are most difficult to train at a critical problem density. We connect the transition to the controllability and the complexity of QAOA circuits. Moreover, we find that the critical problem density in general deviates from the SAT-UNSAT phase transition, where the hardest instances for classical algorithms lie. Then, we show that the high-problem-density region, which limits QAOA's performance in hard optimization problems (reachability deficits), is actually a good place to utilize QAOA: its approximation ratio decays much more slowly with the problem density than that of classical approximate algorithms. Indeed, it is exactly in this region that quantum advantages of QAOA over classical approximate algorithms can be identified.
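To ground the discussion, here is a self-contained exact-statevector sketch of QAOA at level p = 1 on MaxCut for a 4-cycle (a toy instance chosen for illustration, not the SAT ensembles studied above). A level-1 circuit applies a diagonal cost layer e^{-iγC} followed by a mixer layer of single-qubit Rx rotations; a grid search over (γ, β) then reveals an approximation ratio well above the 1/2 of uniform random guessing, consistent with the known 3/4 value for 2-regular triangle-free graphs.

```python
import numpy as np

def qaoa_p1_maxcut(edges, n, gammas, betas):
    # exact simulation of level-1 QAOA for MaxCut on n qubits (toy sketch)
    dim = 2 ** n
    bits = (np.arange(dim)[:, None] >> np.arange(n)) & 1
    cut = np.zeros(dim)
    for u, v in edges:                 # cut value of every basis string
        cut += bits[:, u] != bits[:, v]
    best_cut = cut.max()

    x = np.array([[0.0, 1.0], [1.0, 0.0]])
    best_ratio = 0.0
    for g in gammas:
        for b in betas:
            state = np.full(dim, 1.0 / np.sqrt(dim), dtype=complex)  # |+>^n
            state *= np.exp(-1j * g * cut)                # cost layer (diagonal)
            rx = np.cos(b) * np.eye(2) - 1j * np.sin(b) * x   # e^{-i beta X}
            mixer = np.array([[1.0]], dtype=complex)
            for _ in range(n):                            # Rx on every qubit
                mixer = np.kron(mixer, rx)
            state = mixer @ state
            exp_cut = float((np.abs(state) ** 2 * cut).sum())
            best_ratio = max(best_ratio, exp_cut / best_cut)
    return best_ratio

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # 4-cycle; its maximum cut is 4
ratio = qaoa_p1_maxcut(edges, 4,
                       np.linspace(0, np.pi, 25),
                       np.linspace(0, np.pi / 2, 25))
print(f"best p=1 approximation ratio on C4: {ratio:.3f}")
```

At γ = β = 0 the circuit outputs the uniform distribution (ratio 1/2); the optimized angles do strictly better, which is the kind of empirical signal the abstract proposes to track as the problem density varies.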

We study the classic set cover problem from the perspective of sublinear algorithms. Given access to a collection of m sets over n elements in the query model, we show that sublinear algorithms derived from existing techniques have almost tight query complexities. On the one hand, we first show an adaptation of the streaming algorithm presented in [17] to the sublinear query model, which returns an α-approximate cover using Õ(m(n/k)^{1/(α−1)} + nk) queries to the input, where k denotes the size of a minimum set cover. We then complement this upper bound by proving that for lower values of k, essentially the same number of queries is required, even for estimating the optimal cover size. Moreover, we prove that even checking whether a given collection of sets covers all the elements would require Ω(nk) queries. These two lower bounds provide strong evidence that the upper bound is almost tight for certain values of the parameter k. On the other hand, we show that this bound is not optimal for larger values of the parameter k, as there exists a (1+ε)-approximation algorithm with Õ(mn/kε^2) queries. We show that this bound is essentially tight for sufficiently small constant ε, by establishing a corresponding lower bound on the query complexity. Our lower-bound results follow by carefully designing two distributions of instances that are hard to distinguish. In particular, our first lower bound involves a probabilistic construction of a certain set system with a minimum set cover of size αk, with the key property that a small number of “almost uniformly distributed” modifications can reduce the minimum set cover size down to k. Thus, these modifications are not detectable unless a large number of queries are asked. We believe that our probabilistic construction technique might find applications to lower bounds for other combinatorial optimization problems.
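For contrast with the sublinear-query regime above, here is the classical greedy baseline (a generic textbook sketch, not the algorithm from [17]): it achieves a ln(n)+1 approximation, but under a simple query-accounting model it may probe essentially the entire Θ(mn)-size input, which is precisely the cost that sublinear set cover algorithms aim to avoid.

```python
def greedy_set_cover(sets, universe):
    # classical greedy set cover: repeatedly pick the set covering the most
    # uncovered elements; ln(n)+1 approximation. The `queries` counter is a
    # crude accounting model (one query per element probed) showing that
    # greedy may read the whole input, unlike sublinear-query algorithms.
    uncovered = set(universe)
    cover, queries = [], 0
    while uncovered:
        best_i, best_gain = None, 0
        for i, s in enumerate(sets):
            queries += len(s)                 # probe every element of s
            gain = len(s & uncovered)
            if gain > best_gain:
                best_i, best_gain = i, gain
        if best_gain == 0:
            raise ValueError("some elements cannot be covered")
        cover.append(best_i)
        uncovered -= sets[best_i]
    return cover, queries

sets = [{0, 1, 2}, {2, 3}, {3, 4, 5}, {0, 4}, {5}]
cover, queries = greedy_set_cover(sets, range(6))
print("cover:", cover, "queries:", queries)
```

Every greedy iteration rescans all m sets, so the query count grows with the cover size k, whereas the sublinear algorithms discussed above trade a worse approximation factor for far fewer probes.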