The difficulty of simulating quantum dynamics depends on the norm of the Hamiltonian. When the Hamiltonian varies with time, the simulation complexity should only depend on this quantity instantaneously. We develop quantum simulation algorithms that exploit this intuition. For sparse Hamiltonian simulation, the gate complexity scales with the $L^1$ norm $\int_0^t \|H(\tau)\|_{\max}\,d\tau$, whereas the best previous results scale with $t \max_{\tau\in[0,t]} \|H(\tau)\|_{\max}$. We also show analogous results for Hamiltonians that are linear combinations of unitaries. Our approaches thus provide an improvement over previous simulation algorithms that can be substantial when the Hamiltonian varies significantly. We introduce two new techniques: a classical sampler of time-dependent Hamiltonians and a rescaling principle for the Schrödinger equation. The rescaled Dyson-series algorithm is nearly optimal with respect to all parameters of interest, whereas the sampling-based approach is easier to realize for near-term simulation. These algorithms could potentially be applied to semi-classical simulations of scattering processes in quantum chemistry.
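To make the scaling comparison concrete, here is a minimal numerical sketch (an illustration only, not the paper's algorithm) that evaluates the $L^1$-norm cost $\int_0^t \|H(\tau)\|_{\max}\,d\tau$ against the worst-case bound $t \max_\tau \|H(\tau)\|_{\max}$ for a toy pulsed two-qubit Hamiltonian; the pulse shape and coupling strengths are assumptions made up for illustration.

```python
# Toy comparison of the L1-norm cost integral against the worst-case bound
# t * max_tau ||H(tau)||_max for a Hamiltonian dominated by a short pulse.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def H(tau, t_final=1.0):
    """Illustrative two-qubit Hamiltonian with a narrow Gaussian drive."""
    envelope = np.exp(-((tau - 0.5 * t_final) / (0.05 * t_final)) ** 2)
    return envelope * np.kron(X, X) + 0.1 * np.kron(Z, Z)

def max_norm(M):
    """Max-norm ||H||_max: largest matrix entry in absolute value."""
    return np.max(np.abs(M))

t_final = 1.0
taus = np.linspace(0.0, t_final, 2001)
norms = np.array([max_norm(H(tau, t_final)) for tau in taus])

l1_cost = np.trapz(norms, taus)            # scaling of the new algorithms
worst_case_cost = t_final * norms.max()    # scaling of earlier algorithms

print(f"L1-norm integral      : {l1_cost:.4f}")
print(f"t * max_tau ||H||_max : {worst_case_cost:.4f}")
print(f"improvement factor    : {worst_case_cost / l1_cost:.1f}x")
```

For a strongly peaked pulse the integral is much smaller than the worst-case bound, which is exactly the regime where the $L^1$-norm scaling pays off.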
A Comparison of Three Ways to Measure Time-Dependent Densities With Quantum Simulators
Quantum algorithms are touted as a way around some classically intractable problems such as the simulation of quantum mechanics. At the end of every quantum algorithm is a quantum measurement whereby classical data is extracted and utilized. In fact, many of the modern hybrid quantum-classical approaches are essentially quantum measurements of states with short quantum circuit descriptions. Here, we compare and examine three methods of extracting the time-dependent one-particle probability density from a quantum simulation: direct Z-measurement, Bayesian phase estimation, and harmonic inversion. We have tested these methods in the context of the potential inversion problem of time-dependent density functional theory. Our test results suggest that direct measurement is the preferable method. We also highlight areas where the other two methods may be useful and report on tests using Rigetti's quantum virtual machine. This study provides a starting point for imminent applications of quantum computing.
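As a rough illustration of the first of these methods, the sketch below (an assumed setup, not the paper's implementation) estimates a one-particle site density from repeated computational-basis (Z) measurements of a small three-site state; on hardware the bitstrings would come from device shots rather than from sampling a stored statevector.

```python
# Estimate the one-particle site density n_j by sampling Z-basis bitstrings
# from |psi|^2 and averaging the occupation of each site.
import numpy as np

rng = np.random.default_rng(seed=7)
n_sites = 3
dim = 2 ** n_sites

# Illustrative random normalized state standing in for |psi(t)>.
psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)

# Exact density for comparison: sum the probability of basis states with bit j set.
probs = np.abs(psi) ** 2
exact_density = np.array(
    [sum(p for b, p in enumerate(probs) if (b >> j) & 1) for j in range(n_sites)]
)

# "Measure": sample bitstrings and count how often each site is occupied,
# which is what shot-by-shot Z-basis readout provides.
shots = 10_000
samples = rng.choice(dim, size=shots, p=probs)
estimated_density = np.array(
    [np.mean((samples >> j) & 1) for j in range(n_sites)]
)

print("exact density    :", np.round(exact_density, 3))
print("estimated density:", np.round(estimated_density, 3))
```

The statistical error of each density estimate shrinks as one over the square root of the number of shots, which is why direct measurement is simple to analyze.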
- Award ID(s): 1820747
- PAR ID: 10477376
- Publisher / Repository: Frontiers Media
- Date Published:
- Journal Name: Frontiers in Physics
- Volume: 9
- ISSN: 2296-424X
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
The Quantum Approximate Optimization Algorithm (QAOA) aims to search for approximate solutions to discrete optimization problems with near-term quantum computers. As no algorithmic guarantee for QAOA to outperform classical computers is possible without a proof that bounded-error quantum polynomial time (BQP) ≠ nondeterministic polynomial time (NP), it is necessary to investigate the empirical advantages of QAOA. We identify a computational phase transition of QAOA when solving hard problems such as SAT: random instances are most difficult to train at a critical problem density. We connect the transition to the controllability and the complexity of QAOA circuits. Moreover, we find that the critical problem density in general deviates from the SAT-UNSAT phase transition, where the hardest instances for classical algorithms lie. We then show that the high problem density region, which limits QAOA's performance in hard optimization problems (reachability deficits), is actually a good place to utilize QAOA: its approximation ratio decays much more slowly with the problem density than that of classical approximate algorithms. Indeed, it is exactly in this region that quantum advantages of QAOA over classical approximate algorithms can be identified.
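For readers unfamiliar with the algorithm, the following sketch computes a single-layer (p = 1) QAOA expectation for MaxCut on the complete graph K4 by dense statevector arithmetic; the graph, angle grid, and grid search are illustrative assumptions and not taken from the study above.

```python
# Minimal p = 1 QAOA for MaxCut on K4 (3-regular), simulated with NumPy.
import numpy as np
from itertools import product

n = 4
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]  # K4
dim = 2 ** n

# MaxCut cost of every computational-basis bitstring.
cost = np.zeros(dim)
for b in range(dim):
    bits = [(b >> i) & 1 for i in range(n)]
    cost[b] = sum(1 for i, j in edges if bits[i] != bits[j])

X = np.array([[0, 1], [1, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def qaoa_expectation(gamma, beta):
    """<C> in the state e^{-i beta B} e^{-i gamma C} |+...+>, B = sum_j X_j."""
    psi = np.full(dim, 1 / np.sqrt(dim), dtype=complex)  # |+>^n
    psi = np.exp(-1j * gamma * cost) * psi                # cost (phase) layer
    u1 = np.cos(beta) * I2 - 1j * np.sin(beta) * X        # e^{-i beta X}
    mixer = u1
    for _ in range(n - 1):
        mixer = np.kron(mixer, u1)                         # full mixer layer
    psi = mixer @ psi
    return np.real(np.vdot(psi, cost * psi))

# Coarse grid search over (gamma, beta) for a good p = 1 angle pair.
grid = np.linspace(0, np.pi, 40)
best = max(((qaoa_expectation(g, b), g, b) for g, b in product(grid, grid)))
print(f"best <C> = {best[0]:.3f} of MaxCut optimum {cost.max():.0f} "
      f"(approximation ratio {best[0] / cost.max():.3f})")
```

The training difficulty discussed above concerns exactly this kind of parameter search, which becomes hard at the critical problem density.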
We compare the performance of the Quantum Approximate Optimization Algorithm (QAOA) with state-of-the-art classical solvers Gurobi and MQLib to solve the MaxCut problem on 3-regular graphs. We identify the minimum noiseless sampling frequency and depth p required for a quantum device to outperform classical algorithms. There is potential for quantum advantage on hundreds of qubits and moderate depth with a sampling frequency of 10 kHz. We observe, however, that classical heuristic solvers are capable of producing high-quality approximate solutions in linear time complexity. In order to match this quality for large graph sizes N, a quantum device must support depth p > 11. Additionally, multi-shot QAOA is not efficient on large graphs, indicating that QAOA with p ≤ 11 does not scale with N. These results limit achieving quantum advantage for QAOA MaxCut on 3-regular graphs. Other problems, such as different graphs, weighted MaxCut, and 3-SAT, may be better suited for achieving quantum advantage on near-term quantum devices.
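The classical side of such a comparison can be as simple as a linear-time local-search heuristic; the sketch below is an illustrative stand-in (not the Gurobi or MQLib solvers used in the study) that flips single vertices of a random 3-regular graph until no flip improves the cut.

```python
# Single-flip local search for MaxCut on a random 3-regular graph.
import random

def random_3_regular(n, seed=0):
    """Naive configuration-model generator; retries until the graph is simple."""
    rng = random.Random(seed)
    while True:
        stubs = [v for v in range(n) for _ in range(3)]
        rng.shuffle(stubs)
        edges, ok = set(), True
        for u, v in zip(stubs[0::2], stubs[1::2]):
            if u == v or (u, v) in edges or (v, u) in edges:
                ok = False
                break
            edges.add((u, v))
        if ok:
            return list(edges)

def cut_value(edges, assignment):
    return sum(1 for u, v in edges if assignment[u] != assignment[v])

def local_search_maxcut(n, edges, seed=0):
    """Flip any vertex that increases the cut until no single flip helps."""
    rng = random.Random(seed)
    assignment = [rng.randint(0, 1) for _ in range(n)]
    best = cut_value(edges, assignment)
    improved = True
    while improved:
        improved = False
        for v in range(n):
            assignment[v] ^= 1
            new = cut_value(edges, assignment)
            if new > best:
                best, improved = new, True
            else:
                assignment[v] ^= 1  # revert the flip
    return assignment, best

n = 20
edges = random_3_regular(n)
_, value = local_search_maxcut(n, edges)
print(f"{len(edges)} edges, local-search cut = {value}")
```

Heuristics of this kind set the bar that a quantum device must clear within its sampling-frequency and depth budget.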
Recent advancements in quantum technologies, particularly in quantum sensing and simulation, have facilitated the generation and analysis of inherently quantum data. This progress underscores the necessity of developing efficient and scalable quantum data management strategies. This goal faces immense challenges due to the exponential dimensionality of quantum data and its unique quantum properties, such as no-cloning and measurement stochasticity. Specifically, classical storage and manipulation of an arbitrary n-qubit quantum state requires exponential space and time. Hence, there is a critical need to revisit foundational data management concepts and algorithms for quantum data. In this paper, we propose succinct quantum data sketches to support basic database operations such as search and selection. We view our work as an initial step towards the development of a quantum data management model, opening up many possibilities for future research in this direction.
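The exponential-storage point is easy to make concrete: the short back-of-the-envelope calculation below tallies the memory needed to hold the 2^n double-precision complex amplitudes of an arbitrary n-qubit state.

```python
# Memory required to store an arbitrary n-qubit statevector classically:
# 2^n amplitudes at 16 bytes each (complex128).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (10, 20, 30, 40, 50):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"{n:2d} qubits -> {gib:,.3f} GiB")
```

Already at 50 qubits the requirement is on the order of sixteen pebibytes, which is why succinct sketches rather than full state descriptions are of interest.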