


Creators/Authors contains: "Richerme, Philip"


  1. Abstract

    Trapped-ion quantum simulators have a long history of studying the physics of interacting spin-lattice systems using globally addressed entangling operations. Yet despite the multitude of studies so far, most have been limited to variants of the same spin interaction model, namely an Ising model with power-law decay in the couplings. Here, we demonstrate that much broader classes of effective spin–spin interactions are achievable using exclusively global driving fields. Specifically, we find that these new categories of interaction graphs become achievable with perfect or near-perfect theoretical fidelity by tailoring the coupling of the driving fields to each vibrational mode of the ion crystal. Given the relation between the ion crystal vibrational modes and the accessible interaction graphs, we show how the accessible interaction graph set can be further expanded by shaping the trapping potential to include specific anharmonic terms. Finally, we derive a rigorous test to determine whether a desired interaction graph is accessible using only globally driven fields. These tools broaden the reach of trapped-ion quantum simulators so that they may more easily address open questions in materials science and quantum chemistry.
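    As background (standard material, not part of the record above): the "Ising model with power-law decay in the couplings" that the abstract takes as its starting point is conventionally written as

```latex
H \;=\; \sum_{i<j} J_{ij}\,\sigma_x^{(i)}\sigma_x^{(j)},
\qquad
J_{ij} \;\approx\; \frac{J_0}{|i-j|^{\alpha}},
\quad 0 \lesssim \alpha \lesssim 3,
```

    where the exponent $\alpha$ is set by the detuning of the global drive from the crystal's vibrational (phonon) modes. Because every $J_{ij}$ in this form is fixed by the single parameter $\alpha$, the abstract's point is that tailoring the drive's coupling to individual modes unlocks interaction graphs beyond this one-parameter family.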

     
  2. Abstract

    State-of-the-art quantum machine learning (QML) algorithms fail to offer practical advantages over their notoriously powerful classical counterparts, due to the limited learning capabilities of QML algorithms, the constrained computational resources available on today’s noisy intermediate-scale quantum (NISQ) devices, and the empirically designed circuit ansatz for QML models. In this work, we address these challenges by proposing a hybrid quantum–classical neural network (CaNN), which we call QCLIP, for Quantum Contrastive Language-Image Pre-Training. Rather than training a supervised QML model to predict human annotations, QCLIP focuses on more practical transferable visual representation learning, where the developed model can be generalized to work on unseen downstream datasets. QCLIP is implemented by using CaNNs to generate low-dimensional data feature embeddings followed by quantum neural networks to adapt and generalize the learned representation in the quantum Hilbert space. Experimental results show that the hybrid QCLIP model can be efficiently trained for representation learning. We evaluate the representation transfer capability of QCLIP against the classical Contrastive Language-Image Pre-Training model on various datasets. Simulation results and real-device results on the NISQ IBM_Auckland quantum computer both show that the proposed QCLIP model outperforms the classical CLIP model in all test cases. As the field of QML on NISQ devices is continually evolving, we anticipate that this work will serve as a valuable foundation for future research and advancements in this promising area.
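    The abstract describes the architecture only at a high level; the classical CLIP objective it builds on, however, is well documented. Below is a minimal NumPy sketch of that symmetric contrastive (InfoNCE) loss over paired image/text embeddings. It is illustrative background only, not QCLIP's implementation, and the function and parameter names are our own.

```python
import numpy as np

def clip_contrastive_loss(image_emb, text_emb, temperature=0.07):
    """Symmetric InfoNCE loss used in CLIP-style contrastive pre-training.

    image_emb, text_emb: (N, d) arrays of paired embeddings; row i of each
    array corresponds to one matching image/caption pair.
    """
    # L2-normalize so the dot product becomes cosine similarity.
    img = image_emb / np.linalg.norm(image_emb, axis=1, keepdims=True)
    txt = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)

    logits = img @ txt.T / temperature  # (N, N) pairwise similarity matrix
    labels = np.arange(len(img))        # matching pairs lie on the diagonal

    def cross_entropy(l, y):
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        log_probs = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -log_probs[np.arange(len(y)), y].mean()

    # Average the image-to-text and text-to-image directions.
    return 0.5 * (cross_entropy(logits, labels)
                  + cross_entropy(logits.T, labels))
```

    In a hybrid setup like the one the abstract outlines, the embeddings fed to such a loss would come from a classical encoder followed by a quantum neural network layer; the loss itself is agnostic to how the embeddings were produced.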

     
  3. Free, publicly-accessible full text available August 17, 2024