Title: Conditioned quantum-assisted deep generative surrogate for particle-calorimeter interactions
Abstract: Particle collisions at accelerators like the Large Hadron Collider (LHC), recorded by experiments such as ATLAS and CMS, enable precise Standard Model measurements and searches for new phenomena. Simulating these collisions strongly influences experiment design and analysis but incurs immense computational costs, projected at millions of CPU-years annually during the high-luminosity LHC (HL-LHC) phase. Currently, simulating a single event with Geant4 takes around 1000 CPU seconds, with calorimeter simulations especially demanding. To address this, we propose a conditioned quantum-assisted generative model that integrates a conditioned variational autoencoder (VAE) with a conditioned restricted Boltzmann machine (RBM). Our RBM architecture is tailored for sampling on D-Wave's Pegasus-structured Advantage quantum annealer, leveraging flux biases for conditioning. This approach combines classical RBMs, universal approximators for discrete distributions, with the speed and scalability of quantum annealing. We also introduce an adaptive method for efficiently estimating the effective inverse temperature, and we validate our framework on Dataset 2 of the CaloChallenge.
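The conditioning mechanism described in the abstract can be illustrated with a purely classical stand-in. The sketch below is an assumption-laden illustration, not the authors' implementation: a small conditioned RBM in NumPy whose conditioning vector shifts the visible biases, loosely mirroring the flux-bias conditioning on the annealer, with block Gibbs sampling taking the place of the quantum annealer. All class names, dimensions, and the toy scalar condition are invented for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class ConditionedRBM:
        """Toy conditioned RBM: the condition vector shifts the visible biases."""

        def __init__(self, n_visible, n_hidden, n_cond):
            self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))  # visible-hidden couplings
            self.b = np.zeros(n_visible)                                 # visible biases
            self.c = np.zeros(n_hidden)                                  # hidden biases
            self.C = 0.01 * rng.standard_normal((n_cond, n_visible))     # condition-to-bias map

        def sample(self, cond, n_steps=200):
            """Block Gibbs sampling; a classical stand-in for annealer sampling."""
            b_eff = self.b + cond @ self.C          # conditioning enters as a bias shift
            v = (rng.random(self.b.shape) < 0.5).astype(float)
            h = np.zeros_like(self.c)
            for _ in range(n_steps):
                h = (rng.random(self.c.shape) < sigmoid(v @ self.W + self.c)).astype(float)
                v = (rng.random(b_eff.shape) < sigmoid(h @ self.W.T + b_eff)).astype(float)
            return v, h

    # Usage: draw a latent configuration conditioned on a toy scalar (e.g. incident energy).
    rbm = ConditionedRBM(n_visible=16, n_hidden=16, n_cond=1)
    v, h = rbm.sample(cond=np.array([0.7]))
    print(v)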
Award ID(s):
2212550
PAR ID:
10613074
Author(s) / Creator(s):
; ; ; ; ; ; ; ; ; ; ;
Publisher / Repository:
Nature Publishing Group
Date Published:
Journal Name:
npj Quantum Information
Volume:
11
Issue:
1
ISSN:
2056-6387
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. As we approach the High-Luminosity Large Hadron Collider (HL-LHC), set to begin collisions by the end of this decade, it is clear that the computational demands of traditional collision simulations have become untenably high. Current methods, relying heavily on first-principles Monte Carlo simulations of event showers in calorimeters, are estimated to require millions of CPU-years annually, a demand that far exceeds current capabilities. This bottleneck presents a unique opportunity for breakthroughs in computational physics through the integration of generative AI with quantum computing technologies. We propose a quantum-assisted deep generative model. In particular, we combine a variational autoencoder (VAE) with a restricted Boltzmann machine (RBM) embedded in its latent space as a prior. The RBM in latent space provides additional expressiveness compared to a legacy VAE, whose prior is a fixed Gaussian distribution. By crafting the RBM couplings, we leverage D-Wave's quantum annealer to significantly speed up shower sampling. By combining classical and quantum computing, this framework sets a path towards using large-scale quantum simulations as priors in deep generative models and demonstrates their ability to generate high-quality synthetic data for the HL-LHC experiments.
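A minimal sketch of the architecture this abstract outlines, under several assumptions not stated by the authors (PyTorch, Bernoulli latents with a straight-through estimator, a mean-squared-error reconstruction term): the RBM energy replaces the fixed Gaussian prior in the negative ELBO, and the intractable log-partition term is dropped here because its gradient requires negative-phase samples, which is where Gibbs chains or a quantum annealer would enter.

    import torch
    import torch.nn as nn

    class RBMPrior(nn.Module):
        """RBM energy over binary latents, split into two halves ("visible"/"hidden")."""

        def __init__(self, n_latent):
            super().__init__()
            half = n_latent // 2
            self.W = nn.Parameter(0.01 * torch.randn(half, half))  # couplings between the halves
            self.b = nn.Parameter(torch.zeros(n_latent))            # per-unit biases

        def energy(self, z):
            zl, zr = z.chunk(2, dim=-1)
            return -(z * self.b).sum(-1) - torch.einsum("bi,ij,bj->b", zl, self.W, zr)

    class DVAE(nn.Module):
        def __init__(self, n_in=128, n_latent=64):
            super().__init__()
            self.enc = nn.Sequential(nn.Linear(n_in, 256), nn.ReLU(), nn.Linear(256, n_latent))
            self.dec = nn.Sequential(nn.Linear(n_latent, 256), nn.ReLU(), nn.Linear(256, n_in))
            self.prior = RBMPrior(n_latent)

        def forward(self, x):
            q = torch.sigmoid(self.enc(x))                    # Bernoulli posterior probabilities
            z = torch.bernoulli(q.detach()) + q - q.detach()  # straight-through binary sample
            recon = ((self.dec(z) - x) ** 2).sum(-1)          # reconstruction term
            prior = self.prior.energy(z)                      # RBM prior term; log Z omitted, since its
                                                              # gradient needs negative-phase samples
            entropy = -(q * (q + 1e-7).log() + (1 - q) * (1 - q + 1e-7).log()).sum(-1)
            return (recon + prior - entropy).mean()           # negative ELBO up to log Z

    model = DVAE()
    loss = model(torch.rand(8, 128))                          # batch of toy inputs
    loss.backward()

In a full implementation the decoder would model calorimeter shower voxels and the RBM couplings would be laid out to respect the annealer's Pegasus connectivity; the toy dimensions here are arbitrary.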
  2. Abstract Data analyses in particle physics rely on an accurate simulation of particle collisions and a detailed simulation of detector effects to extract physics knowledge from the recorded data. Event generators, together with a Geant-based simulation of the detectors, are used to produce large samples of simulated events for analysis by the LHC experiments. These simulations come at a high computational cost, where the detector simulation and reconstruction algorithms have the largest CPU demands. This article describes how machine-learning (ML) techniques are used to reweight simulated samples obtained with a given set of parameters to samples with different parameters or samples obtained from entirely different simulation programs. The ML reweighting method avoids the need for simulating the detector response multiple times by incorporating the relevant information in a single sample through event weights. Results are presented for reweighting to model variations and higher-order calculations in simulated top quark pair production at the LHC. This ML-based reweighting is an important element of the future computing model of the CMS experiment and will facilitate precision measurements at the High-Luminosity LHC.
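The abstract does not spell out the reweighting method, but a widely used classifier-based recipe estimates per-event weights as the likelihood ratio w = f / (1 - f), where f is the output of a classifier trained to separate the nominal sample from the target sample. The sketch below demonstrates the idea on synthetic Gaussian "events" with scikit-learn; the data, features, and classifier choice are all illustrative.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Toy stand-ins for two simulated samples produced with different parameters.
    nominal = rng.normal(loc=0.0, scale=1.0, size=(50_000, 2))
    target = rng.normal(loc=0.3, scale=1.0, size=(50_000, 2))

    X = np.vstack([nominal, target])
    y = np.concatenate([np.zeros(len(nominal)), np.ones(len(target))])

    # Classifier trained to separate the two samples.
    clf = LogisticRegression(max_iter=1000).fit(X, y)

    # Per-event weights on the nominal sample: w = f / (1 - f) ~ p_target / p_nominal.
    f = clf.predict_proba(nominal)[:, 1]
    weights = f / (1.0 - f)

    # Reweighted nominal events should reproduce target-sample observables.
    print("nominal mean:    ", nominal[:, 0].mean())
    print("reweighted mean: ", np.average(nominal[:, 0], weights=weights))
    print("target mean:     ", target[:, 0].mean())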
  3. Abstract Deep generative learning can be used not only for generating new data with statistical characteristics derived from input data but also for anomaly detection, by separating nominal and anomalous instances based on their reconstruction quality. In this paper, we explore the performance of three unsupervised deep generative models—variational autoencoders (VAEs) with Gaussian, Bernoulli, and Boltzmann priors—in detecting anomalies in multivariate time series of commercial-flight operations. We created two VAE models with discrete latent variables (DVAEs), one with a factorized Bernoulli prior and one with a restricted Boltzmann machine (RBM) with a novel positive-phase architecture as its prior, because of the demand for discrete-variable models in machine-learning applications and because the integration of quantum devices based on two-level quantum systems requires such models. To the best of our knowledge, our work is the first that applies DVAE models to anomaly-detection tasks in the aerospace field. The DVAE with RBM prior, using a relatively simple—and classically or quantum-mechanically enhanceable—sampling technique for the evolution of the RBM's negative phase, performed better in detecting anomalies than the Bernoulli DVAE and on par with the Gaussian model, which has a continuous latent space. The transfer of a model to an unseen dataset with the same anomaly but without re-tuning of hyperparameters or re-training noticeably impaired anomaly-detection performance, but performance could be improved by post-training on the new dataset. The RBM model was robust to changes in anomaly type and in the phase of flight during which the anomaly occurred. Our studies demonstrate the competitiveness of a discrete deep generative model with its Gaussian counterpart on anomaly-detection problems. Moreover, the DVAE model with RBM prior can be easily integrated with quantum sampling by outsourcing its generative process to measurements of quantum states obtained from a quantum annealer or gate-model device.
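The reconstruction-quality criterion described here reduces to a few lines once a model is trained. The sketch below assumes a PyTorch model mapping a multivariate time-series window to its reconstruction; the stand-in model, window shape, and quantile threshold are illustrative rather than taken from the paper. Windows whose reconstruction error exceeds a threshold calibrated on nominal-only data are flagged as anomalous.

    import torch
    import torch.nn as nn

    def anomaly_scores(model, x):
        """Per-window mean squared reconstruction error."""
        with torch.no_grad():
            recon = model(x)
        return ((recon - x) ** 2).mean(dim=tuple(range(1, x.dim())))

    def flag_anomalies(model, nominal, test, quantile=0.99):
        """Flag test windows whose score exceeds a quantile of nominal-only scores."""
        threshold = torch.quantile(anomaly_scores(model, nominal), quantile)
        return anomaly_scores(model, test) > threshold

    # Usage with a trivial stand-in "autoencoder" and random time-series windows
    # of shape (windows, time steps, sensors); a trained (D)VAE would go here.
    model = nn.Sequential(nn.Flatten(), nn.Linear(20 * 8, 20 * 8), nn.Unflatten(1, (20, 8)))
    nominal = torch.randn(256, 20, 8)
    test = torch.randn(32, 20, 8)
    print(flag_anomalies(model, nominal, test))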
  4. Abstract The ALICE experiment was proposed in 1993, to study strongly-interacting matter at extreme energy densities and temperatures. This proposal entailed a comprehensive investigation of nuclear collisions at the LHC. Its physics programme initially focused on the determination of the properties of the quark–gluon plasma (QGP), a deconfined state of quarks and gluons, created in such collisions. The ALICE physics programme has been extended to cover a broader ensemble of observables related to Quantum Chromodynamics (QCD), the theory of strong interactions. The experiment has studied Pb–Pb, Xe–Xe, p–Pb and pp collisions in the multi-TeV centre of mass energy range, during the Run 1–2 data-taking periods at the LHC (2009–2018). The aim of this review is to summarise the key ALICE physics results in this endeavor, and to discuss their implications on the current understanding of the macroscopic and microscopic properties of strongly-interacting matter at the highest temperatures reached in the laboratory. It will review the latest findings on the properties of the QGP created by heavy-ion collisions at LHC energies, and describe the surprising QGP-like effects in pp and p–Pb collisions. Measurements of few-body QCD interactions, and their impact in unraveling the structure of hadrons and hadronic interactions, will be discussed. ALICE results relevant for physics topics outside the realm of QCD will also be touched upon. Finally, prospects for future measurements with the ALICE detector in the context of its planned upgrades will also be briefly described. 
  5. Abstract The ATLAS trigger system is a crucial component of the ATLAS experiment at the LHC. It is responsible for selecting events in line with the ATLAS physics programme. This paper presents an overview of the changes to the trigger and data acquisition system during the second long shutdown of the LHC, and shows the performance of the trigger system and its components in the proton-proton collisions during the 2022 commissioning period as well as its expected performance in proton-proton and heavy-ion collisions for the remainder of the third LHC data-taking period (2022–2025). 