Abstract A 3-D dosimeter fills the need for treatment-plan and delivery verification required by every modern radiation-therapy method. This report summarizes a proof-of-concept study to develop a water-equivalent solid 3-D dosimeter based on a novel radiation-hard scintillating material. The active material of the prototype dosimeter is a blend of radiation-hard peroxide-cured polysiloxane plastic doped with the scintillating agent p-terphenyl and the wavelength shifter bis-MSB. The prototype detector was tested with 6 MV and 10 MV x-ray beams at the Ohio State University Comprehensive Cancer Center. A 3-D dose distribution was successfully reconstructed by a neural network trained specifically for this prototype. This report describes the material production procedure, the investigation of the material's water equivalency, the design of the prototype dosimeter and its beam tests, and the details of the machine-learning approach and the reconstructed 3-D dose distributions.
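Reconstructing a 3-D dose map from scintillation-light readouts is an inverse problem; the paper uses a trained neural network, but the core idea can be sketched, under a simplifying linear-response assumption with hypothetical dimensions (not the prototype's actual geometry or model), by solving y = R·d for the voxelized dose d:

```python
import numpy as np

# Illustrative sketch (not the paper's actual network): treat dose
# reconstruction as an inverse problem -- photodetector signals y are,
# under a linear-response assumption, y = R @ d for the voxelized dose
# d, and we recover d from y. In the prototype a trained neural network
# plays this role; here a least-squares solve stands in.

rng = np.random.default_rng(0)

n_voxels = 8      # hypothetical 2x2x2 dose grid, flattened
n_sensors = 20    # hypothetical number of photodetector channels

R = rng.uniform(0.1, 1.0, size=(n_sensors, n_voxels))  # response matrix
d_true = rng.uniform(0.0, 2.0, size=n_voxels)          # "true" dose
y = R @ d_true                                          # simulated readout

d_rec, *_ = np.linalg.lstsq(R, y, rcond=None)           # reconstruction
```

Because the simulated system is overdetermined and noise-free, the least-squares solution recovers the dose exactly; a neural network earns its keep once the real detector response is nonlinear and noisy.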
ProtoDUNE Single-Phase (ProtoDUNE-SP) is a 770 ton liquid argon time projection chamber that operated in a hadron test beam at the CERN Neutrino Platform in 2018. We present a measurement of the total inelastic cross section of charged kaons on argon as a function of kaon energy, using two beam momentum settings. The flux-weighted average of the extracted inelastic cross section was determined at each beam momentum setting.
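The flux-weighted average quoted above combines per-bin cross sections sigma_i with beam-flux weights phi_i as sigma_avg = sum(phi_i * sigma_i) / sum(phi_i); a minimal sketch with purely hypothetical numbers (not the measured values):

```python
# Hedged sketch of a flux-weighted average: per-energy-bin cross
# sections sigma_i are combined with beam-flux weights phi_i as
#   sigma_avg = sum(phi_i * sigma_i) / sum(phi_i).
# All numbers below are hypothetical, not the paper's measurements.

phi   = [120.0, 340.0, 560.0, 210.0]   # hypothetical kaon flux per bin
sigma = [0.52, 0.55, 0.58, 0.61]       # hypothetical cross sections (barn)

sigma_avg = sum(p * s for p, s in zip(phi, sigma)) / sum(phi)
```

By construction the result lies between the smallest and largest per-bin cross sections, pulled toward the bins with the most beam flux.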
Published by the American Physical Society, 2024.
Abstract Computing demands for large scientific experiments, such as the CMS experiment at the CERN LHC, will increase dramatically in the next decades. To complement the future performance increases of software running on central processing units (CPUs), explorations of coprocessor usage in data processing hold great potential and interest. Coprocessors are a class of computer processors that supplement CPUs, often improving the execution of certain functions due to architectural design choices. We explore the approach of Services for Optimized Network Inference on Coprocessors (SONIC) and study the deployment of this as-a-service approach in large-scale data processing. In the studies, we take a data processing workflow of the CMS experiment and run the main workflow on CPUs, while offloading several machine learning (ML) inference tasks onto either remote or local coprocessors, specifically graphics processing units (GPUs). With experiments performed at Google Cloud, the Purdue Tier-2 computing center, and combinations of the two, we demonstrate the acceleration of these ML algorithms individually on coprocessors and the corresponding throughput improvement for the entire workflow. This approach can be easily generalized to different types of coprocessors and deployed on local CPUs without decreasing the throughput performance. We emphasize that the SONIC approach enables high coprocessor utilization and makes workflows portable across different types of coprocessors.
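The as-a-service pattern can be sketched in a few lines. This is a minimal stand-in, not SONIC itself (which in practice talks to GPU inference servers such as NVIDIA Triton): a background thread plays the remote server, while the main thread, playing the CPU workflow, submits inference requests asynchronously; the "model" that doubles its inputs and all names are purely illustrative.

```python
import queue
import threading

# Minimal sketch of inference-as-a-service: a background thread stands
# in for a remote GPU inference server, and the main thread stands in
# for the CPU workflow that offloads ML calls to it.

requests = queue.Queue()
results = {}

def inference_server():
    # Stand-in for the remote service: "infers" by doubling each input.
    while True:
        job_id, data = requests.get()
        if job_id is None:
            break
        results[job_id] = [2 * x for x in data]
        requests.task_done()

server = threading.Thread(target=inference_server)
server.start()

# The CPU workflow submits inference requests and could keep running
# other reconstruction steps while the "coprocessor" works on them.
for job_id in range(3):
    requests.put((job_id, [job_id, job_id + 1]))

requests.join()            # wait for all offloaded inferences
requests.put((None, None)) # shut the server down
server.join()
```

The key property the abstract highlights, decoupling the workflow from the coprocessor, shows up here as the queue boundary: the client never cares whether the consumer is a thread, a local GPU, or a remote service.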
Abstract This paper describes the Combine software package used for statistical analyses by the CMS Collaboration. The package, originally designed to perform searches for a Higgs boson and the combined analysis of those searches, has evolved to become the statistical analysis tool presently used in the majority of measurements and searches performed by the CMS Collaboration. It is not specific to the CMS experiment, and this paper is intended to serve as a reference for users outside of the CMS Collaboration, providing an outline of the most salient features and capabilities. Readers are provided with the possibility to run Combine and reproduce the examples provided in this paper using a publicly available container image. Since the package is constantly evolving to meet the demands of ever-increasing data sets and analysis sophistication, this paper cannot cover all details of Combine. However, the online documentation referenced within this paper provides an up-to-date and complete user guide.
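Combine's actual interface is datacard-driven, but as a hedged illustration of the kind of quantity such a tool reports, the sketch below evaluates the Asimov approximation for the expected discovery significance of a simple counting experiment, Z = sqrt(2((s+b) ln(1+s/b) - s)), with hypothetical signal and background yields:

```python
import math

# Hedged sketch: the expected discovery significance of a counting
# experiment with s expected signal events on b expected background,
#   Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s))
# (the Asimov approximation). Yields below are hypothetical.

def asimov_significance(s: float, b: float) -> float:
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

z = asimov_significance(10.0, 100.0)
```

For s much smaller than b this reduces to the familiar s / sqrt(b) estimate, which is why the hypothetical 10-on-100 case lands near Z = 1.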
Abstract The CERN LHC provided proton and heavy ion collisions during its Run 2 operation period from 2015 to 2018. Proton-proton collisions reached a peak instantaneous luminosity of 2.1 × 10^34 cm^-2 s^-1, twice the initial design value, at sqrt(s) = 13 TeV. The CMS experiment records a subset of the collisions for further processing as part of its online selection of data for physics analyses, using a two-level trigger system: the Level-1 trigger, implemented in custom-designed electronics, and the high-level trigger, a streamlined version of the offline reconstruction software running on a large computer farm. This paper presents the performance of the CMS high-level trigger system during LHC Run 2 for physics objects, such as leptons, jets, and missing transverse momentum, which meet the broad needs of the CMS physics program and the challenge of the evolving LHC and detector conditions. Sophisticated algorithms that were originally used in offline reconstruction were deployed online. Highlights include a machine-learning b tagging algorithm and a reconstruction algorithm for tau leptons that decay hadronically.
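The two-level selection described above can be illustrated with a toy pipeline (hypothetical thresholds and event fields, not CMS's actual trigger menus): a cheap Level-1-style cut runs on every event, and only survivors reach the more expensive HLT-style selection.

```python
# Toy two-stage trigger (illustrative thresholds, not CMS's menus):
# a fast Level-1 cut filters all events; only events that pass are
# handed to the slower, more refined high-level-trigger selection.

events = [
    {"pt": 12.0, "quality": 0.2},
    {"pt": 35.0, "quality": 0.9},
    {"pt": 40.0, "quality": 0.4},
    {"pt": 60.0, "quality": 0.8},
]

def level1(event):
    # Coarse hardware-style cut: transverse momentum threshold.
    return event["pt"] > 20.0

def hlt(event):
    # Refined software selection (stand-in for full reconstruction).
    return event["quality"] > 0.5

accepted = [e for e in events if level1(e) and hlt(e)]
```

The design point is that the cheap cut shields the expensive one: the HLT-style function only ever runs on the small fraction of events Level-1 accepts.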
The first observation of the concurrent production of two mesons in proton-nucleus collisions is presented. The analysis is based on a proton-lead data sample recorded at a nucleon-nucleon center-of-mass energy of 8.16 TeV by the CMS experiment at the CERN LHC. The two mesons are reconstructed in their decay channels within a fiducial region of transverse momentum and rapidity. Events where one of the mesons is reconstructed in the dielectron channel are also considered in the search. The process is observed with a significance of 5.3 standard deviations. The inclusive fiducial cross section is measured using the four-muon channel alone. A fit of the data to the expected rapidity separation for pairs of mesons produced in single (SPS) and double (DPS) parton scatterings yields the respective cross sections. The latter result can be transformed into a lower bound, at 95% confidence level, on the effective DPS cross section, which is closely related to the squared average interparton transverse separation in the collision.
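For two identical hard processes, the DPS cross section and the effective cross section are conventionally related by sigma_DPS = sigma_single**2 / (2 * sigma_eff); inverting this relation is how a measured sigma_DPS becomes a bound on sigma_eff. A sketch with purely hypothetical values (not the paper's measurements):

```python
# Hedged sketch of the same-process double-parton-scattering relation
#   sigma_DPS = sigma_single**2 / (2 * sigma_eff),
# inverted to extract the effective cross section from a measured
# DPS cross section. Inputs below are hypothetical, in arbitrary units.

def sigma_eff(sigma_single: float, sigma_dps: float) -> float:
    return sigma_single ** 2 / (2.0 * sigma_dps)

effective = sigma_eff(10.0, 2.0)
```

Because sigma_eff scales inversely with sigma_DPS, quoting an upper range of allowed sigma_DPS is equivalent to quoting a lower bound on sigma_eff, exactly the form of the result in the abstract.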
© 2024 CERN, for the CMS Collaboration.