

Search for: All records

Creators/Authors contains: "Setti, F."

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).


  2. Abstract

    Computing demands for large scientific experiments, such as the CMS experiment at the CERN LHC, will increase dramatically in the next decades. To complement the future performance increases of software running on central processing units (CPUs), explorations of coprocessor usage in data processing hold great potential and interest. Coprocessors are a class of computer processors that supplement CPUs, often improving the execution of certain functions due to architectural design choices. We explore the approach of Services for Optimized Network Inference on Coprocessors (SONIC) and study the deployment of this as-a-service approach in large-scale data processing. In the studies, we take a data processing workflow of the CMS experiment and run the main workflow on CPUs, while offloading several machine learning (ML) inference tasks onto either remote or local coprocessors, specifically graphics processing units (GPUs). With experiments performed at Google Cloud, the Purdue Tier-2 computing center, and combinations of the two, we demonstrate the acceleration of these ML algorithms individually on coprocessors and the corresponding throughput improvement for the entire workflow. This approach can be easily generalized to different types of coprocessors and deployed on local CPUs without decreasing the throughput performance. We emphasize that the SONIC approach enables high coprocessor utilization and the portability to run workflows on different types of coprocessors.

     
    Free, publicly-accessible full text available December 1, 2025
  3. Abstract

    This paper describes the Combine software package used for statistical analyses by the CMS Collaboration. The package, originally designed to perform searches for a Higgs boson and the combined analysis of those searches, has evolved to become the statistical analysis tool presently used in the majority of measurements and searches performed by the CMS Collaboration. It is not specific to the CMS experiment, and this paper is intended to serve as a reference for users outside of the CMS Collaboration, providing an outline of the most salient features and capabilities. Readers are provided with the possibility to run Combine and reproduce examples provided in this paper using a publicly available container image. Since the package is constantly evolving to meet the demands of ever-increasing data sets and analysis sophistication, this paper cannot cover all details of Combine. However, the online documentation referenced within this paper provides an up-to-date and complete user guide.

     
    Free, publicly-accessible full text available December 1, 2025
  4. Free, publicly-accessible full text available November 1, 2025
  5. Abstract

    The CERN LHC provided proton and heavy ion collisions during its Run 2 operation period from 2015 to 2018. Proton-proton collisions reached a peak instantaneous luminosity of 2.1 × 10³⁴ cm⁻² s⁻¹, twice the initial design value, at √s = 13 TeV. The CMS experiment records a subset of the collisions for further processing as part of its online selection of data for physics analyses, using a two-level trigger system: the Level-1 trigger, implemented in custom-designed electronics, and the high-level trigger, a streamlined version of the offline reconstruction software running on a large computer farm. This paper presents the performance of the CMS high-level trigger system during LHC Run 2 for physics objects, such as leptons, jets, and missing transverse momentum, which meet the broad needs of the CMS physics program and the challenge of the evolving LHC and detector conditions. Sophisticated algorithms that were originally used in offline reconstruction were deployed online. Highlights include a machine-learning b tagging algorithm and a reconstruction algorithm for tau leptons that decay hadronically.

     
    Free, publicly-accessible full text available November 1, 2025
  6. The first observation of the concurrent production of two J/ψ mesons in proton-nucleus collisions is presented. The analysis is based on a proton-lead (pPb) data sample recorded at a nucleon-nucleon center-of-mass energy of 8.16 TeV by the CMS experiment at the CERN LHC and corresponding to an integrated luminosity of 174.6 nb⁻¹. The two J/ψ mesons are reconstructed in their μ⁺μ⁻ decay channels with transverse momenta pT > 6.5 GeV and rapidity |y| < 2.4. Events where one of the J/ψ mesons is reconstructed in the dielectron channel are also considered in the search. The pPb → J/ψ J/ψ + X process is observed with a significance of 5.3 standard deviations. The measured inclusive fiducial cross section, using the four-muon channel alone, is σ(pPb → J/ψ J/ψ + X) = 22.0 ± 8.9 (stat) ± 1.5 (syst) nb. A fit of the data to the expected rapidity separation for pairs of J/ψ mesons produced in single (SPS) and double (DPS) parton scatterings yields σ_SPS(pPb → J/ψ J/ψ + X) = 16.5 ± 10.8 (stat) ± 0.1 (syst) nb and σ_DPS(pPb → J/ψ J/ψ + X) = 5.4 ± 6.2 (stat) ± 0.4 (syst) nb, respectively. This latter result can be transformed into a lower bound on the effective DPS cross section, closely related to the squared average interparton transverse separation in the collision, of σ_eff > 1.0 mb at the 95% confidence level.

    © 2024 CERN, for the CMS Collaboration
    Free, publicly-accessible full text available November 1, 2025
  7. The first search for soft unclustered energy patterns (SUEPs) is performed using an integrated luminosity of 138 fb⁻¹ of proton-proton collision data at √s = 13 TeV, collected in 2016–2018 by the CMS detector at the LHC. Such SUEPs are predicted by hidden valley models with a new, confining force with a large ’t Hooft coupling. In events with boosted topologies, selected by high-threshold hadronic triggers, the multiplicity and sphericity of clustered tracks are used to reject the background from standard model quantum chromodynamics. With no observed excess of events over the standard model expectation, limits are set on the cross section for production via gluon fusion of a scalar mediator with SUEP-like decays.

    © 2024 CERN, for the CMS Collaboration
    Free, publicly-accessible full text available November 1, 2025
  8. Free, publicly-accessible full text available October 1, 2025
  9. Free, publicly-accessible full text available October 1, 2025
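
Note on the effective DPS cross section quoted in the J/ψ-pair result above: in the double parton scattering literature, σ_eff is conventionally defined through the so-called "pocket formula", which for two identical hard scatterings reads (a standard-convention sketch, not taken from the listed abstract; the symbols σ_J/ψ and σ_eff follow the usual DPS notation):

```latex
% DPS "pocket formula" for two identical hard processes;
% the factor 1/2 is the symmetry factor for identical final states,
% and \sigma_{\mathrm{eff}} encodes the transverse parton profile.
\sigma^{\mathrm{DPS}}_{J/\psi\, J/\psi}
  = \frac{1}{2}\,\frac{\left(\sigma_{J/\psi}\right)^{2}}{\sigma_{\mathrm{eff}}}
```

Under this convention, a measured upper range for σ_DPS translates into a lower bound on σ_eff, which is how the quoted σ_eff > 1.0 mb limit arises.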