


Search for: All records

Creators/Authors contains: "Onel, Y."


  1. Abstract A 3-D dosimeter fills the need for treatment-plan and delivery verification required by every modern radiation-therapy method. This report summarizes a proof-of-concept study to develop a water-equivalent solid 3-D dosimeter based on a novel radiation-hard scintillating material. The active material of the prototype dosimeter is a blend of radiation-hard peroxide-cured polysiloxane plastic doped with the scintillating agent p-terphenyl and the wavelength shifter bis-MSB. The prototype detector was tested with 6 MV and 10 MV x-ray beams at Ohio State University’s Comprehensive Cancer Center. A 3-D dose distribution was successfully reconstructed by a neural network trained specifically for this prototype. The report details the material production procedure, the investigation of the material’s water equivalency, the design of the prototype dosimeter and its beam tests, and the machine learning approach used and the reconstructed 3-D dose distributions.
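As a toy illustration of the reconstruction idea (not the paper's actual network, whose architecture is not given in the abstract), one can fit a minimal linear model that maps simulated light readouts back to per-voxel dose. Every number and shape below is an invented assumption:

```python
import random

# Illustrative stand-in for the neural-network dose reconstruction: learn a
# map from scintillator light readouts back to per-voxel dose. The sensor
# and voxel counts, the response matrix, and the single linear layer are
# all invented assumptions, not the paper's model.
random.seed(0)
N_SENSORS, N_VOXELS = 8, 4

# Hypothetical response: light seen by sensor s per unit dose in voxel v.
R = [[random.uniform(0.1, 1.0) for _ in range(N_VOXELS)]
     for _ in range(N_SENSORS)]

def readout(dose):
    """Simulated light readout for a per-voxel dose vector."""
    return [sum(R[s][v] * dose[v] for v in range(N_VOXELS))
            for s in range(N_SENSORS)]

# Training pairs: random dose patterns and their simulated readouts.
doses = [[random.random() for _ in range(N_VOXELS)] for _ in range(200)]
data = [(readout(d), d) for d in doses]

W = [[0.0] * N_SENSORS for _ in range(N_VOXELS)]  # one linear layer

def mse():
    err = 0.0
    for x, d in data:
        for v in range(N_VOXELS):
            pred = sum(W[v][s] * x[s] for s in range(N_SENSORS))
            err += (pred - d[v]) ** 2
    return err / len(data)

loss_before = mse()
for _ in range(300):  # plain batch gradient descent
    grad = [[0.0] * N_SENSORS for _ in range(N_VOXELS)]
    for x, d in data:
        for v in range(N_VOXELS):
            delta = sum(W[v][s] * x[s] for s in range(N_SENSORS)) - d[v]
            for s in range(N_SENSORS):
                grad[v][s] += 2 * delta * x[s] / len(data)
    for v in range(N_VOXELS):
        for s in range(N_SENSORS):
            W[v][s] -= 0.02 * grad[v][s]
loss_after = mse()
print(f"loss before {loss_before:.3f}, after {loss_after:.4f}")
```

The fit error shrinks substantially, showing that the readout-to-dose mapping is learnable in this toy setting; a real dosimeter needs a nonlinear model and measured calibration data.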
  2. (No abstract available for this record.)
  3. ProtoDUNE Single-Phase (ProtoDUNE-SP) is a 770-ton liquid argon time projection chamber that operated in a hadron test beam at the CERN Neutrino Platform in 2018. We present a measurement of the total inelastic cross section of charged kaons on argon as a function of kaon energy using 6 and 7 GeV/c beam momentum settings. The flux-weighted average of the extracted inelastic cross section at each beam momentum setting was measured to be 380 ± 26 mbarns for the 6 GeV/c setting and 379 ± 35 mbarns for the 7 GeV/c setting.

    Published by the American Physical Society, 2024
    Free, publicly-accessible full text available November 1, 2025
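The flux-weighted average quoted above is a simple weighted mean over energy bins. A minimal sketch with made-up bins, fluxes, and per-bin cross sections (not ProtoDUNE-SP data):

```python
# Minimal sketch of a flux-weighted average, the quantity quoted per beam
# momentum setting above. The bins, fluxes, and cross sections are made-up
# illustrative numbers, not ProtoDUNE-SP data.

# (kaon kinetic energy [MeV], incident-kaon flux, cross section [mbarn])
bins = [
    (4000.0, 1200, 372.0),
    (4500.0, 1500, 381.0),
    (5000.0,  900, 388.0),
]

total_flux = sum(flux for _, flux, _ in bins)
sigma_avg = sum(flux * sig for _, flux, sig in bins) / total_flux
print(f"flux-weighted inelastic cross section = {sigma_avg:.1f} mbarn")
# prints: flux-weighted inelastic cross section = 379.8 mbarn
```

Weighting by flux rather than averaging bins uniformly gives more populated energy bins proportionally more influence on the quoted number.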
  4. Abstract

    Computing demands for large scientific experiments, such as the CMS experiment at the CERN LHC, will increase dramatically in the next decades. To complement future performance increases of software running on central processing units (CPUs), explorations of coprocessor usage in data processing hold great potential and interest. Coprocessors are a class of computer processors that supplement CPUs, often improving the execution of certain functions due to architectural design choices. We explore the approach of Services for Optimized Network Inference on Coprocessors (SONIC) and study the deployment of this as-a-service approach in large-scale data processing. In the studies, we take a data processing workflow of the CMS experiment and run the main workflow on CPUs, while offloading several machine learning (ML) inference tasks onto either remote or local coprocessors, specifically graphics processing units (GPUs). With experiments performed at Google Cloud, the Purdue Tier-2 computing center, and combinations of the two, we demonstrate the acceleration of these ML algorithms individually on coprocessors and the corresponding throughput improvement for the entire workflow. This approach can be easily generalized to different types of coprocessors and deployed on local CPUs without decreasing the throughput performance. We emphasize that the SONIC approach achieves high coprocessor utilization and makes workflows portable across different types of coprocessors.

     
    Free, publicly-accessible full text available December 1, 2025
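The as-a-service pattern described above, where the CPU workflow keeps running while inference requests are served elsewhere, can be sketched with a thread pool standing in for the remote GPU server. The real SONIC transport and its APIs are not shown; everything here is an illustrative assumption:

```python
import concurrent.futures
import time

# Toy sketch of the as-a-service pattern: the main loop keeps processing on
# the CPU while inference requests are handled elsewhere (here a thread
# pool standing in for a remote GPU server; SONIC uses a real client/server
# protocol, which is not modeled here).

def infer(event):
    """Stand-in for an ML inference call served by a coprocessor."""
    time.sleep(0.01)              # pretend network/compute latency
    return sum(event) / len(event)

def cpu_work(event):
    """Stand-in for the rest of the reconstruction running locally."""
    return [x * 2 for x in event]

events = [[float(i), float(i + 1)] for i in range(8)]

with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(infer, ev) for ev in events]   # offload early
    local = [cpu_work(ev) for ev in events]               # CPU stays busy
    scores = [f.result() for f in futures]                # gather results

print(scores[0], local[0])
```

Submitting all inference requests before doing the local work is what lets the two overlap; the throughput gain in the paper comes from this overlap plus batching on the coprocessor side.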
  5. Abstract

    This paper describes the Combine software package used for statistical analyses by the CMS Collaboration. The package, originally designed to perform searches for a Higgs boson and the combined analysis of those searches, has evolved to become the statistical analysis tool presently used in the majority of measurements and searches performed by the CMS Collaboration. It is not specific to the CMS experiment, and this paper is intended to serve as a reference for users outside the CMS Collaboration, providing an outline of the most salient features and capabilities. Readers are given the possibility to run Combine and reproduce the examples provided in this paper using a publicly available container image. Since the package is constantly evolving to meet the demands of ever-increasing data sets and analysis sophistication, this paper cannot cover all details of Combine. However, the online documentation referenced within this paper provides an up-to-date and complete user guide.

     
    Free, publicly-accessible full text available December 1, 2025
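At its core, a Combine-style fit maximizes a binned likelihood over a parameter of interest. A minimal sketch with an invented three-bin counting model and a grid scan over the signal strength (real Combine models add nuisance parameters, constraints, and proper minimization):

```python
import math

# Conceptual sketch of the core of a Combine-style fit: minimize a binned
# Poisson negative log-likelihood over a signal strength mu, with expected
# yields mu*signal + background. All yields and "observed" counts below
# are invented for illustration.

signal     = [2.0, 5.0, 3.0]
background = [10.0, 8.0, 6.0]
observed   = [13, 14, 9]      # pretend data

def nll(mu):
    total = 0.0
    for s, b, n in zip(signal, background, observed):
        lam = mu * s + b
        total += lam - n * math.log(lam) + math.lgamma(n + 1)
    return total

# Simple grid scan for the minimum (a real tool uses a proper minimizer).
grid = [i / 100 for i in range(0, 301)]
mu_hat = min(grid, key=nll)
print(f"best-fit mu = {mu_hat:.2f}")
```

Confidence intervals then follow from how fast nll(mu) rises away from mu_hat (the profile-likelihood construction), which is one of the standard methods the package implements.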
  6. Free, publicly-accessible full text available November 1, 2025
  7. Abstract

    The CERN LHC provided proton and heavy ion collisions during its Run 2 operation period from 2015 to 2018. Proton-proton collisions reached a peak instantaneous luminosity of 2.1 × 10³⁴ cm⁻² s⁻¹, twice the initial design value, at √s = 13 TeV. The CMS experiment records a subset of the collisions for further processing as part of its online selection of data for physics analyses, using a two-level trigger system: the Level-1 trigger, implemented in custom-designed electronics, and the high-level trigger, a streamlined version of the offline reconstruction software running on a large computer farm. This paper presents the performance of the CMS high-level trigger system during LHC Run 2 for physics objects, such as leptons, jets, and missing transverse momentum, which meet the broad needs of the CMS physics program and the challenge of the evolving LHC and detector conditions. Sophisticated algorithms that were originally used in offline reconstruction were deployed online. Highlights include a machine-learning b tagging algorithm and a reconstruction algorithm for tau leptons that decay hadronically.

     
    Free, publicly-accessible full text available November 1, 2025
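The two-level selection described above can be caricatured as two successive filters: a fast, coarse cut followed by a tighter, refined one. The event model and thresholds below are invented for illustration:

```python
import random

# Toy two-stage trigger chain echoing the L1 + HLT structure described
# above. The event model and thresholds are invented for illustration.
random.seed(1)
events = [{"et": random.expovariate(1 / 20),   # crude falling ET spectrum
           "quality": random.random()} for _ in range(20000)]

def level1(ev):
    # Fast, coarse decision (custom hardware in reality): high-ET only.
    return ev["et"] > 40.0

def hlt(ev):
    # Slower, refined decision (software): tighter cut plus a quality check.
    return ev["et"] > 50.0 and ev["quality"] > 0.2

accepted = [ev for ev in events if level1(ev) and hlt(ev)]
print(f"input {len(events)}, accepted {len(accepted)}, "
      f"reduction ~x{len(events) / len(accepted):.0f}")
```

The point of the staging is that the cheap first filter rejects most events before the expensive second one runs; in CMS the real reduction factors are orders of magnitude larger than in this toy.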
  8. The first observation of the concurrent production of two J/ψ mesons in proton-nucleus collisions is presented. The analysis is based on a proton-lead (pPb) data sample recorded at a nucleon-nucleon center-of-mass energy of 8.16 TeV by the CMS experiment at the CERN LHC and corresponding to an integrated luminosity of 174.6 nb⁻¹. The two J/ψ mesons are reconstructed in their μ⁺μ⁻ decay channels with transverse momenta pT > 6.5 GeV and rapidity |y| < 2.4. Events where one of the J/ψ mesons is reconstructed in the dielectron channel are also considered in the search. The pPb → J/ψJ/ψ + X process is observed with a significance of 5.3 standard deviations. The measured inclusive fiducial cross section, using the four-muon channel alone, is σ(pPb → J/ψJ/ψ + X) = 22.0 ± 8.9 (stat) ± 1.5 (syst) nb. A fit of the data to the expected rapidity separation for pairs of J/ψ mesons produced in single (SPS) and double (DPS) parton scatterings yields σ_SPS(pPb → J/ψJ/ψ + X) = 16.5 ± 10.8 (stat) ± 0.1 (syst) nb and σ_DPS(pPb → J/ψJ/ψ + X) = 5.4 ± 6.2 (stat) ± 0.4 (syst) nb, respectively. This latter result can be transformed into a lower bound on the effective DPS cross section, closely related to the squared average interparton transverse separation in the collision, of σ_eff > 1.0 mb at 95% confidence level.

    © 2024 CERN, for the CMS Collaboration
    Free, publicly-accessible full text available November 1, 2025
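As a quick arithmetic check on the quoted fiducial cross section, the statistical and systematic uncertainties can be combined in quadrature (a common convention, assumed here rather than taken from the paper):

```python
import math

# Combining the quoted uncertainties on the fiducial cross section
# sigma(pPb -> J/psi J/psi + X) = 22.0 ± 8.9 (stat) ± 1.5 (syst) nb
# in quadrature. Adding in quadrature is an assumption of independence
# between the two sources, not a statement from the measurement itself.

sigma, stat, syst = 22.0, 8.9, 1.5
total = math.hypot(stat, syst)    # sqrt(stat**2 + syst**2)
print(f"sigma = {sigma} ± {total:.1f} nb  ({total / sigma:.0%} relative)")
```

The statistical term dominates, consistent with the small integrated luminosity of the pPb sample relative to typical pp data sets.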