


Search for: All records

Creators/Authors contains: "Chaudhary, G."


  1. The presence of incomplete cuts in a thin planar sheet can dramatically alter its mechanical and geometrical response to loading, as the cuts allow the sheet to deform strongly in the third dimension, most beautifully demonstrated in kirigami art forms. We use numerical experiments to characterize the geometric mechanics of kirigamized sheets as a function of the number, size, and orientation of cuts. We show that the geometry of mechanically loaded sheets can be approximated as a composition of simple developable units: flats, cylinders, cones, and compressed elasticae. This geometric construction yields scaling laws for the mechanical response of the sheet in both the weakly and the strongly deformed limits. In the ultimately stretched limit, this further leads to a theorem on the nature and form of geodesics in an arbitrary kirigami pattern, consistent with observations and simulations. Finally, we show that by varying the shape and size of the geodesic in a kirigamized sheet, we can control the deployment trajectory of the sheet, and thence its functional properties, as an exemplar of a tunable structure that can serve as a robotic gripper, a soft light window, or the basis for a physically unclonable device. Overall, our study of disordered kirigami sets the stage for controlling the shape and shielding the stresses in thin sheets using cuts.
  2. Abstract

    The CMS detector is a general-purpose apparatus that detects high-energy collisions produced at the LHC. Online data quality monitoring of the CMS electromagnetic calorimeter is a vital operational tool that allows detector experts to quickly identify, localize, and diagnose a broad range of detector issues that could affect the quality of physics data. A real-time autoencoder-based anomaly detection system using semi-supervised machine learning is presented, enabling the detection of anomalies in the CMS electromagnetic calorimeter data. A novel method is introduced that maximizes the anomaly detection performance by exploiting the time-dependent evolution of anomalies as well as spatial variations in the detector response. The autoencoder-based system efficiently detects anomalies while maintaining a very low false discovery rate. The performance of the system is validated with anomalies found in 2018 and 2022 LHC collision data. In addition, the first results from deploying the autoencoder-based system in the CMS online data quality monitoring workflow during the beginning of Run 3 of the LHC are presented, showing its ability to detect issues missed by the existing system.

     
    Free, publicly-accessible full text available June 24, 2025
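The reconstruction-error idea behind an autoencoder-based anomaly detector, as in the abstract above, can be sketched with a linear autoencoder (mathematically equivalent to PCA): learn a low-dimensional code for "normal" readouts, decode it back, and flag inputs whose reconstruction error is large. The channel counts, noise level, and dead-region anomaly below are toy assumptions, not the CMS setup.

```python
import numpy as np

# Linear autoencoder (equivalent to PCA) for reconstruction-error anomaly
# detection. All numbers here are illustrative toy choices.
rng = np.random.default_rng(0)
n_train, n_ch = 500, 64
base = rng.uniform(0.8, 1.2, n_ch)                    # nominal response map
train = base + 0.05 * rng.standard_normal((n_train, n_ch))

k = 8                                                 # code dimension
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
enc = vt[:k]                                          # encoder/decoder weights

def recon_error(x):
    z = (x - mean) @ enc.T                            # encode
    xr = z @ enc + mean                               # decode
    return np.sqrt(np.mean((x - xr) ** 2, axis=-1))

threshold = np.quantile(recon_error(train), 0.99)     # alarm threshold

normal = base + 0.05 * rng.standard_normal(n_ch)
anomalous = normal.copy()
anomalous[10:20] = 0.0                                # simulated dead region
print(f"normal: {recon_error(normal):.3f}  "
      f"anomalous: {recon_error(anomalous):.3f}  thr: {threshold:.3f}")
```

An input with a dead region lies far outside the learned subspace, so its reconstruction error is much larger than for normal fluctuations.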
  3. Free, publicly-accessible full text available January 1, 2026
  4. Abstract

    Computing demands for large scientific experiments, such as the CMS experiment at the CERN LHC, will increase dramatically in the next decades. To complement the future performance increases of software running on central processing units (CPUs), explorations of coprocessor usage in data processing hold great potential and interest. Coprocessors are a class of computer processors that supplement CPUs, often improving the execution of certain functions due to architectural design choices. We explore the approach of Services for Optimized Network Inference on Coprocessors (SONIC) and study the deployment of this as-a-service approach in large-scale data processing. In the studies, we take a data processing workflow of the CMS experiment and run the main workflow on CPUs, while offloading several machine learning (ML) inference tasks onto either remote or local coprocessors, specifically graphics processing units (GPUs). With experiments performed at Google Cloud, the Purdue Tier-2 computing center, and combinations of the two, we demonstrate the acceleration of these ML algorithms individually on coprocessors and the corresponding throughput improvement for the entire workflow. This approach can be easily generalized to different types of coprocessors and deployed on local CPUs without decreasing the throughput performance. We emphasize that the SONIC approach enables high coprocessor usage and makes workflows portable across different types of coprocessors.

     
    Free, publicly-accessible full text available December 1, 2025
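The as-a-service pattern described above, keeping the CPU busy on the main workflow while an inference request is in flight on a coprocessor, can be sketched with a thread pool standing in for the network call. The `remote_infer` stub and the sleep durations are assumptions for illustration, not the SONIC implementation.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def remote_infer(batch):
    # Stand-in for a network round trip to an inference server (hypothetical).
    time.sleep(0.05)
    return [x * 2 for x in batch]

def cpu_step(event):
    # Stand-in for the rest of the reconstruction chain on the CPU.
    time.sleep(0.01)
    return event + 1

events = list(range(8))
t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=1) as pool:
    # Fire off the inference request, keep the CPU busy while the
    # "coprocessor" result is in flight, then collect it at the end.
    future = pool.submit(remote_infer, events)
    processed = [cpu_step(e) for e in events]
    ml_out = future.result()
elapsed = time.perf_counter() - t0
print(f"elapsed: {elapsed:.3f}s (close to the slower leg, not the sum)")
```

Because the two legs overlap, the wall time is roughly the slower of the two rather than their sum, which is the throughput argument for offloading.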
  5. Abstract

    A measurement is performed of Higgs bosons produced with high transverse momentum ($p_{\mathrm{T}}$) via vector boson or gluon fusion in proton-proton collisions. The result is based on a data set with a center-of-mass energy of 13 TeV collected in 2016–2018 with the CMS detector at the LHC and corresponds to an integrated luminosity of 138 fb$^{-1}$. The decay of a high-$p_{\mathrm{T}}$ Higgs boson to a boosted bottom quark-antiquark pair is selected using large-radius jets and employing jet substructure and heavy-flavor taggers based on machine learning techniques. Independent regions targeting the vector boson and gluon fusion mechanisms are defined based on the topology of two quark-initiated jets with large pseudorapidity separation. The signal strengths for both processes are extracted simultaneously by performing a maximum likelihood fit to data in the large-radius jet mass distribution. The observed signal strengths relative to the standard model expectation are $4.9_{-1.6}^{+1.9}$ and $1.6_{-1.5}^{+1.7}$ for the vector boson and gluon fusion mechanisms, respectively. A differential cross section measurement is also reported in the simplified template cross section framework.

     
    Free, publicly-accessible full text available December 1, 2025
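A maximum likelihood fit for a signal strength, as used in the analysis above, can be illustrated with a binned Poisson toy: scan the strength parameter and minimize the negative log-likelihood of the observed counts. The templates and counts below are invented, not the paper's data.

```python
import numpy as np

# Binned distribution: signal template s, background b, "observed" counts n.
# All numbers are illustrative toys.
s = np.array([2.0, 5.0, 8.0, 5.0, 2.0])
b = np.array([50.0, 40.0, 30.0, 40.0, 50.0])
n = np.array([53.0, 48.0, 44.0, 49.0, 52.0])

def nll(mu):
    lam = mu * s + b                       # expected yield per bin
    return np.sum(lam - n * np.log(lam))   # Poisson NLL up to a constant

# Scan mu and take the minimum: the maximum likelihood estimate.
mus = np.linspace(0.0, 5.0, 5001)
mu_hat = mus[np.argmin([nll(m) for m in mus])]
print(f"best-fit signal strength mu = {mu_hat:.2f}")
```

A real analysis fits nuisance parameters alongside the signal strength; this toy keeps only the one parameter of interest.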
  6. Abstract

    This paper describes the Combine software package used for statistical analyses by the CMS Collaboration. The package, originally designed to perform searches for a Higgs boson and the combined analysis of those searches, has evolved to become the statistical analysis tool presently used in the majority of measurements and searches performed by the CMS Collaboration. It is not specific to the CMS experiment, and this paper is intended to serve as a reference for users outside of the CMS Collaboration, providing an outline of the most salient features and capabilities. Readers are provided with the possibility to run Combine and reproduce examples provided in this paper using a publicly available container image. Since the package is constantly evolving to meet the demands of ever-increasing data sets and analysis sophistication, this paper cannot cover all details of Combine. However, the online documentation referenced within this paper provides an up-to-date and complete user guide.

     
    Free, publicly-accessible full text available December 1, 2025
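The core operation of a combination tool of the kind described above is summing per-channel negative log-likelihoods over a shared parameter, then reading an approximate interval off the likelihood curve. A toy sketch with invented counting-experiment yields (real combinations also profile nuisance parameters):

```python
import numpy as np

# Two counting "channels" sharing one signal-strength parameter mu.
# Combining them = summing their Poisson negative log-likelihoods.
channels = [
    {"s": 10.0, "b": 50.0, "n": 62},   # toy yields, not real data
    {"s": 4.0, "b": 20.0, "n": 25},
]

def nll(mu, ch):
    lam = mu * ch["s"] + ch["b"]
    return lam - ch["n"] * np.log(lam)  # Poisson NLL up to a constant

def combined_nll(mu):
    return sum(nll(mu, ch) for ch in channels)

mus = np.linspace(0.0, 4.0, 4001)
vals = np.array([combined_nll(m) for m in mus])
mu_hat = mus[np.argmin(vals)]
# Approximate 68% interval from Delta(NLL) < 0.5 around the minimum.
inside = mus[vals - vals.min() < 0.5]
print(f"combined best-fit mu = {mu_hat:.2f}, "
      f"68% interval ~ [{inside.min():.2f}, {inside.max():.2f}]")
```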
  7. Abstract

    A search is reported for charge-parity ($CP$) violation in $\mathrm{D}^0 \rightarrow \mathrm{K}_{\mathrm{S}}^0 \mathrm{K}_{\mathrm{S}}^0$ decays, using data collected in proton–proton collisions at $\sqrt{s} = 13$ TeV recorded by the CMS experiment in 2018. The analysis uses a dedicated data set that corresponds to an integrated luminosity of 41.6 fb$^{-1}$, which consists of about 10 billion events containing a pair of b hadrons, nearly all of which decay to charm hadrons. The flavor of the neutral D meson is determined by the pion charge in the reconstructed decays $\mathrm{D}^{*+} \rightarrow \mathrm{D}^0 \pi^+$ and $\mathrm{D}^{*-} \rightarrow \overline{\mathrm{D}}{}^0 \pi^-$. The $CP$ asymmetry in $\mathrm{D}^0 \rightarrow \mathrm{K}_{\mathrm{S}}^0 \mathrm{K}_{\mathrm{S}}^0$ is measured to be $A_{CP}(\mathrm{K}_{\mathrm{S}}^0 \mathrm{K}_{\mathrm{S}}^0) = (6.2 \pm 3.0 \pm 0.2 \pm 0.8)\%$, where the three uncertainties represent the statistical uncertainty, the systematic uncertainty, and the uncertainty in the measurement of the $CP$ asymmetry in the $\mathrm{D}^0 \rightarrow \mathrm{K}_{\mathrm{S}}^0 \pi^+ \pi^-$ decay. This is the first $CP$ asymmetry measurement by CMS in the charm sector, as well as the first to utilize a fully hadronic final state.

     
    Free, publicly-accessible full text available December 1, 2025
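An asymmetry of the kind measured above is, at its core, a normalized yield difference between the two tagged flavors, $A = (N^+ - N^-)/(N^+ + N^-)$. A minimal sketch with invented tagged yields (the real measurement also corrects for production and detection asymmetries, hence its additional uncertainty terms):

```python
import math

# Raw asymmetry from tagged yields with its binomial statistical
# uncertainty. The yields are toy numbers, not the paper's.
n_plus, n_minus = 530, 470
total = n_plus + n_minus
a_raw = (n_plus - n_minus) / total
sigma = math.sqrt((1 - a_raw**2) / total)  # binomial error on an asymmetry
print(f"A_raw = {a_raw:.3f} +/- {sigma:.3f}")
```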
  8. Abstract

    The CERN LHC provided proton and heavy ion collisions during its Run 2 operation period from 2015 to 2018. Proton-proton collisions reached a peak instantaneous luminosity of $2.1 \times 10^{34}\,\mathrm{cm}^{-2}\,\mathrm{s}^{-1}$, twice the initial design value, at $\sqrt{s} = 13$ TeV. The CMS experiment records a subset of the collisions for further processing as part of its online selection of data for physics analyses, using a two-level trigger system: the Level-1 trigger, implemented in custom-designed electronics, and the high-level trigger, a streamlined version of the offline reconstruction software running on a large computer farm. This paper presents the performance of the CMS high-level trigger system during LHC Run 2 for physics objects, such as leptons, jets, and missing transverse momentum, which meet the broad needs of the CMS physics program and the challenge of the evolving LHC and detector conditions. Sophisticated algorithms that were originally used in offline reconstruction were deployed online. Highlights include a machine-learning b tagging algorithm and a reconstruction algorithm for tau leptons that decay hadronically.

     
    Free, publicly-accessible full text available November 1, 2025
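The two-level rate reduction described above can be caricatured as two successive selections: a coarse, cheap cut followed by a tighter, more refined one. The thresholds and the event model below are illustrative assumptions, not CMS trigger menus.

```python
import random

# Toy two-stage selection: a coarse "Level-1" style cut, then a tighter
# "high-level" cut. All thresholds and distributions are toy choices.
random.seed(1)
events = [
    {"pt": random.expovariate(1 / 20.0),  # jet-pT-like variable, mean 20
     "iso": random.random()}              # isolation-like variable in [0, 1)
    for _ in range(100_000)
]

l1_pass = [e for e in events if e["pt"] > 40.0]                      # cheap cut
hlt_pass = [e for e in l1_pass if e["pt"] > 50.0 and e["iso"] < 0.2]  # refined

print(len(events), "->", len(l1_pass), "->", len(hlt_pass))
```

Each stage only sees what the previous one accepted, which is why the refined (and slower) selection can afford to run detailed reconstruction.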
  9. Free, publicly-accessible full text available November 1, 2025