During high-speed rear impacts with delta-V > 25 km/h, the front seats may rotate rearward due to the momentum change of the occupant and seat, possibly leading to large seat deflection. One way to limit this is to introduce a structure that restricts large rotations or deformations; however, such a structure would change the kinematics and kinetics of the front seat occupant. The goal of this study was to understand the influence of seat back restriction on the head, neck, and torso responses of front seat occupants subjected to a moderate-speed rear impact. A rear impact scenario with a delta-V of 37.4 km/h was simulated in LS-DYNA using the GHBMC M50 occupant model and a manufacturer-provided seat model. The study had two parts: the first identified worst-case scenarios using the simplified GHBMC M50-OS, and the second investigated those scenarios further using the detailed GHBMC M50-O. The baseline condition consisted of running the belted GHBMC on the seat at the specified pulse. This was followed by adding a seatback constraint, a restriction bar, 65 mm behind the seat back to restrict rearward movement. Four scenarios were investigated with the GHBMC M50-OS, each with and without the restriction bar: occupant seated normally; occupant offset on the seat; occupant rotated on the seat; and occupant seated normally but with a slightly oblique rear impact direction. The oblique condition was identified as the worst case based on the intervertebral kinematics and was therefore investigated further in simulations with the GHBMC M50-O. In the oblique rear impact scenario, the head missed the head restraint, leading to intervertebral rotations exceeding the physiological range of motion regardless of the restriction bar.
However, adding a restriction bar behind the seat back resulted in higher HIC and BrIC values in both the normal and oblique pulses due to the sudden stop, although the magnitudes remained below the injury thresholds.
Abstract Computing demands for large scientific experiments, such as the CMS experiment at the CERN LHC, will increase dramatically in the next decades. To complement the future performance increases of software running on central processing units (CPUs), explorations of coprocessor usage in data processing hold great potential and interest. Coprocessors are a class of computer processors that supplement CPUs, often improving the execution of certain functions due to architectural design choices. We explore the approach of Services for Optimized Network Inference on Coprocessors (SONIC) and study the deployment of this as-a-service approach in large-scale data processing. In the studies, we take a data processing workflow of the CMS experiment and run the main workflow on CPUs, while offloading several machine learning (ML) inference tasks onto either remote or local coprocessors, specifically graphics processing units (GPUs). With experiments performed at Google Cloud, the Purdue Tier-2 computing center, and combinations of the two, we demonstrate the acceleration of these ML algorithms individually on coprocessors and the corresponding throughput improvement for the entire workflow. This approach can be easily generalized to different types of coprocessors and deployed on local CPUs without decreasing the throughput performance. We emphasize that the SONIC approach enables high coprocessor utilization and the portability to run workflows on different types of coprocessors.
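The as-a-service idea in this abstract — the CPU workflow treats ML inference as a service call, so the coprocessor behind it can be swapped without touching the event-processing code — can be sketched generically. This is an illustrative sketch only: the names, the trivial stand-in model, and the fallback logic are hypothetical and are not taken from SONIC or CMSSW (the real system uses a gRPC inference server protocol).

```python
# Hypothetical sketch of the "inference as a service" pattern: the main
# workflow calls one client interface, and the backend behind it may be a
# remote coprocessor service or a local CPU fallback.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class InferenceBackend:
    name: str
    infer: Callable[[List[float]], List[float]]

def remote_gpu_infer(features: List[float]) -> List[float]:
    # Stand-in for a network call to a GPU inference server.
    return [2.0 * x for x in features]

def local_cpu_infer(features: List[float]) -> List[float]:
    # The same model evaluated locally on the CPU (the fallback path).
    return [2.0 * x for x in features]

def make_client(prefer_remote: bool, remote_available: bool) -> InferenceBackend:
    """Choose a backend once; the event loop never changes."""
    if prefer_remote and remote_available:
        return InferenceBackend("remote-gpu", remote_gpu_infer)
    return InferenceBackend("local-cpu", local_cpu_infer)

def process_event(client: InferenceBackend, features: List[float]) -> List[float]:
    # The CPU workflow only sees the service interface, which is what makes
    # the approach portable across coprocessor types (GPU, FPGA, ...).
    return client.infer(features)

client = make_client(prefer_remote=True, remote_available=False)
result = process_event(client, [1.0, 2.0])
```

Because both backends implement the same interface, falling back to local CPUs (as the abstract notes) changes only the client construction, not the workflow.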
Abstract This paper describes the Combine software package used for statistical analyses by the CMS Collaboration. The package, originally designed to perform searches for a Higgs boson and the combined analysis of those searches, has evolved to become the statistical analysis tool presently used in the majority of measurements and searches performed by the CMS Collaboration. It is not specific to the CMS experiment, and this paper is intended to serve as a reference for users outside of the CMS Collaboration, providing an outline of the most salient features and capabilities. Readers are provided with the possibility to run Combine and reproduce the examples in this paper using a publicly available container image. Since the package is constantly evolving to meet the demands of ever-increasing data sets and analysis sophistication, this paper cannot cover all details of Combine. However, the online documentation referenced within this paper provides an up-to-date and complete user guide.
Abstract The CERN LHC provided proton and heavy ion collisions during its Run 2 operation period from 2015 to 2018. Proton-proton collisions reached a peak instantaneous luminosity of 2.1 × 10^34 cm^-2 s^-1, twice the initial design value, at √s = 13 TeV. The CMS experiment records a subset of the collisions for further processing as part of its online selection of data for physics analyses, using a two-level trigger system: the Level-1 trigger, implemented in custom-designed electronics, and the high-level trigger, a streamlined version of the offline reconstruction software running on a large computer farm. This paper presents the performance of the CMS high-level trigger system during LHC Run 2 for physics objects, such as leptons, jets, and missing transverse momentum, which meet the broad needs of the CMS physics program and the challenge of the evolving LHC and detector conditions. Sophisticated algorithms that were originally used in offline reconstruction were deployed online. Highlights include a machine-learning b tagging algorithm and a reconstruction algorithm for tau leptons that decay hadronically.
The first observation of the concurrent production of two J/ψ mesons in proton-nucleus collisions is presented. The analysis is based on a proton-lead (pPb) data sample recorded at a nucleon-nucleon center-of-mass energy of 8.16 TeV by the CMS experiment at the CERN LHC. The two J/ψ mesons are reconstructed in their dimuon decay channels. Events where one of the J/ψ mesons is reconstructed in the dielectron channel are also considered in the search. The process is observed with a significance of 5.3 standard deviations. The inclusive fiducial cross section is measured using the four-muon channel alone. A fit of the data to the expected rapidity separation for pairs of J/ψ mesons produced in single (SPS) and double (DPS) parton scatterings yields the corresponding cross sections. The latter result can be transformed into a lower bound, at 95% confidence level, on the effective DPS cross section, which is closely related to the squared average interparton transverse separation in the collision.
© 2024 CERN, for the CMS Collaboration.
The first search for soft unclustered energy patterns (SUEPs) is performed using proton-proton collision data collected in 2016-2018 by the CMS detector at the LHC. Such SUEPs are predicted by hidden valley models with a new, confining force with a large 't Hooft coupling. In events with boosted topologies, selected by high-threshold hadronic triggers, the multiplicity and sphericity of clustered tracks are used to reject the background from standard model quantum chromodynamics. With no observed excess of events over the standard model expectation, limits are set on the cross section for production via gluon fusion of a scalar mediator with SUEP-like decays.
© 2024 CERN, for the CMS Collaboration.
Abstract A comprehensive study of the local and nonlocal amplitudes contributing to the decay B0 → K*0(→ K+π−)μ+μ− is performed by analysing the phase-space distribution of the decay products. The analysis is based on pp collision data corresponding to an integrated luminosity of 8.4 fb^-1 collected by the LHCb experiment. This measurement employs for the first time a model of both one-particle and two-particle nonlocal amplitudes, and utilises the complete dimuon mass spectrum without any veto regions around the narrow charmonium resonances. In this way it is possible to explicitly isolate the local and nonlocal contributions and capture the interference between them. The results show that interference with nonlocal contributions, although larger than predicted, only has a minor impact on the Wilson Coefficients determined from the fit to the data. For the local contributions, the Wilson Coefficient $\mathcal{C}_9$, responsible for vector dimuon currents, exhibits a 2.1σ deviation from the Standard Model expectation. The Wilson Coefficients $\mathcal{C}_{10}$, $\mathcal{C}_9^{\prime}$ and $\mathcal{C}_{10}^{\prime}$ are all in better agreement than $\mathcal{C}_9$ with the Standard Model, and the global significance is at the level of 1.5σ. The model used also accounts for nonlocal contributions from B0 → K*0[τ+τ− → μ+μ−] rescattering, resulting in the first direct measurement of the bsττ vector effective coupling $\mathcal{C}_{9\tau}$.
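For context, the Wilson coefficients named in this abstract are the couplings of the standard effective Hamiltonian for b → sℓ+ℓ− transitions; a conventional textbook form (not taken from the paper itself, which should be consulted for its exact conventions) is:

```latex
\mathcal{H}_{\mathrm{eff}} \supset -\frac{4 G_F}{\sqrt{2}}\, V_{tb} V_{ts}^{*}\,
  \frac{\alpha_e}{4\pi} \sum_i \mathcal{C}_i \mathcal{O}_i,
\qquad
\mathcal{O}_9 = (\bar{s}\gamma_\mu P_L b)(\bar{\ell}\gamma^\mu \ell),
\quad
\mathcal{O}_{10} = (\bar{s}\gamma_\mu P_L b)(\bar{\ell}\gamma^\mu \gamma_5 \ell).
```

Here $\mathcal{O}_9$ is the vector dimuon current the abstract refers to, $\mathcal{O}_{10}$ its axial-vector counterpart, and the primed coefficients multiply the analogous operators with right-handed quark chirality ($P_R$ in place of $P_L$).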