Search for: All records

Creators/Authors contains: "Myers, LC"


  1. Abstract We present the first parton-level study of anomalous effects in triboson production in both fully and semi-leptonic channels in proton-proton collisions at 13 TeV at the Large Hadron Collider (LHC). The sensitivity to anomalies induced by a minimal set of bosonic dimension-6 operators from the Warsaw basis is evaluated with specific analyses for each final state. A likelihood-based strategy is employed to assess the most sensitive kinematic observables per channel, where the contribution of Effective Field Theory operators is parameterized at either the linear or quadratic level. The impact of the mutual interference terms of pairs of operators on the sensitivity is also examined. This benchmark study explores the complementarity and overlap in sensitivity between different triboson measurements and paves the way for future analyses at the LHC experiments. The statistical combination of the considered final states allows setting stringent bounds on five bosonic Wilson coefficients.
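A short numerical sketch may help make the linear/quadratic parameterization above concrete. Under the standard EFT expansion, the expected yield in a kinematic bin is a polynomial in the Wilson coefficients, N(c) = N_SM + Σ_i c_i N_i(lin) + Σ_ij c_i c_j N_ij(quad), where the off-diagonal quadratic terms encode the mutual interference of operator pairs discussed above. The numbers, two-coefficient setup, and single-bin Poisson likelihood below are illustrative placeholders, not values or choices from the paper.

```python
import numpy as np

# Illustrative yields in one kinematic bin (placeholder numbers, not from the paper).
n_sm = 100.0                       # SM-only expectation
n_lin = np.array([8.0, -3.0])      # linear (SM x dim-6 interference) terms for c1, c2
n_quad = np.array([[2.0, 0.5],     # quadratic terms; the off-diagonal entries are the
                   [0.5, 1.0]])    # mutual interference of the two operators

def predicted_yield(c, linear_only=False):
    """Yield as a polynomial in the Wilson coefficients c = (c1, c2)."""
    c = np.asarray(c, dtype=float)
    n = n_sm + n_lin @ c
    if not linear_only:
        n += c @ n_quad @ c
    return n

def delta_nll(c, observed=100.0, linear_only=False):
    """Poisson negative log-likelihood difference with respect to the SM point."""
    def nll(mu):
        return mu - observed * np.log(mu)
    return nll(predicted_yield(c, linear_only)) - nll(n_sm)

# 1D likelihood scan of c1 with c2 fixed to 0, at linear and quadratic order.
for c1 in (-1.0, 0.0, 1.0):
    print(c1, delta_nll([c1, 0.0], linear_only=True), delta_nll([c1, 0.0]))
```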
  2. Abstract The CMS detector is a general-purpose apparatus that detects high-energy collisions produced at the LHC. Online data quality monitoring of the CMS electromagnetic calorimeter is a vital operational tool that allows detector experts to quickly identify, localize, and diagnose a broad range of detector issues that could affect the quality of physics data. A real-time autoencoder-based anomaly detection system using semi-supervised machine learning is presented, enabling the detection of anomalies in the CMS electromagnetic calorimeter data. A novel method is introduced which maximizes the anomaly detection performance by exploiting the time-dependent evolution of anomalies as well as spatial variations in the detector response. The autoencoder-based system is able to efficiently detect anomalies, while maintaining a very low false discovery rate. The performance of the system is validated with anomalies found in 2018 and 2022 LHC collision data. In addition, the first results from deploying the autoencoder-based system in the CMS online data quality monitoring workflow during the beginning of Run 3 of the LHC are presented, showing its ability to detect issues missed by the existing system.
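As a rough illustration of how a semi-supervised, reconstruction-error-based monitor of the kind described above can work, the sketch below trains a small dense autoencoder on "healthy" occupancy-like maps only and flags test maps whose reconstruction error exceeds a high quantile of the error observed on good data. The map size, network architecture, toy data, and threshold are assumptions for illustration; they do not reproduce the CMS system.

```python
import torch
from torch import nn

torch.manual_seed(0)

# Toy stand-ins for calorimeter occupancy maps (e.g. 16x16 towers, flattened).
n_pix = 16 * 16
good_maps = torch.rand(2000, n_pix)                 # "healthy" training maps
test_maps = torch.rand(100, n_pix)
test_maps[:10, :50] = 0.0                           # simulate dead-region anomalies

model = nn.Sequential(                              # small dense autoencoder
    nn.Linear(n_pix, 64), nn.ReLU(),
    nn.Linear(64, 16), nn.ReLU(),                   # bottleneck
    nn.Linear(16, 64), nn.ReLU(),
    nn.Linear(64, n_pix), nn.Sigmoid(),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(20):                             # semi-supervised: trained on good data only
    opt.zero_grad()
    loss = loss_fn(model(good_maps), good_maps)
    loss.backward()
    opt.step()

with torch.no_grad():
    def per_map_error(x):
        return ((model(x) - x) ** 2).mean(dim=1)
    # Threshold set as a high quantile of the error on good maps,
    # keeping the false discovery rate on healthy data low.
    threshold = per_map_error(good_maps).quantile(0.999)
    flagged = per_map_error(test_maps) > threshold
    print("flagged maps:", int(flagged.sum()))
```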
  3. Abstract A search for beyond-the-standard-model neutral Higgs bosons decaying to a pair of bottom quarks, and produced in association with at least one additional bottom quark, is performed with the CMS detector. The data were recorded in proton-proton collisions at a centre-of-mass energy of 13 TeV at the CERN LHC and correspond to an integrated luminosity of 36.7–126.9 fb⁻¹, depending on the probed mass range. No signal above the standard model background expectation is observed. Upper limits on the production cross section times branching fraction are set for Higgs bosons in the mass range of 125–1800 GeV. The results are interpreted in benchmark scenarios of the minimal supersymmetric standard model, as well as suitable classes of two-Higgs-doublet models.
  4. Abstract A measurement is presented of the primary Lund jet plane (LJP) density in inclusive jet production in proton-proton collisions. The analysis uses 138 fb⁻¹ of data collected by the CMS experiment at √s = 13 TeV. The LJP, a representation of the phase space of emissions inside jets, is constructed using iterative jet declustering. The transverse momentum kT and the splitting angle ΔR of an emission relative to its emitter are measured at each step of the jet declustering process. The average density of emissions as a function of ln(kT/GeV) and ln(R/ΔR) is measured for jets with distance parameters R = 0.4 or 0.8, transverse momentum pT > 700 GeV, and rapidity |y| < 1.7. The jet substructure is measured using the charged-particle tracks of the jet. The measured distributions, unfolded to the level of stable charged particles, are compared with theoretical predictions from simulations and with perturbative quantum chromodynamics calculations. Due to the ability of the LJP to factorize physical effects, these measurements can be used to improve different aspects of the physics modeling in event generators.
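The declustering procedure described in the abstract above can be sketched as follows, assuming the jet has already been reclustered with the Cambridge/Aachen algorithm (e.g. via fastjet) into a binary tree whose nodes carry (pT, y, φ). The Node class and the hand-built example tree are hypothetical stand-ins; the loop follows the harder (primary) branch and records one (ln(R/ΔR), ln(kT/GeV)) point per splitting.

```python
import math
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Node:
    """One node of a Cambridge/Aachen clustering tree (hypothetical stand-in)."""
    pt: float            # transverse momentum [GeV]
    y: float             # rapidity
    phi: float           # azimuthal angle [rad]
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def delta_r(a: Node, b: Node) -> float:
    dphi = math.remainder(a.phi - b.phi, 2.0 * math.pi)   # wrap to [-pi, pi]
    return math.hypot(a.y - b.y, dphi)

def primary_lund_plane(jet: Node, R: float = 0.8) -> List[Tuple[float, float]]:
    """Follow the harder branch at each declustering step and record
    (ln(R/DeltaR), ln(kT/GeV)) for the softer emission."""
    points = []
    node = jet
    while node.left is not None and node.right is not None:
        hard, soft = sorted((node.left, node.right), key=lambda n: n.pt, reverse=True)
        dr = delta_r(hard, soft)
        kt = soft.pt * dr
        points.append((math.log(R / dr), math.log(kt)))
        node = hard          # iterate along the primary (harder) branch
    return points

# Tiny hand-built tree: a jet with two successive emissions off the hard core.
core = Node(pt=600.0, y=0.10, phi=1.00)
emission2 = Node(pt=20.0, y=0.15, phi=1.05)
step1 = Node(pt=620.0, y=0.102, phi=1.002, left=core, right=emission2)
emission1 = Node(pt=80.0, y=0.40, phi=1.30)
jet = Node(pt=700.0, y=0.14, phi=1.04, left=step1, right=emission1)

print(primary_lund_plane(jet))
```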
  5. Abstract A search for new physics in top quark production with additional final-state leptons is performed using data collected by the CMS experiment in proton-proton collisions at √s = 13 TeV at the LHC during 2016–2018. The data set corresponds to an integrated luminosity of 138 fb⁻¹. Using the framework of effective field theory (EFT), potential new physics effects are parametrized in terms of 26 dimension-six EFT operators. The impacts of EFT operators are incorporated through the event-level reweighting of Monte Carlo simulations, which allows for detector-level predictions. The events are divided into several categories based on lepton multiplicity, total lepton charge, jet multiplicity, and b-tagged jet multiplicity. Kinematic variables corresponding to the transverse momentum (pT) of the leading pair of leptons and/or jets as well as the pT of on-shell Z bosons are used to extract the 95% confidence intervals of the 26 Wilson coefficients corresponding to these EFT operators. No significant deviation with respect to the standard model prediction is found.
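The event-level reweighting mentioned above exploits the fact that, for a single Wilson coefficient c, each event weight is a quadratic polynomial w(c) = s0 + s1·c + s2·c². The sketch below recovers the three per-event structure constants from weights evaluated at three reference points and then reweights the events to an arbitrary c; the reference points and weights are placeholder numbers, not values from the analysis.

```python
import numpy as np

# Weights of a few simulated events evaluated at three reference values of one
# Wilson coefficient (placeholder numbers for illustration).
ref_points = np.array([0.0, 1.0, 2.0])
ref_weights = np.array([
    [1.00, 1.30, 1.80],   # event 1: w(0), w(1), w(2)
    [0.90, 0.85, 1.00],   # event 2
    [1.10, 1.60, 2.50],   # event 3
])

# w(c) = s0 + s1*c + s2*c^2  ->  solve a 3x3 Vandermonde system per event.
vandermonde = np.vander(ref_points, N=3, increasing=True)    # columns: 1, c, c^2
structure = np.linalg.solve(vandermonde, ref_weights.T).T    # shape (n_events, 3)

def reweight(c):
    """Per-event weights at an arbitrary value of the Wilson coefficient."""
    return structure @ np.array([1.0, c, c**2])

for c in (0.0, 0.5, 1.5):
    w = reweight(c)
    print(f"c = {c}: weights = {np.round(w, 3)}, predicted yield = {w.sum():.3f}")
```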
  6. A search for electroweak production of a single vectorlike T quark in association with a bottom (b) quark in the all-hadronic decay channel is presented. This search uses proton-proton collision data at √s = 13 TeV collected by the CMS experiment at the CERN LHC during 2016–2018, corresponding to an integrated luminosity of 138 fb⁻¹. The T quark is assumed to have charge 2/3 and decay to a top (t) quark and a Higgs (H) or Z boson. Hadronic decays of the t quark and the H or Z boson are reconstructed from the kinematic properties of jets, including those containing b hadrons. No deviation from the standard model prediction is observed in the reconstructed tH and tZ invariant mass distributions. The 95% confidence level upper limits on the product of the production cross section and branching fraction of a T quark produced in association with a b quark and decaying via tH or tZ range from 1260 to 68 fb for T quark masses of 600–1200 GeV. © 2024 CERN, for the CMS Collaboration.
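The tH and tZ candidates in a search of this kind are formed by summing reconstructed jet four-momenta, with the invariant mass given by m² = (ΣE)² − |Σp|². The sketch below evaluates this for two jets specified as (pT, η, φ, m); the kinematic values are placeholders, not taken from the analysis.

```python
import math

def four_vector(pt, eta, phi, mass):
    """Convert (pT, eta, phi, m) to Cartesian components (E, px, py, pz)."""
    px = pt * math.cos(phi)
    py = pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    e = math.sqrt(px**2 + py**2 + pz**2 + mass**2)
    return e, px, py, pz

def invariant_mass(*vectors):
    """m^2 = (sum E)^2 - |sum p|^2 for any number of four-vectors."""
    e, px, py, pz = (sum(v[i] for v in vectors) for i in range(4))
    return math.sqrt(max(e**2 - px**2 - py**2 - pz**2, 0.0))

# Placeholder kinematics: a t-tagged large-radius jet and an H-tagged jet.
t_jet = four_vector(pt=450.0, eta=0.3, phi=0.1, mass=172.5)
h_jet = four_vector(pt=400.0, eta=-0.2, phi=3.0, mass=125.0)

print(f"reconstructed m(tH) = {invariant_mass(t_jet, h_jet):.1f} GeV")
```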
  7. De Vita, R; Espinal, X; Laycock, P; Shadura, O (Eds.)
    The large data volumes expected from the High Luminosity LHC (HL-LHC) present challenges to existing paradigms and facilities for end-user data analysis. Modern cyberinfrastructure tools provide a diverse set of services that can be composed into a system that gives physicists straightforward access to large computing resources, with low barriers to entry. The Coffea-Casa analysis facility (AF) provides an environment for end users enabling the execution of increasingly complex analyses such as those demonstrated by the Analysis Grand Challenge (AGC) and capturing the features that physicists will need for the HL-LHC. We describe the development progress of the Coffea-Casa facility featuring its modularity while demonstrating the ability to port and customize the facility software stack to other locations. The facility also facilitates the support of batch systems while staying Kubernetes-native. We present the evolved architecture of the facility, such as the integration of advanced data delivery services (e.g. ServiceX) and making data caching services (e.g. XCache) available to end users of the facility. We also highlight the composability of modern cyberinfrastructure tools. To enable machine learning pipelines at Coffea-Casa analysis facilities, a set of industry ML solutions adopted for HEP columnar analysis was integrated on top of existing facility services. These services also feature transparent access for user workflows to GPUs available at a facility via inference servers, while using Kubernetes as the enabling technology.
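As an illustration of the kind of end-user workload such a facility is built to host, the sketch below shows a minimal columnar analysis written against the coffea 0.7-style processor API and executed through a Dask client. The scheduler address, dataset name, and XCache-prefixed file path are placeholders standing in for whatever endpoints a given facility exposes; they are not actual Coffea-Casa services.

```python
import awkward as ak
import hist
from coffea import processor
from coffea.nanoevents import NanoAODSchema
from distributed import Client

class MuonPtProcessor(processor.ProcessorABC):
    """Fill one histogram of muon pT from NanoAOD events."""
    def process(self, events):
        h = hist.Hist.new.Reg(60, 0, 300, name="pt", label="Muon pT [GeV]").Double()
        h.fill(pt=ak.flatten(events.Muon.pt))
        return {events.metadata["dataset"]: {"muon_pt": h}}

    def postprocess(self, accumulator):
        return accumulator

if __name__ == "__main__":
    # Placeholder scheduler address and file list; a facility would provide these.
    client = Client("tls://dask-scheduler.example.org:8786")
    fileset = {"ttbar": ["root://xcache.example.org//store/mc/placeholder_nanoaod.root"]}

    run = processor.Runner(
        executor=processor.DaskExecutor(client=client),
        schema=NanoAODSchema,
    )
    out = run(fileset, treename="Events", processor_instance=MuonPtProcessor())
    print(out)
```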