-
We introduce the idea of Citizen Scientist Amplification, applying the method to data gathered from the top 10 contributing citizen scientists on the Supernova Hunters project. We take a novel approach that exploits the complementary strengths of deep learning and citizen science, achieving results that are competitive with those of experts.
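The abstract does not spell out how machine and volunteer classifications are combined; the sketch below shows one plausible weighted-vote scheme under that assumption. All names (`amplified_score`, `cnn_score`, `votes`, `skill`) are invented for illustration and are not the authors' method.

```python
# Hedged sketch: blending a CNN score with citizen-scientist votes weighted by
# each volunteer's skill. The Supernova Hunters paper does not prescribe this
# exact scheme; it is only meant to illustrate the idea of "amplification".
import numpy as np

def amplified_score(cnn_score, votes, skill):
    """Blend a CNN probability with volunteer votes weighted by volunteer skill.

    cnn_score : float in [0, 1], CNN probability that the candidate is real.
    votes     : array of 0/1 volunteer classifications for the candidate.
    skill     : array of per-volunteer weights in [0, 1] (e.g. past accuracy).
    """
    votes = np.asarray(votes, dtype=float)
    skill = np.asarray(skill, dtype=float)
    volunteer_score = np.sum(skill * votes) / np.sum(skill)  # weighted vote fraction
    # Simple average of the machine score and the skill-weighted human score.
    return 0.5 * cnn_score + 0.5 * volunteer_score

# Example: a borderline CNN score lifted by confident, skilled volunteers.
print(amplified_score(0.55, votes=[1, 1, 0, 1], skill=[0.9, 0.8, 0.6, 0.95]))
```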
-
In 2017, the Muon Hunter project on the Zooniverse.org citizen science platform successfully gathered more than two million classification labels for nearly 140,000 camera images from VERITAS. The aim was to select and parameterize muon events for use in training convolutional neural networks. The success of this project proved that crowdsourcing labels for IACT image analysis is a viable avenue for further development of advanced machine-learning algorithms. These algorithms could potentially lend themselves to improving class separation between gamma-ray and hadronic event types. Nonetheless, it took two months to gather these labels from volunteers, which could be a bottleneck for future applications of this method. Here we present Muon Hunters 2.0: the follow-on project that demonstrates the development of unsupervised clustering techniques to gather muon labels more efficiently from volunteer classifiers.
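The abstract does not describe the clustering pipeline in detail; the sketch below illustrates the general idea of labeling clusters rather than individual events. The feature choice (PCA), algorithm (k-means), cluster count, and the random stand-in images are all assumptions, not the Muon Hunters 2.0 implementation.

```python
# Hedged sketch: cluster camera images so volunteers label cluster
# representatives instead of every event.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
images = rng.random((1000, 32 * 32))   # stand-in for flattened IACT camera images

features = PCA(n_components=20, random_state=0).fit_transform(images)
labels = KMeans(n_clusters=10, random_state=0, n_init=10).fit_predict(features)

# One representative per cluster (closest to the cluster centroid) is what a
# volunteer would inspect, so ~10 classifications can propagate to 1000 events.
for k in range(10):
    members = np.flatnonzero(labels == k)
    centroid = features[members].mean(axis=0)
    rep = members[np.argmin(np.linalg.norm(features[members] - centroid, axis=1))]
    print(f"cluster {k}: {members.size} events, representative index {rep}")
```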
-
Storm direction modulates a hydrograph's magnitude and duration, thus having a potentially large effect on local flood risk. However, how changes in the preferential storm direction affect the probability distribution of peak flows remains unknown. We address this question with a novel Monte Carlo approach where stochastically transposed storms drive hydrologic simulations over medium and mesoscale watersheds in the Midwestern United States. Systematic rotations of these watersheds are used to emulate changes in the preferential storm direction. We found that the peak flow distribution impacts are scale‐dependent, with larger changes observed in the mesoscale watershed than in the medium‐scale watershed. We attribute this to the high diversity of storm patterns and the storms' scale relative to watershed size. This study highlights the potential of the proposed stochastic framework to address fundamental questions about hydrologic extremes when our ability to observe these events in nature is hindered by technical constraints and short time records.
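As a rough illustration of the Monte Carlo idea described above (not the study's simulation framework), the sketch below samples storm directions and depths, rotates the watershed, and builds an empirical peak-flow distribution. The response function `toy_peak_flow` and the sampling distributions are placeholders.

```python
# Hedged sketch: stochastic storm transposition + watershed rotation driving a
# toy "hydrologic model", yielding an empirical peak-flow distribution.
import numpy as np

rng = np.random.default_rng(42)

def toy_peak_flow(storm_direction_deg, watershed_rotation_deg, depth_mm):
    # Peak flow grows with rainfall depth and with alignment between the storm
    # motion and the (rotated) main channel axis -- a deliberately crude stand-in.
    alignment = np.cos(np.radians(storm_direction_deg - watershed_rotation_deg))
    return depth_mm * (1.0 + 0.5 * max(alignment, 0.0))

def peak_flow_distribution(watershed_rotation_deg, n_storms=10_000):
    directions = rng.uniform(0.0, 360.0, n_storms)             # transposed storm directions
    depths = rng.gamma(shape=2.0, scale=20.0, size=n_storms)   # storm rainfall depths
    return np.array([toy_peak_flow(d, watershed_rotation_deg, p)
                     for d, p in zip(directions, depths)])

# Compare upper-tail quantiles for two watershed rotations (0 deg vs 45 deg).
for rot in (0.0, 45.0):
    flows = peak_flow_distribution(rot)
    print(f"rotation {rot:>5} deg: 99th-percentile peak flow = {np.quantile(flows, 0.99):.1f}")
```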
-
A search is presented for an extended Higgs sector with two new particles, and , in the process . Novel neural networks classify events with diphotons that are merged and determine the diphoton masses. The search uses LHC proton-proton collision data at collected with the CMS detector, corresponding to an integrated luminosity of . No evidence of such resonances is seen. Upper limits are set on the production cross section for between 300 and 3000 GeV and between 0.5% and 2.5%, representing the most sensitive search in this channel. © 2025 CERN, for the CMS Collaboration.
-
A measurement is performed of Higgs bosons produced with high transverse momentum ($p_\mathrm{T}$) via vector boson or gluon fusion in proton-proton collisions. The result is based on a data set with a center-of-mass energy of 13 TeV collected in 2016–2018 with the CMS detector at the LHC and corresponds to an integrated luminosity of 138 fb$^{-1}$. The decay of a high-$p_\mathrm{T}$ Higgs boson to a boosted bottom quark-antiquark pair is selected using large-radius jets and employing jet substructure and heavy-flavor taggers based on machine learning techniques. Independent regions targeting the vector boson and gluon fusion mechanisms are defined based on the topology of two quark-initiated jets with large pseudorapidity separation. The signal strengths for both processes are extracted simultaneously by performing a maximum likelihood fit to data in the large-radius jet mass distribution. The observed signal strengths relative to the standard model expectation are $4.9_{-1.6}^{+1.9}$ and $1.6_{-1.5}^{+1.7}$ for the vector boson and gluon fusion mechanisms, respectively. A differential cross section measurement is also reported in the simplified template cross section framework.
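To illustrate what a simultaneous extraction of two signal strengths from a binned distribution looks like, here is a minimal toy sketch. The templates, yields, and names (`mu_vbf`, `mu_ggf`) are invented; the CMS analysis uses its own statistical tooling, not this script.

```python
# Hedged sketch: binned Poisson maximum likelihood fit of two signal strengths
# to a toy jet-mass distribution.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

bins = 20
rng = np.random.default_rng(1)
bkg = 1000.0 * np.exp(-np.linspace(0, 3, bins))                     # falling background template
vbf = 30.0 * np.exp(-0.5 * ((np.arange(bins) - 10) / 2.0) ** 2)     # Higgs peak template (VBF)
ggf = 50.0 * np.exp(-0.5 * ((np.arange(bins) - 10) / 2.0) ** 2)     # Higgs peak template (ggF)

data = rng.poisson(bkg + 1.0 * vbf + 1.0 * ggf)                     # pseudo-data generated at mu = 1

def nll(mu):
    mu_vbf, mu_ggf = mu
    expected = bkg + mu_vbf * vbf + mu_ggf * ggf
    return -np.sum(poisson.logpmf(data, expected))                  # negative log-likelihood

fit = minimize(nll, x0=[1.0, 1.0], method="Nelder-Mead")
print("fitted signal strengths (mu_vbf, mu_ggf):", fit.x)
```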
-
Computing demands for large scientific experiments, such as the CMS experiment at the CERN LHC, will increase dramatically in the next decades. To complement the future performance increases of software running on central processing units (CPUs), explorations of coprocessor usage in data processing hold great potential and interest. Coprocessors are a class of computer processors that supplement CPUs, often improving the execution of certain functions due to architectural design choices. We explore the approach of Services for Optimized Network Inference on Coprocessors (SONIC) and study the deployment of this as-a-service approach in large-scale data processing. In the studies, we take a data processing workflow of the CMS experiment and run the main workflow on CPUs, while offloading several machine learning (ML) inference tasks onto either remote or local coprocessors, specifically graphics processing units (GPUs). With experiments performed at Google Cloud, the Purdue Tier-2 computing center, and combinations of the two, we demonstrate the acceleration of these ML algorithms individually on coprocessors and the corresponding throughput improvement for the entire workflow. This approach can be easily generalized to different types of coprocessors and deployed on local CPUs without decreasing the throughput performance. We emphasize that the SONIC approach enables high coprocessor utilization and makes workflows portable across different types of coprocessors.
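For flavor, the sketch below shows a standalone inference-as-a-service call from a CPU-side client to a GPU-backed server, in the spirit of SONIC. It assumes an NVIDIA Triton Inference Server (which, to our understanding, SONIC builds on) with made-up model and tensor names; it is not the CMSSW integration described in the paper.

```python
# Hedged sketch: offloading ML inference to a remote server via the Triton
# Python client. Server address, model name, and tensor names are assumptions.
import numpy as np
import tritonclient.grpc as grpcclient   # pip install tritonclient[grpc]

client = grpcclient.InferenceServerClient(url="localhost:8001")   # assumed server address

batch = np.random.rand(16, 100).astype(np.float32)                # stand-in event features

inp = grpcclient.InferInput("INPUT__0", list(batch.shape), "FP32")  # hypothetical tensor name
inp.set_data_from_numpy(batch)
out = grpcclient.InferRequestedOutput("OUTPUT__0")                  # hypothetical output name

# The CPU-side workflow blocks only for this remote call; the GPU behind the
# server performs the ML inference and can be shared by many clients.
result = client.infer(model_name="demo_model", inputs=[inp], outputs=[out])
scores = result.as_numpy("OUTPUT__0")
print(scores.shape)
```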
-
A search for Higgs boson pair (HH) production in association with a vector boson V (W or Z boson) is presented. The search is based on proton-proton collision data at a center-of-mass energy of 13 TeV, collected with the CMS detector at the LHC, corresponding to an integrated luminosity of 138 fb$^{-1}$. Both hadronic and leptonic decays of V bosons are used. The leptons considered are electrons, muons, and neutrinos. The HH production is searched for in the $\mathrm{b\overline{b}b\overline{b}}$ decay channel. An observed (expected) upper limit at 95% confidence level on the VHH production cross section is set at 294 (124) times the standard model prediction. Constraints are also set on the modifiers of the Higgs boson trilinear self-coupling, $k_\lambda$, assuming $k_{2V} = 1$, and vice versa on the coupling of two Higgs bosons with two vector bosons, $k_{2V}$. The observed (expected) 95% confidence intervals of these coupling modifiers are $-37.7 < k_\lambda < 37.2$ ($-30.1 < k_\lambda < 28.9$) and $-12.2 < k_{2V} < 13.5$ ($-7.2 < k_{2V} < 8.9$), respectively.