-
Ivey, C; Hobbs, S; Patterson, R; Rice-Boayue, J (Ed.)
The pressing issue of pesticide exposure disproportionately affecting marginalized communities underscores the immediate necessity of tackling pesticide drift from nearby agricultural areas, which is further aggravated by the impacts of climate change. Effective measures, including stricter regulations, enhanced monitoring, alternative agricultural practices, and community engagement, are essential to mitigate environmental injustices and safeguard community health. This article examines the relationship between pesticide transport, groundwater vulnerability, and environmental justice in the context of climate change. Employing a geospatial analytical hierarchy overlay model, we comprehensively assess the impact of pesticide transport on groundwater vulnerability while accounting for climate change and associated environmental justice concerns. Groundwater vulnerability across the Kentucky River Basin varies: 18% is classified as very low, 23% as low, 27% as prone, and 20% and 12% as high and very high, respectively, concentrated mainly in the mid-eastern and southern regions due to population density and biodiversity. The research integrates a robust analytical detection technique, focused on concentrations of glyphosate and its metabolites, to validate and refine the spatial models. By engaging with communities, this study enhances understanding of environmental complexities, offering insights for sustainable environmental management.
Free, publicly-accessible full text available August 2, 2025
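The geospatial overlay described above can be illustrated with a minimal sketch: hypothetical criterion layers are normalized, combined with assumed AHP weights, and binned into the five vulnerability classes. The layer names, weights, and class breakpoints below are illustrative assumptions, not the study's values.

```python
import numpy as np

def ahp_overlay(layers, weights):
    """Weighted linear overlay of normalized raster layers (AHP-style).

    layers: dict of name -> 2D array scaled to [0, 1]
    weights: dict of name -> AHP weight (weights should sum to 1)
    Returns a vulnerability score grid in [0, 1].
    """
    score = np.zeros_like(next(iter(layers.values())), dtype=float)
    for name, grid in layers.items():
        score += weights[name] * grid
    return score

def classify(score):
    # Five classes: very low, low, prone, high, very high
    # (hypothetical equal-width breakpoints)
    bins = [0.2, 0.4, 0.6, 0.8]
    return np.digitize(score, bins)

# Toy 2x2 grid with two hypothetical criteria and assumed weights
layers = {
    "depth_to_water": np.array([[0.1, 0.9], [0.5, 0.3]]),
    "pesticide_load": np.array([[0.2, 0.8], [0.6, 0.4]]),
}
weights = {"depth_to_water": 0.6, "pesticide_load": 0.4}
score = ahp_overlay(layers, weights)
classes = classify(score)
```

In a real application each layer would be a raster derived from field or census data, and the weights would come from pairwise AHP comparisons rather than being assumed.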
-
Free, publicly-accessible full text available July 26, 2025
-
This paper investigates the weaknesses of image watermarking techniques. We present WAVES (Watermark Analysis Via Enhanced Stress-testing), a novel benchmark for assessing watermark robustness that overcomes the limitations of current evaluation methods. WAVES integrates detection and identification tasks and establishes a standardized evaluation protocol comprising a diverse range of stress tests. The attacks in WAVES range from traditional image distortions to advanced and novel variations of adversarial, diffusive, and embedding-based attacks. We introduce a normalized score of attack potency that incorporates several widely used image quality metrics and allows us to produce an ordered ranking of attacks. Our comprehensive evaluation reveals previously undetected vulnerabilities in several modern watermarking algorithms. WAVES is envisioned as a toolkit for the future development of robust watermarking systems.
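The idea of a normalized score over heterogeneous quality metrics can be sketched schematically: each raw metric is mapped onto a common [0, 1] scale and the results are averaged, yielding a single number by which attacks can be ranked. The metric names, ranges, and equal weighting below are illustrative assumptions, not the WAVES definition.

```python
def normalize(value, lo, hi):
    """Map a raw metric into [0, 1], clipping out-of-range values."""
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def potency(metrics, ranges):
    """Average the normalized per-metric values into one score."""
    scores = [normalize(metrics[m], lo, hi) for m, (lo, hi) in ranges.items()]
    return sum(scores) / len(scores)

# Hypothetical per-metric ranges (lo = mild effect, hi = severe effect)
ranges = {"lpips": (0.0, 0.8), "psnr_drop": (0.0, 30.0)}

# Hypothetical measurements for two attacks
attacks = {
    "jpeg":      {"lpips": 0.10, "psnr_drop": 6.0},
    "diffusion": {"lpips": 0.40, "psnr_drop": 15.0},
}
ranking = sorted(attacks, key=lambda a: potency(attacks[a], ranges), reverse=True)
```

The benefit of normalizing first is that metrics with very different units (a perceptual distance versus a PSNR drop in dB) become commensurable before averaging.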
-
Datasets involving multivariate event streams are prevalent in numerous applications. We present clock logic neural networks (CLNN), a novel framework for modeling temporal point processes that learns weighted clock logic (wCL) formulas as interpretable temporal rules by which some events promote or inhibit others. Specifically, CLNN models temporal relations between events using conditional intensity rates informed by a set of wCL formulas, which are more expressive than those in related prior work. Unlike conventional approaches that search for generative rules through expensive combinatorial optimization, we design smooth activation functions for the components of wCL formulas, enabling a continuous relaxation of the discrete search space and efficient learning of wCL formulas with gradient-based methods. Experiments on synthetic datasets demonstrate our model's ability to recover the ground-truth rules and improve computational efficiency. In addition, experiments on real-world datasets show that our models perform competitively with state-of-the-art models.
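The continuous-relaxation idea can be sketched for a single hypothetical temporal predicate: the hard indicator "event B occurs within w time units after event A" (0 ≤ t_b − t_a ≤ w) is replaced by a product of sigmoids, so the window length w becomes a differentiable parameter. This is an illustrative stand-in, not the actual wCL formulation.

```python
import math

def sigmoid(x, beta=10.0):
    """Smooth step: approaches a hard 0/1 indicator as beta grows."""
    return 1.0 / (1.0 + math.exp(-beta * x))

def soft_within_window(t_a, t_b, w, beta=10.0):
    """Smooth relaxation of the discrete predicate 0 <= t_b - t_a <= w.

    A product of two sigmoids replaces the hard indicator, so gradients
    with respect to w (and any upstream parameters) are well defined.
    """
    dt = t_b - t_a
    return sigmoid(dt, beta) * sigmoid(w - dt, beta)

inside = soft_within_window(t_a=1.0, t_b=2.0, w=5.0)   # dt = 1, well inside
outside = soft_within_window(t_a=1.0, t_b=9.0, w=5.0)  # dt = 8, outside
```

As beta increases the relaxation approaches the discrete rule; during training a moderate beta keeps gradients informative across the search space.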
-
Free, publicly-accessible full text available January 1, 2026
-
Abstract A measurement is performed of Higgs bosons produced with high transverse momentum (pT) via vector boson or gluon fusion in proton-proton collisions. The result is based on a data set with a center-of-mass energy of 13 TeV collected in 2016–2018 with the CMS detector at the LHC and corresponds to an integrated luminosity of 138 fb−1. The decay of a high-pT Higgs boson to a boosted bottom quark-antiquark pair is selected using large-radius jets and employing jet substructure and heavy-flavor taggers based on machine learning techniques. Independent regions targeting the vector boson and gluon fusion mechanisms are defined based on the topology of two quark-initiated jets with large pseudorapidity separation. The signal strengths for both processes are extracted simultaneously by performing a maximum likelihood fit to the data in the large-radius jet mass distribution. The observed signal strengths relative to the standard model expectation are $4.9^{+1.9}_{-1.6}$ and $1.6^{+1.7}_{-1.5}$ for the vector boson and gluon fusion mechanisms, respectively. A differential cross section measurement is also reported in the simplified template cross section framework.
Free, publicly-accessible full text available December 1, 2025
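The kind of binned maximum likelihood extraction of a signal strength described above can be sketched with toy numbers; the per-bin templates and observed counts below are invented, not CMS data, and a simple grid scan stands in for a proper minimizer.

```python
import math

sig = [2.0, 5.0, 3.0]     # expected signal yields per mass bin at mu = 1 (toy)
bkg = [50.0, 40.0, 30.0]  # expected background yields per bin (toy)
obs = [54, 50, 36]        # observed counts (toy)

def nll(mu):
    """Binned Poisson negative log-likelihood (constant terms dropped)."""
    total = 0.0
    for s, b, n in zip(sig, bkg, obs):
        lam = mu * s + b          # expected yield at signal strength mu
        total += lam - n * math.log(lam)
    return total

# Scan mu in [0, 5] and take the minimum of the NLL
grid = [i / 1000.0 for i in range(0, 5001)]
mu_hat = min(grid, key=nll)
```

In the real analysis the fit is multidimensional, includes nuisance parameters for systematic uncertainties, and extracts the two signal strengths simultaneously; the sketch only shows the Poisson-likelihood core.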
-
Abstract Computing demands for large scientific experiments, such as the CMS experiment at the CERN LHC, will increase dramatically in the next decades. To complement the future performance increases of software running on central processing units (CPUs), explorations of coprocessor usage in data processing hold great potential and interest. Coprocessors are a class of computer processors that supplement CPUs, often improving the execution of certain functions due to architectural design choices. We explore the approach of Services for Optimized Network Inference on Coprocessors (SONIC) and study the deployment of this as-a-service approach in large-scale data processing. In these studies, we take a data processing workflow of the CMS experiment, run the main workflow on CPUs, and offload several machine learning (ML) inference tasks onto either remote or local coprocessors, specifically graphics processing units (GPUs). With experiments performed at Google Cloud, the Purdue Tier-2 computing center, and combinations of the two, we demonstrate the acceleration of these ML algorithms individually on coprocessors and the corresponding throughput improvement for the entire workflow. The approach can be easily generalized to different types of coprocessors and deployed on local CPUs without decreasing the throughput performance. We emphasize that SONIC enables high coprocessor utilization and allows workflows to be run portably on different types of coprocessors.
Free, publicly-accessible full text available December 1, 2025
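The as-a-service offloading pattern can be illustrated generically: inference requests are submitted asynchronously to a (here simulated) coprocessor server while the CPU continues with the rest of the workflow, so the two kinds of work overlap. The functions below are invented stand-ins; no actual SONIC or inference-server API is used.

```python
import concurrent.futures
import time

def fake_inference(event):
    """Stand-in for a remote inference call (network + GPU latency)."""
    time.sleep(0.01)
    return event * 2   # stand-in for a model prediction

def cpu_work(event):
    """Stand-in for the rest of the CPU-side workflow."""
    return event + 1

events = list(range(8))
with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
    # Offload all inference requests up front ...
    futures = [pool.submit(fake_inference, e) for e in events]
    # ... and keep the CPU busy while they are in flight.
    cpu_out = [cpu_work(e) for e in events]
    results = [f.result() for f in futures]
```

Because the sleep-bound "inference" calls run concurrently with the CPU loop, total wall time approaches the slower of the two phases rather than their sum, which is the throughput argument the abstract makes.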
-
Abstract A search is reported for charge-parity (CP) violation in $D^0 \to K^0_S K^0_S$ decays, using data collected in proton–proton collisions at $\sqrt{s} = 13\,\text{TeV}$ recorded by the CMS experiment in 2018. The analysis uses a dedicated data set that corresponds to an integrated luminosity of 41.6 fb−1, which consists of about 10 billion events containing a pair of b hadrons, nearly all of which decay to charm hadrons. The flavor of the neutral D meson is determined by the pion charge in the reconstructed decays $D^{*+} \to D^0 \pi^+$ and $D^{*-} \to \overline{D}{}^0 \pi^-$. The CP asymmetry in $D^0 \to K^0_S K^0_S$ is measured to be $A_{CP}(K^0_S K^0_S) = (6.2 \pm 3.0 \pm 0.2 \pm 0.8)\%$, where the three uncertainties represent the statistical uncertainty, the systematic uncertainty, and the uncertainty in the measurement of the CP asymmetry in the $D^0 \to K^0_S \pi^+ \pi^-$ decay. This is the first CP asymmetry measurement by CMS in the charm sector, as well as the first to utilize a fully hadronic final state.
Free, publicly-accessible full text available December 1, 2025
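The core asymmetry estimate can be sketched from tagged signal yields, A = (N_D0 − N_D0bar) / (N_D0 + N_D0bar), with its binomial statistical uncertainty. The counts below are invented toy numbers; the published analysis additionally corrects for nuisance (production and detection) asymmetries using the $D^0 \to K^0_S \pi^+ \pi^-$ control channel, which is omitted here.

```python
import math

def raw_asymmetry(n_d0, n_d0bar):
    """Raw asymmetry between flavor-tagged yields and its statistical error.

    Returns (A, sigma_A) where sigma_A is the binomial uncertainty
    sqrt((1 - A^2) / N_total) for uncorrelated counts.
    """
    total = n_d0 + n_d0bar
    a = (n_d0 - n_d0bar) / total
    sigma = math.sqrt((1.0 - a * a) / total)
    return a, sigma

# Toy yields, not the analysis numbers
a, sigma = raw_asymmetry(n_d0=5300, n_d0bar=4700)
```

With 10,000 toy events the statistical error is about 1%, showing why percent-level asymmetries require very large tagged samples.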
-
Abstract This paper describes the Combine software package used for statistical analyses by the CMS Collaboration. The package, originally designed to perform searches for a Higgs boson and the combined analysis of those searches, has evolved to become the statistical analysis tool presently used in the majority of measurements and searches performed by the CMS Collaboration. It is not specific to the CMS experiment, and this paper is intended to serve as a reference for users outside of the CMS Collaboration, providing an outline of the most salient features and capabilities. Readers can run Combine and reproduce the examples provided in this paper using a publicly available container image. Since the package is constantly evolving to meet the demands of ever-increasing data sets and analysis sophistication, this paper cannot cover all details of Combine. However, the online documentation referenced within this paper provides an up-to-date and complete user guide.
Free, publicly-accessible full text available December 1, 2025
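To give a flavor of how Combine is driven in practice, a minimal counting-experiment datacard looks roughly like the following. All yields, the channel name, and the single nuisance parameter are placeholder values sketched from memory of the public examples; the authoritative syntax is the package's own documentation.

```text
imax 1  number of channels
jmax 1  number of backgrounds
kmax 1  number of nuisance parameters
------------
bin         ch1
observation 7
------------
bin         ch1   ch1
process     sig   bkg
process     0     1
rate        1.5   5.0
------------
lumi  lnN   1.025 1.025
```

A datacard like this declares the observed count, the expected signal and background rates, and a log-normal luminosity uncertainty; Combine then builds the likelihood and performs the fit or limit computation.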