


Search for: All records, Creators/Authors contains: "Diaz, D."


  1. This conference paper provides an update on the background, structure, and implementation of the Early Research Scholars Program (ERSP) at the University of Illinois Chicago (UIC). The program was developed at the University of California San Diego and is funded by the National Science Foundation's Improving Undergraduate STEM Education program; it aims to support the retention of students from marginalized backgrounds in computing and in electrical and computer engineering. This paper provides program updates, including data from the 2022-2023 academic year and preliminary results from a reflection study that began in spring 2020. The reflection study examined the impact of ERSP on students' computing and engineering identity development, based on student reflection responses. We also discuss student demographics, retention rates, and changes made to the program's curriculum at UIC. Evaluation results from the last three years of the program are shared as well, showing how students are affected by the program and where it can improve. Preliminary results show that the program has positively influenced students' computing or engineering identity development along at least three identity dimensions: recognition, competence, and community.
  2. We demonstrate the underlying mechanism for one version of quantum-enhanced telescopy, using multiple interconnected Hong-Ou-Mandel interferometers to recover the visibility amplitude of the source of light in the presence of arbitrary turbulence.
  3. We demonstrate the underlying mechanism for quantum-enhanced telescopy, using multiple interconnected Hong-Ou-Mandel interferometers to recover the visibility amplitude and relative phase of the source light entering multiple simulated telescopes.
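    As a rough illustration of the recovery step these two demonstrations describe (not the papers' actual apparatus or analysis), the Python sketch below assumes a toy Hong-Ou-Mandel model in which the coincidence probability at interferometer phase s is p(s) = (1 - |V| cos(s - phi))/2, and recovers the source visibility amplitude |V| and relative phase phi from simulated coincidence counts. The model, parameter values, and variable names are all illustrative assumptions.

    # Toy sketch of visibility recovery from Hong-Ou-Mandel coincidences.
    # Assumed toy model (an illustration, not the papers' apparatus):
    #   p_coinc(s) = 0.5 * (1 - |V| * cos(s - phi))
    import numpy as np

    rng = np.random.default_rng(0)

    V_true, phase_true = 0.7, 0.4          # "unknown" source parameters
    settings = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)  # phase settings
    n_pairs = 20_000                       # photon pairs per setting

    # Simulate coincidence counts at each interferometer phase setting.
    p = 0.5 * (1.0 - V_true * np.cos(settings - phase_true))
    counts = rng.binomial(n_pairs, p)

    # Recover |V| and phi by least squares:
    #   0.5 - p_hat(s) = (0.5*V*cos(phi)) * cos(s) + (0.5*V*sin(phi)) * sin(s)
    p_hat = counts / n_pairs
    A = np.column_stack([np.cos(settings), np.sin(settings)])
    coef, *_ = np.linalg.lstsq(A, 0.5 - p_hat, rcond=None)
    V_est = 2.0 * np.hypot(coef[0], coef[1])
    phase_est = np.arctan2(coef[1], coef[0])
    print(f"|V| ~ {V_est:.3f} (true {V_true}), phase ~ {phase_est:.3f} rad (true {phase_true})")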
  4. Abstract Sea‐level rise and associated flood hazards pose severe risks to the millions of people globally living in coastal zones. Models representing coastal adaptation and impacts are important tools to inform the design of strategies to manage these risks. Representing the often deep uncertainties influencing these risks poses nontrivial challenges. A common uncertainty characterization approach is to use a few benchmark cases to represent the range and relative probabilities of the set of possible outcomes. This has been done in coastal adaptation studies, for example, by using low, moderate, and high percentiles of an input of interest, like sea‐level changes. A key consideration is how this simplified characterization of uncertainty influences the distributions of estimated coastal impacts. Here, we show that using only a few benchmark percentiles to represent uncertainty in future sea‐level change can lead to overconfident projections and underestimate high‐end risks as compared to using full ensembles for sea‐level change and socioeconomic parametric uncertainties. When uncertainty in future sea level is characterized by low, moderate, and high percentiles of global mean sea‐level rise, estimates of high‐end (95th percentile) damages are underestimated by between 18% (SSP1‐2.6) and 46% (SSP5‐8.5). Additionally, using the 5th and 95th percentiles of sea‐level scenarios underestimates the 5%–95% width of the distribution of adaptation costs by a factor ranging from about two to four, depending on SSP‐RCP pathway. The resulting underestimation of the uncertainty range in adaptation costs can bias adaptation and mitigation decision‐making. 
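    A minimal Monte Carlo sketch of the point above, under assumed toy distributions: the lognormal sea-level ensemble, lognormal socioeconomic factor, and quadratic damage function below are illustrative stand-ins, not the study's model. Collapsing sea-level uncertainty to a few benchmark percentiles, with the other input held at its central value, narrows the estimated damage distribution relative to full joint ensemble propagation.

    # Toy comparison: full-ensemble vs. benchmark-percentile uncertainty
    # propagation through a nonlinear impact model. All inputs are assumed.
    import numpy as np

    rng = np.random.default_rng(1)
    N = 200_000

    slr = rng.lognormal(np.log(0.6), 0.4, N)   # assumed SLR ensemble (m)
    exposure = rng.lognormal(0.0, 0.6, N)      # assumed socioeconomic factor

    def damages(s, e):
        return 100.0 * e * s**2                # assumed convex impact model

    # (a) Full propagation: joint sampling of both uncertainties.
    d_full = damages(slr, exposure)

    # (b) Benchmark propagation: deterministic runs at low/high SLR
    # percentiles, with the socioeconomic factor fixed at its median.
    lo, hi = np.percentile(slr, [5, 95])
    e_med = np.median(exposure)
    d_lo, d_hi = damages(lo, e_med), damages(hi, e_med)

    print("high-end (95th pct) damages:", round(np.percentile(d_full, 95), 1),
          "(full) vs", round(d_hi, 1), "(high benchmark)")
    print("5-95% width:", round(np.percentile(d_full, 95) - np.percentile(d_full, 5), 1),
          "(full) vs", round(d_hi - d_lo, 1), "(benchmarks)")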
  5. Free, publicly-accessible full text available January 1, 2026
  6. The international collaboration designing and constructing the Deep Underground Neutrino Experiment (DUNE) at the Long-Baseline Neutrino Facility (LBNF) has developed a two-phase strategy for the implementation of this leading-edge, large-scale science project. The 2023 report of the US Particle Physics Project Prioritization Panel (P5) reaffirmed this vision and strongly endorsed DUNE Phase I and Phase II, as did the European Strategy for Particle Physics. While construction of DUNE Phase I is well underway, this White Paper focuses on DUNE Phase II planning. DUNE Phase II consists of a third and fourth far detector (FD) module, an upgraded near detector complex, and an enhanced 2.1 MW beam. The fourth FD module is conceived as a "Module of Opportunity", aimed at supporting the core DUNE science program while expanding the physics opportunities through more advanced technologies. This document highlights the increased science opportunities offered by the DUNE Phase II near and far detectors, including long-baseline neutrino oscillation physics, neutrino astrophysics, and physics beyond the standard model. It describes the DUNE Phase II near and far detector technologies and detector design concepts that are currently under consideration. A summary of key R&D goals and prototyping phases needed to realize the Phase II detector technical designs is also provided. DUNE's Phase II detectors, along with the increased beam power, will complete the full scope of DUNE, enabling a multi-decadal program of groundbreaking science with neutrinos.
    Free, publicly-accessible full text available December 1, 2025
  7. Abstract A measurement is performed of Higgs bosons produced with high transverse momentum ($p_{\mathrm{T}}$) via vector boson or gluon fusion in proton-proton collisions. The result is based on a data set with a center-of-mass energy of 13 TeV collected in 2016–2018 with the CMS detector at the LHC, corresponding to an integrated luminosity of 138 fb$^{-1}$. The decay of a high-$p_{\mathrm{T}}$ Higgs boson to a boosted bottom quark-antiquark pair is selected using large-radius jets and employing jet substructure and heavy-flavor taggers based on machine learning techniques. Independent regions targeting the vector boson and gluon fusion mechanisms are defined based on the topology of two quark-initiated jets with large pseudorapidity separation. The signal strengths for both processes are extracted simultaneously by performing a maximum likelihood fit to data in the large-radius jet mass distribution. The observed signal strengths relative to the standard model expectation are $4.9^{+1.9}_{-1.6}$ and $1.6^{+1.7}_{-1.5}$ for the vector boson and gluon fusion mechanisms, respectively. A differential cross section measurement is also reported in the simplified template cross section framework.
    Free, publicly-accessible full text available December 1, 2025
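    The signal-strength extraction described in this abstract is, schematically, a binned maximum likelihood fit. The sketch below shows the mechanics for a single signal strength mu with invented Poisson templates and counts; the paper's actual fit extracts two signal strengths simultaneously from the large-radius jet mass distribution.

    # Toy binned maximum likelihood fit for a signal strength mu.
    # Templates and observed counts are invented numbers.
    import numpy as np
    from scipy.optimize import minimize_scalar

    sig = np.array([2.0, 8.0, 15.0, 8.0, 2.0])      # assumed signal template (mass bins)
    bkg = np.array([90.0, 80.0, 70.0, 60.0, 50.0])  # assumed background template
    data = np.array([95, 92, 88, 66, 53])           # assumed observed counts

    def nll(mu):
        # Poisson negative log-likelihood (constant terms dropped).
        expected = mu * sig + bkg
        return np.sum(expected - data * np.log(expected))

    fit = minimize_scalar(nll, bounds=(0.0, 10.0), method="bounded")
    # Rough 68% interval from the points where the NLL rises by 0.5.
    scan = np.linspace(0.0, 10.0, 2001)
    ok = np.array([nll(m) for m in scan]) <= fit.fun + 0.5
    print(f"mu = {fit.x:.2f}  (approx 68% CL: [{scan[ok].min():.2f}, {scan[ok].max():.2f}])")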
  8. Abstract Computing demands for large scientific experiments, such as the CMS experiment at the CERN LHC, will increase dramatically in the next decades. To complement future performance increases of software running on central processing units (CPUs), explorations of coprocessor usage in data processing hold great potential and interest. Coprocessors are a class of computer processors that supplement CPUs, often improving the execution of certain functions due to architectural design choices. We explore the approach of Services for Optimized Network Inference on Coprocessors (SONIC) and study the deployment of this as-a-service approach in large-scale data processing. In these studies, we take a data processing workflow of the CMS experiment and run the main workflow on CPUs, while offloading several machine learning (ML) inference tasks onto either remote or local coprocessors, specifically graphics processing units (GPUs). With experiments performed at Google Cloud, the Purdue Tier-2 computing center, and combinations of the two, we demonstrate the acceleration of these ML algorithms individually on coprocessors and the corresponding throughput improvement for the entire workflow. The approach can be easily generalized to different types of coprocessors and deployed on local CPUs without decreasing the throughput performance. We emphasize that SONIC enables both high coprocessor usage and the portability to run workflows on different types of coprocessors.
    Free, publicly-accessible full text available December 1, 2025
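    The following sketch illustrates the general as-a-service pattern the paper studies: the main workflow stays on CPUs and ships ML inference batches to a coprocessor-backed server, falling back to local CPU inference if the server is unavailable. The endpoint URL, payload format, and fallback model are hypothetical stand-ins, not SONIC's actual interface.

    # Schematic as-a-service inference client. Everything below is a
    # hypothetical stand-in for illustration, not SONIC's real API.
    import numpy as np
    import requests

    SERVER_URL = "http://inference-server:8000/v1/infer"   # hypothetical endpoint

    def local_cpu_inference(batch: np.ndarray) -> np.ndarray:
        # Hypothetical fallback model evaluated locally on the CPU.
        return 1.0 / (1.0 + np.exp(-batch.sum(axis=1)))

    def infer(batch: np.ndarray) -> np.ndarray:
        """Offload a batch to the remote server; fall back to local CPU."""
        try:
            resp = requests.post(SERVER_URL, json={"inputs": batch.tolist()},
                                 timeout=1.0)
            resp.raise_for_status()
            return np.asarray(resp.json()["outputs"])
        except requests.RequestException:
            # Server unreachable or overloaded: run the model locally,
            # keeping the workflow alive at reduced throughput.
            return local_cpu_inference(batch)

    scores = infer(np.random.default_rng(0).normal(size=(64, 16)))
    print(scores.shape)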
  9. ProtoDUNE Single-Phase (ProtoDUNE-SP) is a 770-ton liquid argon time projection chamber that operated in a hadron test beam at the CERN Neutrino Platform in 2018. We present a measurement of the total inelastic cross section of charged kaons on argon as a function of kaon energy, using 6 and 7 GeV/c beam momentum settings. The flux-weighted average of the extracted inelastic cross section was measured to be 380 ± 26 mbarns for the 6 GeV/c setting and 379 ± 35 mbarns for the 7 GeV/c setting. Published by the American Physical Society, 2024.
    Free, publicly-accessible full text available November 1, 2025
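    The flux-weighted average quoted above is a weighted mean of the per-energy-bin cross sections over the beam's kaon flux; a minimal sketch with invented numbers:

    # Minimal sketch of a flux-weighted average cross section over energy
    # bins. The fluxes and cross sections are invented, not the paper's data.
    import numpy as np

    flux = np.array([120.0, 340.0, 280.0, 90.0])    # kaons per bin (assumed)
    sigma = np.array([385.0, 381.0, 377.0, 372.0])  # xsec per bin, mbarns (assumed)

    sigma_avg = np.sum(flux * sigma) / np.sum(flux)
    print(f"flux-weighted average: {sigma_avg:.1f} mbarns")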
  10. Abstract A search is reported for charge-parity ($CP$) violation in $\mathrm{D}^0 \rightarrow \mathrm{K^0_S} \mathrm{K^0_S}$ decays, using data collected in proton–proton collisions at $\sqrt{s} = 13\,\mathrm{TeV}$ recorded by the CMS experiment in 2018. The analysis uses a dedicated data set corresponding to an integrated luminosity of 41.6 fb$^{-1}$, which consists of about 10 billion events containing a pair of b hadrons, nearly all of which decay to charm hadrons. The flavor of the neutral D meson is determined by the pion charge in the reconstructed decays $\mathrm{D}^{*+} \rightarrow \mathrm{D}^0 \pi^+$ and $\mathrm{D}^{*-} \rightarrow \overline{\mathrm{D}}{}^0 \pi^-$. The $CP$ asymmetry in $\mathrm{D}^0 \rightarrow \mathrm{K^0_S} \mathrm{K^0_S}$ is measured to be $A_{CP}(\mathrm{K^0_S}\mathrm{K^0_S}) = (6.2 \pm 3.0 \pm 0.2 \pm 0.8)\%$, where the three uncertainties represent the statistical uncertainty, the systematic uncertainty, and the uncertainty in the measurement of the $CP$ asymmetry in the $\mathrm{D}^0 \rightarrow \mathrm{K^0_S} \pi^+ \pi^-$ decay. This is the first $CP$ asymmetry measurement by CMS in the charm sector, as well as the first to utilize a fully hadronic final state.
    Free, publicly-accessible full text available December 1, 2025
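    Schematically, an asymmetry of this kind starts from flavor-tagged signal yields. The sketch below shows the raw-asymmetry arithmetic with invented yields and a binomial statistical uncertainty; as the abstract notes, the real measurement additionally corrects for nuisance asymmetries using the $\mathrm{D}^0 \rightarrow \mathrm{K^0_S} \pi^+ \pi^-$ control channel.

    # Toy raw-asymmetry calculation from flavor-tagged yields. The yields
    # are invented; the actual analysis applies further corrections using
    # the D0 -> KS pi+ pi- control channel, as the abstract describes.
    import math

    n_d0 = 5320      # assumed D*+ -> D0 pi+ tagged signal yield
    n_d0bar = 4710   # assumed D*- -> D0bar pi- tagged signal yield

    a_raw = (n_d0 - n_d0bar) / (n_d0 + n_d0bar)
    stat = math.sqrt((1.0 - a_raw**2) / (n_d0 + n_d0bar))

    print(f"A_raw = {100*a_raw:.1f}% +/- {100*stat:.1f}% (stat)")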