-
The development of large language models and their measurable improvements on natural language tasks open the opportunity to use them in educational settings to replicate human tutoring, which is often costly and inaccessible. We are particularly interested in large language models from the GPT series, created by OpenAI. In the original study, we found that the quality of explanations generated with GPT-3.5 was poor: two different approaches to generating explanations yielded success rates of only 43% and 10%. In a replication study, we asked whether the measurable improvements in GPT-4 performance would lead to a higher rate of valid explanations compared to GPT-3.5. We replicated the original study by using GPT-4 to generate explanations for the same problems given to GPT-3.5. With GPT-4, explanation correctness improved dramatically, to a success rate of 94%. We were further interested in whether GPT-4 explanations were perceived as positively as human-written explanations. In a preregistered follow-up study, 10 evaluators rated the quality of randomized GPT-4 and teacher-created explanations. Even with 4% of problems containing some incorrect content, GPT-4 explanations were preferred over human explanations.
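The abstract does not specify the generation pipeline beyond the model used, but the core step amounts to a single chat-completion call per problem. The following minimal sketch shows one way to do this with the OpenAI Python client; the prompt wording, temperature, and helper name are illustrative assumptions, not the authors' actual setup.

```python
# Hedged sketch: generating a tutoring-style explanation with GPT-4 via the
# OpenAI chat completions API. Prompt text and parameters are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_explanation(problem_text: str, correct_answer: str) -> str:
    """Ask GPT-4 to explain why `correct_answer` solves `problem_text`."""
    response = client.chat.completions.create(
        model="gpt-4",
        temperature=0.0,  # favor reproducible output for later grading
        messages=[
            {"role": "system",
             "content": "You are a tutor. Explain solutions step by step in "
                        "language a student can follow."},
            {"role": "user",
             "content": f"Problem: {problem_text}\n"
                        f"Correct answer: {correct_answer}\n"
                        "Explain how to arrive at this answer."},
        ],
    )
    return response.choices[0].message.content
```

Each generated explanation would then be graded for correctness by human raters, which is how a success rate like the 94% figure above is measured.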
-
The monitoring of data streams with a network structure has drawn increasing attention due to its wide applications in modern process control. In these applications, high-dimensional sensor nodes are interconnected with an underlying network topology. In such a case, abnormalities occurring at any node may propagate dynamically across the network and cause changes in other nodes over time. Furthermore, the high dimensionality of such data significantly increases the cost of resources for data transmission and computation, so that only partial observations can be transmitted or processed in practice. Overall, how to quickly detect abnormalities in such large networks under resource constraints remains a challenge, especially given the sampling uncertainty under dynamic anomaly occurrences and network-based patterns. In this paper, we incorporate network structure information into monitoring and adaptive sampling methodologies for quick anomaly detection in large networks where only partial observations are available. We develop a general monitoring and adaptive sampling method and further extend it to the case with memory constraints; both exploit network distance and centrality information for better process monitoring and identification of abnormalities. Theoretical investigations of the proposed methods demonstrate their sampling efficiency in balancing exploration against exploitation, as well as a detection performance guarantee. Numerical simulations and a case study on a power network demonstrate the superiority of the proposed methods in detecting various types of shifts.

Note to Practitioners: Continuous monitoring of networks for anomalous events is critical for a large number of applications involving power networks, computer networks, epidemiological surveillance, social networks, etc. This paper addresses the challenges of monitoring large networks when monitoring resources are limited, so that only a subset of nodes in the network is observable. Specifically, we integrate network structure information of nodes to construct sequential detection methods via effective data augmentation, and to design adaptive sampling algorithms that observe suspicious nodes that are likely to be abnormal. The method is then further generalized to the case where the memory for computation is also constrained by the network size. The developed methods are highly beneficial and effective for various anomaly patterns, especially when the initial anomaly occurs at random nodes in the network. As shown in the theoretical investigation, simulations, and case studies, the proposed methods quickly detect changes in the network and dynamically adjust the sampling priority based on online observations.
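To make the adaptive-sampling idea concrete, the sketch below scores every node by its own monitoring statistic plus evidence borrowed from observed neighbors and a small centrality bonus, then observes the top-k nodes in the next round. The scoring rule, discount factor, and function name are illustrative assumptions standing in for the paper's method.

```python
# Hedged sketch of centrality-aware adaptive sampling under a budget of k
# observable nodes. The specific weights below are illustrative, not the
# paper's algorithm.
import networkx as nx

def choose_nodes_to_sample(G: nx.Graph, local_stats: dict, k: int) -> list:
    """Pick k nodes to observe next.

    local_stats maps node -> latest local monitoring statistic (e.g., a
    CUSUM value), with 0.0 for nodes that were not observed last round.
    """
    centrality = nx.degree_centrality(G)  # proxy for propagation influence
    scores = {}
    for v in G.nodes:
        own = local_stats.get(v, 0.0)
        # Unobserved nodes inherit discounted evidence from neighbors, so
        # the sampler can chase an anomaly as it propagates along edges.
        borrowed = max((local_stats.get(u, 0.0) for u in G.neighbors(v)),
                       default=0.0) / 2.0
        # Small centrality bonus keeps some exploration of influential nodes.
        scores[v] = max(own, borrowed) + 0.1 * centrality[v]
    return sorted(G.nodes, key=scores.get, reverse=True)[:k]
```

In a memory-constrained variant, `local_stats` would hold only a fixed-size subset of statistics, with the rest reset or summarized, mirroring the extension described in the abstract.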
-
A search is performed for dark matter particles produced in association with a resonantly produced pair of b-quarks with 30 < m_bb < 150 GeV, using 140 fb⁻¹ of proton-proton collisions at a center-of-mass energy of 13 TeV recorded by the ATLAS detector at the LHC. This signature is expected in extensions of the standard model predicting the production of dark matter particles, in particular those containing a dark Higgs boson s that decays into bb̄. The highly boosted s → bb̄ topology is reconstructed using jet reclustering and a new identification algorithm. This search places stringent constraints across regions of the dark Higgs model parameter space that satisfy the observed relic density, excluding dark Higgs bosons with masses between 30 and 150 GeV in benchmark scenarios with Z′ mediator masses up to 4.8 TeV at 95% confidence level.
-
The origin of high-energy galactic cosmic rays is yet to be understood, but some galactic cosmic-ray accelerators can accelerate cosmic rays up to PeV energies. These high-energy cosmic rays are expected to interact with the surrounding material or radiation, producing gamma-rays and neutrinos. Optimizing the detection of such associated gamma-ray and neutrino production for a given source morphology and spectrum requires a multimessenger analysis that combines both messengers. In this study, we use the Multi-Mission Maximum Likelihood framework with the IceCube Maximum Likelihood Analysis software and the HAWC Accelerated Likelihood to search for a correlation between 22 known gamma-ray sources from the third HAWC gamma-ray catalog and 14 yr of IceCube track-like data. No significant neutrino emission from the direction of the HAWC sources was found. We report the best-fit gamma-ray model and the 90% CL neutrino flux limit for the 22 sources. From the neutrino flux limits, we conclude that, for five of the sources, the gamma-ray emission observed by HAWC cannot be produced purely by hadronic interactions. We report the limit on the fraction of gamma-rays produced by hadronic interactions for these five sources.
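The joint fit described above rests on a simple structural idea: the gamma-ray and neutrino datasets share one physical source model, so the combined log-likelihood is the sum of the per-instrument terms. The toy sketch below illustrates this with Gaussian stand-ins; the function names and numbers are assumptions for illustration, not the 3ML, HAL, or IceCube software interfaces.

```python
# Toy sketch of a joint multimessenger likelihood: both instruments'
# terms depend on one shared source normalization, so a neutrino
# non-detection can constrain the hadronic part of the gamma-ray flux.
import numpy as np
from scipy.optimize import minimize_scalar

def joint_nll(norm: float, gamma_data: np.ndarray, nu_data: np.ndarray) -> float:
    """Negative log-likelihood with a shared flux normalization `norm`."""
    gamma_model = norm * 1.0   # expected gamma-ray rate per unit norm (toy)
    nu_model = norm * 0.1      # expected neutrino rate per unit norm (toy)
    nll_gamma = 0.5 * np.sum((gamma_data - gamma_model) ** 2)
    nll_nu = 0.5 * np.sum((nu_data - nu_model) ** 2)
    return nll_gamma + nll_nu  # independent datasets: terms simply add

# Fit the shared normalization to both channels at once.
gamma_data = np.array([1.1, 0.9, 1.0])
nu_data = np.array([0.12, 0.08])
fit = minimize_scalar(joint_nll, bounds=(0.0, 10.0), method="bounded",
                      args=(gamma_data, nu_data))
print(f"best-fit shared normalization: {fit.x:.3f}")
```

In the real analysis, each term is the instrument's full likelihood over its event data, but the shared parameterization is what lets neutrino upper limits bound the fraction of the observed gamma-ray emission that hadronic interactions could produce.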