-
Abstract Scattering of high-energy particles from nucleons probes their structure, as was done in the experiments that established the non-zero size of the proton using electron beams 1 . The use of charged leptons as scattering probes enables measuring the distribution of electric charges, which is encoded in the vector form factors of the nucleon 2 . Scattering weakly interacting neutrinos gives the opportunity to measure both vector and axial vector form factors of the nucleon, providing an additional, complementary probe of their structure. The nucleon transition axial form factor, $F_A$, can be measured from neutrino scattering from free nucleons, $\nu_\mu n \to \mu^- p$ and $\bar{\nu}_\mu p \to \mu^+ n$, as a function of the negative four-momentum transfer squared ($Q^2$). Up to now, $F_A(Q^2)$ has been extracted from the bound nucleons in neutrino–deuterium scattering 3–9 , which requires uncertain nuclear corrections 10 . Here we report the first high-statistics measurement, to our knowledge, of the $\bar{\nu}_\mu p \to \mu^+ n$ cross-section from the hydrogen atom, using the plastic scintillator…
Free, publicly-accessible full text available February 2, 2024
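For context, the nucleon axial form factor discussed above is commonly parametrized in the dipole form shown below; this is a conventional illustration, not necessarily the parametrization adopted in this work:

```latex
F_A(Q^2) \;=\; \frac{F_A(0)}{\left(1 + Q^2/M_A^2\right)^2},
\qquad F_A(0) = g_A \approx -1.27
```

Here $M_A$ (the axial mass) is the single shape parameter fit to the measured $Q^2$ dependence, while the normalization $F_A(0) = g_A$ is fixed by neutron beta-decay measurements.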
-
Abstract We compare different neural network architectures for machine-learning algorithms designed to identify the neutrino interaction vertex position in the MINERvA detector. Architectures developed and optimized by hand are compared with architectures developed in an automated way using the package "Multi-node Evolutionary Neural Networks for Deep Learning" (MENNDL), developed at Oak Ridge National Laboratory. While the domain-expert hand-tuned network was the best performer, the differences were negligible and the auto-generated networks performed nearly as well. There is always a trade-off between human and computer resources for network optimization, and this work suggests that automated optimization, assuming computing resources are available, provides a compelling way to save significant expert time.
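To illustrate the kind of evolutionary architecture search the abstract describes, the toy sketch below evolves a population of architecture encodings (here simply lists of layer widths) under a stand-in fitness function. This is not MENNDL itself: the real package evolves deep CNN hyperparameters and trains each candidate network on HPC resources, whereas the fitness function here is a hypothetical placeholder for validation accuracy.

```python
import random

random.seed(0)

WIDTHS = [8, 16, 32, 64]

def random_architecture(max_layers=4):
    """Sample a random architecture encoding: a list of layer widths."""
    n_layers = random.randint(1, max_layers)
    return [random.choice(WIDTHS) for _ in range(n_layers)]

def fitness(arch, target=(32, 32, 16)):
    """Toy stand-in for validation accuracy: reward closeness to a
    hypothetical 'ideal' architecture. A real search would train the
    candidate network and score it on held-out data."""
    score = -abs(len(arch) - len(target))
    for width, t in zip(arch, target):
        score -= abs(width - t) / 64
    return score

def mutate(arch):
    """Randomly perturb one layer's width, or add/drop a layer."""
    child = list(arch)
    op = random.random()
    if op < 0.6:
        i = random.randrange(len(child))
        child[i] = random.choice(WIDTHS)
    elif op < 0.8 and len(child) < 6:
        child.append(random.choice(WIDTHS))
    elif len(child) > 1:
        child.pop()
    return child

def evolve(pop_size=20, generations=30):
    """Simple (mu + lambda)-style loop: keep the best half, refill
    the population with mutated copies of the survivors."""
    population = [random_architecture() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [
            mutate(random.choice(survivors))
            for _ in range(pop_size - len(survivors))
        ]
    return max(population, key=fitness)

best = evolve()
print(best)
```

The appeal noted in the abstract follows directly from this structure: once the encoding, mutation operator, and fitness evaluation are defined, the loop runs without further expert input, trading compute time for hand-tuning time.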