
Search for: All records

Creators/Authors contains: "Anton, G"


  1. In the decades since Papert published Mindstorms (1980), computation has transformed nearly every branch of scientific practice. Accordingly, there is increasing recognition that computation and computational thinking (CT) must be a core part of STEM education in a broad range of subjects. Previous work has demonstrated the efficacy of incorporating computation into STEM courses and introduced a taxonomy of CT practices in STEM. However, this work rarely involved teachers as more than implementers of units designed by researchers. In The Children's Machine, Papert asked "What can be done to mobilize the potential force for change inherent in the position of teachers?" (Papert, 1994, p. 79). We argue that involving teachers as co-design partners supports them in becoming cultural change agents in education. We report here on the first phase of a research project in which we worked with STEM educators to co-design curricular science units that incorporate computational thinking and practices. Eight high school teachers and one university professor joined nine members of our research team for a month-long Computational Thinking Summer Institute (CTSI). The co-design process was a constructionist design and learning experience for both the teachers and researchers. We focus here on understanding the co-design process and its implications for teachers by asking: (1) How did teachers shift in their attitudes and confidence regarding CT? (2) What different co-design styles emerged, and did any tensions arise? Generally, we found that teachers gained confidence and skills in CT and computational tools over the course of the summer. Only one teacher reported a decrease in confidence in one aspect of CT (computational modeling), but this seemed to result from gaining a broader and more nuanced understanding of this rich area. A range of co-design styles emerged over the summer. Some teachers chose to focus on designing the curriculum and advising on the computational tools to be used in it, while leaving the construction of those tools to their co-designers. Other teachers actively participated in constructing models and computational tools themselves. The pluralism of co-design styles allowed teachers of various comfort levels with computation to meaningfully contribute to a computationally enhanced constructionist curriculum. However, it also led to a tension for some teachers between working to finish their curriculum and gaining experience with computational tools. In the time crunch to complete their unit during CTSI, some teachers chose to save time by working on the curriculum while their co-design partners (researchers) created the supporting computational tools. These teachers still grew in their computational sophistication, but they could not devote as much time as they wanted to their own computational learning.
  2. Kong, S.C. (Ed.)
    This work aims to help high school STEM teachers integrate computational thinking (CT) into their classrooms by engaging teachers as curriculum co-designers. K-12 teachers who are not trained in computer science may not see the value of CT in STEM classrooms or know how to engage their students in computational practices that reflect the practices of STEM professionals. To this end, we developed a 4-week professional development workshop for eight science and mathematics high school teachers to co-design computationally enhanced curricula with our team of researchers. The workshop first provided an introduction to computational practices and tools for STEM education. Then, teachers engaged in co-design to enhance their science and mathematics curricula with computational practices in STEM. Data from surveys and interviews showed that teachers learned about computational thinking, computational tools, coding, and the value of collaboration after the professional development. Further, they were able to integrate multiple computational tools that engage their students in CT-STEM practices. These findings suggest that teachers can learn to use computational practices and tools through workshops, and that teachers collaborating with researchers in co-design to develop computationally enhanced STEM curricula may be a powerful way to engage students and teachers with CT in K-12 classrooms.
  3. Free, publicly-accessible full text available December 1, 2022
  4. Abstract The nEXO neutrinoless double beta (0νββ) decay experiment is designed to use a time projection chamber and 5000 kg of isotopically enriched liquid xenon to search for the decay in ¹³⁶Xe. Progress in the detector design, paired with higher fidelity in its simulation and an advanced data analysis based on the one used for the final results of EXO-200, produces a sensitivity prediction that exceeds a half-life of 10²⁸ years. Specifically, improvements have been made in the understanding of the production of scintillation photons and charge, as well as of their transport and reconstruction in the detector. The more detailed knowledge of the detector construction has been paired with more assays for trace radioactivity in different materials. In particular, the use of custom electroformed copper is now incorporated in the design, leading to a substantial reduction in backgrounds from the intrinsic radioactivity of detector materials. Furthermore, a number of assumptions from previous sensitivity projections have gained further support from interim work validating the nEXO experiment concept. Together these improvements and updates suggest that the nEXO experiment will reach a half-life sensitivity of 1.35 × 10²⁸ yr at the 90% confidence level in 10 years of data taking, covering the parameter space associated with the inverted neutrino mass ordering, along with a significant portion of the parameter space for the normal ordering scenario, for almost all nuclear matrix elements. The effects of backgrounds deviating from the nominal values used for the projections are also illustrated, concluding that the nEXO design is robust against a number of imperfections of the model.
    Free, publicly-accessible full text available December 3, 2022
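    For context, the half-life sensitivity above maps onto an effective Majorana neutrino mass via the standard light-neutrino-exchange relation (a textbook formula of 0νββ phenomenology, not taken from this abstract):

    ```latex
    % G^{0\nu}: phase-space factor; M^{0\nu}: nuclear matrix element;
    % m_e: electron mass; \langle m_{\beta\beta} \rangle: effective Majorana mass.
    \left( T_{1/2}^{0\nu} \right)^{-1}
      = G^{0\nu} \, \bigl| M^{0\nu} \bigr|^{2} \,
        \frac{\langle m_{\beta\beta} \rangle^{2}}{m_e^{2}}
    ```

    A longer half-life sensitivity thus probes a smaller ⟨m_ββ⟩, which is why the quoted 1.35 × 10²⁸ yr covers the inverted-ordering parameter space for almost all nuclear matrix elements.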
  5. Abstract The Surface Enhancement of the IceTop air-shower array will include the addition of radio antennas and scintillator panels, co-located with the existing ice-Cherenkov tanks and covering an area of about 1 km². Together, these will increase the sensitivity of the IceCube Neutrino Observatory to the electromagnetic and muonic components of cosmic-ray-induced air showers at the South Pole. The inclusion of the radio technique necessitates an expanded set of simulation and analysis tools to explore the radio-frequency emission from air showers in the 70 MHz to 350 MHz band. In this paper we describe the software modules that have been developed to work with time- and frequency-domain information within IceCube's existing software framework, IceTray, which is used by the entire IceCube collaboration. The software includes a method by which air-shower simulations, generated using CoREAS, can be reused via waveform interpolation, thus overcoming a significant computational hurdle in the field.
    Free, publicly-accessible full text available June 1, 2023
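    The 70–350 MHz band selection mentioned above can be illustrated with a minimal FFT-mask band-pass sketch. This is a hypothetical helper for illustration only, not the actual IceTray radio modules:

    ```python
    import numpy as np

    def bandpass(waveform: np.ndarray, dt: float,
                 f_lo: float = 70e6, f_hi: float = 350e6) -> np.ndarray:
        """Zero all Fourier components outside [f_lo, f_hi] Hz (illustrative sketch)."""
        spectrum = np.fft.rfft(waveform)
        freqs = np.fft.rfftfreq(len(waveform), d=dt)
        spectrum[(freqs < f_lo) | (freqs > f_hi)] = 0.0
        return np.fft.irfft(spectrum, n=len(waveform))

    # Example: a 200 MHz tone is in-band and survives; a 1 GHz tone is removed.
    dt = 1e-10                      # 0.1 ns sampling (Nyquist = 5 GHz)
    t = np.arange(1024) * dt
    sig = np.sin(2 * np.pi * 200e6 * t) + np.sin(2 * np.pi * 1e9 * t)
    filtered = bandpass(sig, dt)
    ```

    Real radio-pulse analyses typically apply windowing and tapered band edges rather than a hard spectral mask, but the principle of working in the frequency domain is the same.
    
    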
  6. Free, publicly-accessible full text available June 1, 2023
  7. Abstract We present a measurement of the high-energy astrophysical muon-neutrino flux with the IceCube Neutrino Observatory. The measurement uses a high-purity selection of 650,000 neutrino-induced muon tracks from the northern celestial hemisphere, corresponding to 9.5 yr of experimental data. With respect to previous publications, the measurement is improved by the increased size of the event sample and the extended model testing beyond simple power-law hypotheses. An updated treatment of systematic uncertainties and atmospheric background fluxes has been implemented based on recent models. The best-fit single power-law parameterization of the astrophysical energy spectrum yields a normalization of φ(100 TeV)_{νμ+ν̄μ} = 1.44 (+0.25 / −0.26) × 10⁻¹⁸ GeV⁻¹ cm⁻² s⁻¹ sr⁻¹ and a spectral index γ_SPL = 2.37 ± 0.09, constrained in the energy range from 15 TeV to 5 PeV. The model tests include a single power law with a spectral cutoff at high energies, a log-parabola model, several source-class-specific flux predictions from the literature, and a model-independent spectral unfolding. The data are consistent with a single power-law hypothesis; however, spectra that soften above 1 PeV are statistically favored at the two-sigma level.
    Free, publicly-accessible full text available March 1, 2023
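    The best-fit single power law quoted above is straightforward to evaluate. A minimal sketch using the central values from the abstract (this is not the collaboration's analysis code, just the parameterization):

    ```python
    # Single power-law flux: phi(E) = phi0 * (E / 100 TeV)^(-gamma)
    # Central best-fit values quoted in the abstract:
    PHI_0 = 1.44e-18   # GeV^-1 cm^-2 s^-1 sr^-1 (per-flavor nu_mu + nubar_mu at 100 TeV)
    GAMMA = 2.37       # best-fit spectral index
    E_PIVOT = 100e3    # pivot energy in GeV (100 TeV)

    def astro_numu_flux(energy_gev: float) -> float:
        """Astrophysical nu_mu + nubar_mu flux at a given energy in GeV."""
        return PHI_0 * (energy_gev / E_PIVOT) ** (-GAMMA)

    # At the pivot energy the flux equals the normalization.
    print(astro_numu_flux(100e3))  # -> 1.44e-18
    ```

    Note the fit is only constrained between 15 TeV and 5 PeV; extrapolating the power law outside that range goes beyond what the data support.
    
    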
  8. Free, publicly-accessible full text available March 1, 2023