
Search for: All records

Creators/Authors contains: "Lynn, D."


  1. Abstract Despite the fact that most science learning takes place outside of school, little is known about how engagement in informal science learning (ISL) experiences affects learners’ knowledge, skill development, interest, or identities over long periods of time. Although substantial ISL research has documented short-term outcomes such as the learning that takes place during a science center visit, research suggests that the genuine benefits of informal experiences are long-term transformations in learners as they pursue a “cascade” of experiences subsequent to the initial educational event. However, a number of major methodological challenges have limited longitudinal research projects investigating the long-term effects of ISL experiences. In this paper we identify and address four key issues surrounding the critical but challenging area of how to study and measure the long-term effects or impacts of ISL experiences: attribution, attrition, data collection, and analytic approaches. Our objective is to provide guidance to ISL researchers wishing to engage in long-term investigations of learner outcomes and to begin a dialogue about how best to address the numerous challenges involved in this work.
    Free, publicly-accessible full text available December 1, 2022
  2. Abstract The accurate simulation of additional interactions at the ATLAS experiment for the analysis of proton–proton collisions delivered by the Large Hadron Collider presents a significant challenge to the computing resources. During the LHC Run 2 (2015–2018), there were up to 70 inelastic interactions per bunch crossing, which need to be accounted for in Monte Carlo (MC) production. In this document, a new method to account for these additional interactions in the simulation chain is described. Instead of sampling the inelastic interactions and adding their energy deposits to a hard-scatter interaction one-by-one, the inelastic interactions are presampled, independent of the hard scatter, and stored as combined events. Consequently, for each hard-scatter interaction, only one such presampled event needs to be added as part of the simulation chain. For the Run 2 simulation chain, with an average of 35 interactions per bunch crossing, this new method provides a substantial reduction in MC production CPU needs of around 20%, while reproducing the properties of the reconstructed quantities relevant for physics analyses with good accuracy.
    Free, publicly-accessible full text available December 1, 2023
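To make the presampling idea above concrete, the following is a toy sketch in Python. It is not ATLAS software; the cell-energy representation and helper names (`simulate_minbias`, `presample_pileup`, `overlay`) are invented for illustration, and only the split between a one-time presampling step and a cheap per-hard-scatter overlay reflects the method described.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
N_CELLS = 10_000          # toy stand-in for calorimeter cells
MU = 35                   # average inelastic interactions per bunch crossing (Run 2)

def simulate_minbias():
    """Toy energy deposits from one inelastic (minimum-bias) interaction."""
    deposits = np.zeros(N_CELLS)
    hit_cells = rng.integers(0, N_CELLS, size=50)
    np.add.at(deposits, hit_cells, rng.exponential(0.1, size=50))
    return deposits

def presample_pileup(n_events):
    """Build combined pileup events once, independent of any hard scatter."""
    events = []
    for _ in range(n_events):
        n_interactions = rng.poisson(MU)
        events.append(sum((simulate_minbias() for _ in range(n_interactions)),
                          np.zeros(N_CELLS)))
    return events

def overlay(hard_scatter, pileup_library):
    """Add a single presampled pileup event to a hard-scatter event."""
    return hard_scatter + pileup_library[rng.integers(len(pileup_library))]

# Old approach: re-simulate ~MU minimum-bias interactions per hard scatter.
# New approach: draw one presampled combined event from the library instead.
library = presample_pileup(n_events=100)
hs = simulate_minbias()              # stand-in for a hard-scatter event
full_event = overlay(hs, library)
```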
  3. Abstract The ATLAS experiment at the Large Hadron Collider has a broad physics programme ranging from precision measurements to direct searches for new particles and new interactions, requiring ever larger and ever more accurate datasets of simulated Monte Carlo events. Detector simulation with Geant4 is accurate but requires significant CPU resources. Over the past decade, ATLAS has developed and utilized tools that replace the most CPU-intensive component of the simulation—the calorimeter shower simulation—with faster simulation methods. Here, AtlFast3, the next generation of high-accuracy fast simulation in ATLAS, is introduced. AtlFast3 combines parameterized approaches with machine-learning techniques and is deployed to meet current and future computing challenges and simulation needs of the ATLAS experiment. With highly accurate performance and significantly improved modelling of substructure within jets, AtlFast3 can simulate large numbers of events for a wide range of physics processes.
    Free, publicly-accessible full text available December 1, 2023
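The abstract describes an architecture in which the CPU-intensive calorimeter shower simulation is replaced by faster parameterized or machine-learning-based methods, with detailed Geant4 simulation retained where needed. The sketch below illustrates that general dispatch pattern only; the particle categories and the energy threshold are placeholders, not the actual AtlFast3 configuration.

```python
from dataclasses import dataclass

@dataclass
class Particle:
    pdg_id: int       # e.g. 11 = electron, 22 = photon, 211 = charged pion
    energy_gev: float

def simulate_shower(p: Particle) -> str:
    """Toy dispatcher: choose a simulation flavour per particle.

    The categories and threshold here are invented for illustration; the real
    AtlFast3 configuration combines parameterized showers, ML-based shower
    generation, and Geant4 according to detailed performance studies.
    """
    if p.pdg_id in (11, 22):                  # electromagnetic showers
        return "parameterized fast simulation"
    if abs(p.pdg_id) == 211 and p.energy_gev > 16:   # placeholder threshold
        return "ML-based shower generation"
    return "full Geant4"                      # fall back to detailed simulation

print(simulate_shower(Particle(pdg_id=22, energy_gev=50.0)))
```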
  4. Abstract Recent developments in very long baseline interferometry (VLBI) have made it possible for the Event Horizon Telescope (EHT) to resolve the innermost accretion flows of the largest supermassive black holes on the sky. The sparse nature of the EHT’s (u, v)-coverage presents a challenge when attempting to resolve highly time-variable sources. We demonstrate that the changing (u, v)-coverage of the EHT can contain regions of time over the course of a single observation that facilitate dynamical imaging. These optimal time regions typically have projected baseline distributions that are approximately angularly isotropic and radially homogeneous. We derive a metric of coverage quality based on baseline isotropy and density that is capable of ranking array configurations by their ability to produce accurate dynamical reconstructions. We compare this metric to existing metrics in the literature and investigate their utility by performing dynamical reconstructions on synthetic data from simulated EHT observations of sources with simple orbital variability. We then use these results to make recommendations for imaging the 2017 EHT Sgr A* data set.
    Free, publicly-accessible full text available May 1, 2023
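As a rough illustration of ranking coverage by baseline isotropy and radial density, the sketch below scores a set of (u, v) points by how uniformly their position angles and lengths are distributed. The specific combination used here is an assumption for illustration, not the metric derived in the paper.

```python
import numpy as np

def coverage_quality(u, v, n_angle_bins=12, n_radial_bins=8):
    """Toy coverage score from angular isotropy and radial uniformity.

    u, v are projected baseline coordinates (in wavelengths) for one
    snapshot or time window. This combination is an illustrative guess,
    not the paper's metric.
    """
    angles = np.arctan2(v, u) % np.pi            # baselines are conjugate-symmetric
    radii = np.hypot(u, v)

    # Angular isotropy: how uniformly the position angles fill [0, pi).
    ang_hist, _ = np.histogram(angles, bins=n_angle_bins, range=(0, np.pi))
    ang_frac = ang_hist / ang_hist.sum()
    isotropy = 1.0 - np.std(ang_frac) / (1.0 / n_angle_bins)

    # Radial homogeneity: how evenly baseline lengths cover the sampled range.
    rad_hist, _ = np.histogram(radii, bins=n_radial_bins, range=(0, radii.max()))
    rad_frac = rad_hist / rad_hist.sum()
    homogeneity = 1.0 - np.std(rad_frac) / (1.0 / n_radial_bins)

    return max(isotropy, 0.0) * max(homogeneity, 0.0)

# Example: random (u, v) points standing in for one observing window.
rng = np.random.default_rng(1)
u, v = rng.normal(size=(2, 40)) * 4e9
print(f"coverage quality: {coverage_quality(u, v):.2f}")
```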
  5. Abstract The extraordinary physical resolution afforded by the Event Horizon Telescope has opened a window onto the astrophysical phenomena unfolding on horizon scales in two known black holes, M87* and Sgr A*. However, with this leap in resolution has come a new set of practical complications. Sgr A* exhibits intraday variability that violates the assumptions underlying Earth aperture synthesis, limiting traditional image reconstruction methods to short timescales and data sets with very sparse (u, v) coverage. We present a new set of tools to detect and mitigate this variability. We develop a data-driven, model-agnostic procedure to detect and characterize the spatial structure of intraday variability. This method is calibrated against a large set of mock data sets, producing an empirical estimator of the spatial power spectrum of the brightness fluctuations. We present a novel Bayesian noise modeling algorithm that simultaneously reconstructs an average image and statistical measure of the fluctuations about it using a parameterized form for the excess variance in the complex visibilities not otherwise explained by the statistical errors. These methods are validated using a variety of simulated data, including general relativistic magnetohydrodynamic simulations appropriate for Sgr A* and M87*. We find that the reconstructed source structure and variability are robust to changes in the underlying image model. We apply these methods to the 2017 EHT observations of M87*, finding evidence for variability across the EHT observing campaign. The variability mitigation strategies presented are widely applicable to very long baseline interferometry observations of variable sources generally, for which they provide a data-informed averaging procedure and natural characterization of inter-epoch image consistency.
    Free, publicly-accessible full text available May 1, 2023
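The sketch below illustrates the general idea of an inflated visibility error budget: a parameterized excess-variance term is added in quadrature to the thermal errors inside a Gaussian likelihood. The power-law form and the parameter names (`f0`, `b0`, `alpha`) are assumptions for illustration, not the paper's fitted noise model.

```python
import numpy as np

def total_sigma(thermal_sigma, baseline_length, f0=0.02, b0=4e9, alpha=1.5):
    """Toy noise budget: thermal error plus a parameterized 'excess' term.

    The excess term stands in for structural variability not captured by an
    average image; the power-law form and the values of f0, b0, alpha are
    illustrative assumptions only.
    """
    excess_sigma = f0 * (baseline_length / b0) ** (-alpha)
    return np.sqrt(thermal_sigma**2 + excess_sigma**2)

def log_likelihood(vis_obs, vis_model, thermal_sigma, baseline_length, **noise_params):
    """Gaussian likelihood on complex visibilities with the inflated error budget."""
    sigma = total_sigma(thermal_sigma, baseline_length, **noise_params)
    resid = np.abs(vis_obs - vis_model)
    return np.sum(-0.5 * (resid / sigma) ** 2 - np.log(2 * np.pi * sigma**2))

# In a Bayesian fit, the noise parameters (f0, b0, alpha) would be sampled
# jointly with the average-image parameters rather than held fixed.
```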
  6. Abstract In this paper we provide a first physical interpretation for the Event Horizon Telescope's (EHT) 2017 observations of Sgr A*. Our main approach is to compare resolved EHT data at 230 GHz and unresolved non-EHT observations from radio to X-ray wavelengths to predictions from a library of models based on time-dependent general relativistic magnetohydrodynamics simulations, including aligned, tilted, and stellar-wind-fed simulations; radiative transfer is performed assuming both thermal and nonthermal electron distribution functions. We test the models against 11 constraints drawn from EHT 230 GHz data and observations at 86 GHz, 2.2 μm, and in the X-ray. All models fail at least one constraint. Light-curve variability provides a particularly severe constraint, failing nearly all strongly magnetized (magnetically arrested disk (MAD)) models and a large fraction of weakly magnetized models. A number of models fail only the variability constraints. We identify a promising cluster of these models, which are MAD and have inclination i ≤ 30°. They have accretion rate (5.2–9.5) × 10⁻⁹ M⊙ yr⁻¹, bolometric luminosity (6.8–9.2) × 10³⁵ erg s⁻¹, and outflow power (1.3–4.8) × 10³⁸ erg s⁻¹. We also find that all models with i ≥ 70° fail at least two constraints, as do all models with equal ion and electron temperature; exploratory, nonthermal model sets tend to have higher 2.2 μm flux density; and the population of cold electrons is limited by X-ray constraints due to the risk of bremsstrahlung overproduction. Finally, we discuss physical and numerical limitations of the models, highlighting the possible importance of kinetic effects and duration of the simulations.
    Free, publicly-accessible full text available May 1, 2023
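A minimal sketch of the model-versus-constraint bookkeeping implied above is given below, assuming a small model library and a handful of boolean tests. The field names and thresholds are placeholders and do not correspond to the 11 constraints actually used in the paper.

```python
# Minimal sketch of filtering a model library by observational constraints.
# Field names and thresholds are placeholders for illustration only.
models = [
    {"name": "MAD, i=10",  "variability": 0.9, "flux_86ghz": 2.1, "xray_ok": True},
    {"name": "SANE, i=50", "variability": 1.4, "flux_86ghz": 3.5, "xray_ok": False},
]

constraints = {
    "230 GHz light-curve variability": lambda m: m["variability"] < 1.0,
    "86 GHz flux density":             lambda m: 2.0 < m["flux_86ghz"] < 3.0,
    "X-ray luminosity":                lambda m: m["xray_ok"],
}

for m in models:
    failed = [name for name, test in constraints.items() if not test(m)]
    print(m["name"], "passes all" if not failed else f"fails: {failed}")
```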
  7. Abstract We present a framework for characterizing the spatiotemporal power spectrum of the variability expected from the horizon-scale emission structure around supermassive black holes, and we apply this framework to a library of general relativistic magnetohydrodynamic (GRMHD) simulations and associated general relativistic ray-traced images relevant for Event Horizon Telescope (EHT) observations of Sgr A*. We find that the variability power spectrum is generically a red-noise process in both the temporal and spatial dimensions, with the peak in power occurring on the longest timescales and largest spatial scales. When both the time-averaged source structure and the spatially integrated light-curve variability are removed, the residual power spectrum exhibits a universal broken power-law behavior. On small spatial frequencies, the residual power spectrum rises as the square of the spatial frequency and is proportional to the variance in the centroid of emission. Beyond some peak in variability power, the residual power spectrum falls as that of the time-averaged source structure, which is similar across simulations; this behavior can be naturally explained if the variability arises from a multiplicative random field that has a steeper high-frequency power-law index than that of the time-averaged source structure. We briefly explore the ability of power spectral variability studies to constrain physical parameters relevant for the GRMHD simulations, which can be scaled to provide predictions for black holes in a range of systems in the optically thin regime. We present specific expectations for the behavior of the M87* and Sgr A* accretion flows as observed by the EHT.
    Free, publicly-accessible full text available May 1, 2023
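The sketch below outlines the sequence of steps implied by the abstract: normalize out the light curve, subtract the time-averaged image, and Fourier transform the residual movie in time and space. Windowing, units, and the averaging over spatial-frequency annuli used in practice are omitted, so this is only an illustrative outline.

```python
import numpy as np

def residual_power_spectrum(movie):
    """Toy spatiotemporal power spectrum of the residual variability.

    `movie` has shape (n_times, ny, nx). The normalization and windowing used
    in the paper are omitted; this only illustrates the sequence of steps:
    remove the light curve, remove the mean image, then Fourier transform in
    time and space.
    """
    light_curve = movie.sum(axis=(1, 2))
    normalized = movie / light_curve[:, None, None]    # remove integrated variability
    residual = normalized - normalized.mean(axis=0)    # remove time-averaged structure
    power = np.abs(np.fft.fftn(residual)) ** 2         # power vs temporal/spatial freq
    return np.fft.fftshift(power)

# Example with a random movie standing in for a GRMHD ray-traced image sequence.
rng = np.random.default_rng(2)
movie = rng.lognormal(size=(64, 32, 32))
print(residual_power_spectrum(movie).shape)
```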
  8. Abstract The Event Horizon Telescope (EHT) observed the compact radio source, Sagittarius A* (Sgr A*), in the Galactic Center on 2017 April 5–11 in the 1.3 mm wavelength band. At the same time, interferometric array data from the Atacama Large Millimeter/submillimeter Array and the Submillimeter Array were collected, providing Sgr A* light curves simultaneous with the EHT observations. These data sets, complementing the EHT very long baseline interferometry, are characterized by a cadence and signal-to-noise ratio previously unattainable for Sgr A* at millimeter wavelengths, and they allow for the investigation of source variability on timescales as short as a minute. While most of the light curves correspond to a low variability state of Sgr A*, the April 11 observations follow an X-ray flare and exhibit strongly enhanced variability. All of the light curves are consistent with a red-noise process, with a power spectral density (PSD) slope measured to be between −2 and −3 on timescales between 1 minute and several hours. Our results indicate a steepening of the PSD slope for timescales shorter than 0.3 hr. The spectral energy distribution is flat at 220 GHz, and there are no time lags between the 213 and 229 GHz frequency bands, suggesting low optical depth for the event horizon scale source. We characterize Sgr A*’s variability, highlighting the different behavior observed just after the X-ray flare, and use Gaussian process modeling to extract a decorrelation timescale and a PSD slope. We also investigate the systematic calibration uncertainties by analyzing data from independent data reduction pipelines.
    Free, publicly-accessible full text available May 1, 2023
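As a simplified stand-in for the Gaussian-process analysis described above, the sketch below estimates a power-law PSD slope from an evenly sampled light curve using a periodogram and a log-log fit. The cadence and frequency range are assumptions for illustration; a Gaussian-process approach would additionally handle gaps and measurement noise.

```python
import numpy as np

def psd_slope(times_hr, flux, fmin=None, fmax=None):
    """Estimate a power-law PSD slope from an evenly sampled light curve.

    Simple periodogram plus log-log least-squares fit; the paper instead uses
    Gaussian-process modeling, which also yields a decorrelation timescale.
    """
    dt = times_hr[1] - times_hr[0]
    flux = flux - flux.mean()
    freqs = np.fft.rfftfreq(len(flux), d=dt)[1:]     # cycles per hour, drop DC
    power = np.abs(np.fft.rfft(flux))[1:] ** 2
    mask = np.ones_like(freqs, dtype=bool)
    if fmin is not None:
        mask &= freqs >= fmin
    if fmax is not None:
        mask &= freqs <= fmax
    slope, _ = np.polyfit(np.log10(freqs[mask]), np.log10(power[mask]), 1)
    return slope

# Example: a synthetic red-noise-like light curve sampled once per minute.
rng = np.random.default_rng(3)
t = np.arange(0, 6, 1 / 60)                          # 6 hours at 1-minute cadence
lc = np.cumsum(rng.normal(size=t.size))              # random walk, PSD slope near -2
print(f"fitted PSD slope: {psd_slope(t, lc):.1f}")
```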