
Search results: All records where Creators/Authors contains "Bernardi, Gianni"


  1. ABSTRACT

    We present MeqSilhouette v2.0 (MeqSv2), fully polarimetric, time- and frequency-resolved synthetic data generation software for simulating millimetre (mm) wavelength very long baseline interferometry (VLBI) observations with heterogeneous arrays. Synthetic data are a critical component in understanding real observations, testing calibration and imaging algorithms, and predicting performance metrics of existing or proposed sites. MeqSv2 applies physics-based instrumental and atmospheric signal corruptions, constrained by empirically derived site and station parameters, to the data. The new version can apply instrumental polarization effects and various other spectrally resolved effects using the Radio Interferometry Measurement Equation (RIME) formalism, and produces synthetic data compatible with calibration pipelines designed to process real data. We demonstrate the various corruption capabilities of MeqSv2 using different arrays, with a focus on the effect of complex bandpass gains on closure quantities for the EHT at 230 GHz. We validate the frequency-dependent polarization leakage implementation by performing polarization self-calibration of synthetic EHT data using PolSolve. We also note potential applications to cm-wavelength VLBI array analysis and design, and outline future directions.
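
    As background, the RIME expresses each corrupted visibility as a chain of 2 × 2 Jones matrices sandwiching the source brightness matrix, V_pq = J_p B J_q^H. The sketch below (plain NumPy; the function name and values are illustrative stand-ins, not the MeqSv2 API) shows a complex station gain and a polarization-leakage D-term applied to an unpolarized source:

        import numpy as np

        # Minimal RIME sketch: V_pq = J_p B J_q^H, where each station's
        # Jones matrix J chains a complex gain G with a leakage D-term.
        def jones(gain, d_rl, d_lr):
            """Illustrative station Jones matrix: gain times D-term leakage."""
            G = np.diag([gain, gain])
            D = np.array([[1.0, d_rl], [d_lr, 1.0]])
            return G @ D

        B = np.eye(2)                                  # unpolarized 1 Jy source
        Jp = jones(1.05 * np.exp(1j * 0.3), 0.02 + 0.01j, -0.015j)
        Jq = jones(0.97 * np.exp(-1j * 0.1), 0.01, 0.02j)

        V_pq = Jp @ B @ Jq.conj().T                    # corrupted visibility
        print(V_pq)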

  2. ABSTRACT

    Combining the visibilities measured by an interferometer to form a cosmological power spectrum is a complicated process. In a delay-based analysis, the mapping between instrumental and cosmological space is not a one-to-one relation. Instead, neighbouring modes contribute to the power measured at one point, with their respective contributions encoded in the window functions. To better understand the power measured by an interferometer, we assess the impact of instrument characteristics and analysis choices on these window functions. Focusing on the Hydrogen Epoch of Reionization Array (HERA) as a case study, we find that long-baseline observations correspond to enhanced low-k tails of the window functions, which facilitate foreground leakage, whilst an informed choice of bandwidth and frequency taper can reduce said tails. With simple test cases and realistic simulations, we show that, apart from tracing mode mixing, the window functions help accurately reconstruct the power spectrum estimator of simulated visibilities. The window functions depend strongly on the beam chromaticity and less on its spatial structure: a Gaussian approximation, ignoring side lobes, is sufficient. Finally, we investigate the potential of asymmetric window functions, down-weighting the contribution of low-k power to avoid foreground leakage. The window functions presented here correspond to the latest HERA upper limits for the full Phase I data. They allow an accurate reconstruction of the power spectrum measured by the instrument and will be used in future analyses to confront theoretical models and data directly in cylindrical space.
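
    The role of the frequency taper can be seen in a toy calculation: in a delay analysis, the leakage between Fourier modes is governed by the squared Fourier transform of the applied taper, so a taper with low side lobes suppresses the low-k tails. The snippet below (illustrative only, not the HERA pipeline) contrasts a boxcar with a Blackman-Harris taper:

        import numpy as np
        from scipy.signal.windows import blackmanharris

        nchan, pad = 100, 16

        def sidelobe_profile(taper):
            """Normalized |FFT(taper)|^2, zero-padded for resolution."""
            spec = np.abs(np.fft.fft(taper, n=pad * len(taper))) ** 2
            return np.fft.fftshift(spec) / spec.max()

        for name, taper in [("boxcar", np.ones(nchan)),
                            ("blackman-harris", blackmanharris(nchan))]:
            prof = sidelobe_profile(taper)
            far = prof[len(prof) // 2 + 5 * pad:]   # skip the main lobe
            print(f"{name}: peak far side lobe "
                  f"{10 * np.log10(far.max()):.1f} dB")

    The boxcar's far side lobes sit tens of dB above the Blackman-Harris ones, which is the toy analogue of the enhanced low-k window-function tails discussed above.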

  3. Abstract

    We report the most sensitive upper limits to date on the 21 cm epoch of reionization power spectrum using 94 nights of observing with Phase I of the Hydrogen Epoch of Reionization Array (HERA). Using similar analysis techniques as in previously reported limits, we find at 95% confidence that Δ²(k = 0.34 h Mpc⁻¹) ≤ 457 mK² at z = 7.9 and that Δ²(k = 0.36 h Mpc⁻¹) ≤ 3496 mK² at z = 10.4, an improvement by a factor of 2.1 and 2.6, respectively. These limits are mostly consistent with thermal noise over a wide range of k after our data quality cuts, despite performing a relatively conservative analysis designed to minimize signal loss. Our results are validated with both statistical tests on the data and end-to-end pipeline simulations. We also report updated constraints on the astrophysics of reionization and the cosmic dawn. Using multiple independent modeling and inference techniques previously employed by the HERA Collaboration, we find that the intergalactic medium must have been heated above the adiabatic cooling limit at least as early as z = 10.4, ruling out a broad set of so-called “cold reionization” scenarios. If this heating is due to high-mass X-ray binaries during the cosmic dawn, as is generally believed, our result’s 99% credible interval excludes the local relationship between soft X-ray luminosity and star formation and thus requires heating driven by evolved low-metallicity stars.
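
    For orientation, the quoted Δ² is the dimensionless power, related to the power spectrum by Δ²(k) = k³ P(k) / (2π²). A quick arithmetic check of the z = 7.9 limit, using only the numbers quoted above:

        import numpy as np

        # Delta^2(k) = k^3 P(k) / (2 pi^2); invert the z = 7.9 limit.
        k = 0.34                    # h Mpc^-1
        delta2_limit = 457.0        # mK^2
        P_limit = delta2_limit * 2 * np.pi**2 / k**3
        print(f"P(k) <= {P_limit:.3g} mK^2 (h^-1 Mpc)^3")   # ~2.3e5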

  4. ABSTRACT

    We present a Bayesian jackknife test for assessing the probability that a data set contains biased subsets and, if so, which of the subsets are likely to be biased. The test can be used to assess the presence and likely source of statistical tension between different measurements of the same quantities in an automated manner. Under certain broadly applicable assumptions, the test is analytically tractable. We also provide an open-source code, chiborg, that performs both analytic and numerical computations of the test on general Gaussian-distributed data. After exploring the information-theoretic aspects of the test and its performance with an array of simulations, we apply it to data from the Hydrogen Epoch of Reionization Array (HERA) to assess whether different sub-seasons of observing can justifiably be combined to produce a deeper 21 cm power spectrum upper limit. We find that, with a handful of exceptions, the HERA data in question are statistically consistent and this decision is justified. We conclude by pointing out the wide applicability of this test, including to CMB experiments and the H0 tension.
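
    A minimal sketch of this kind of Gaussian model comparison (not the chiborg API; the priors and values here are illustrative assumptions): compare the marginal likelihood that all measurements share one mean against hypotheses in which a single measurement carries an extra bias. With Gaussian priors, every marginal likelihood is itself a Gaussian and can be evaluated in closed form:

        import numpy as np
        from scipy.stats import multivariate_normal

        sigma, tau, beta = 1.0, 5.0, 5.0       # noise, mean prior, bias prior
        x = np.array([0.3, -0.5, 0.1, 4.2])    # the last point looks biased
        n = len(x)
        base_cov = sigma**2 * np.eye(n) + tau**2 * np.ones((n, n))

        def log_evidence(cov):
            """Log marginal likelihood of x under a zero-mean Gaussian model."""
            return multivariate_normal(np.zeros(n), cov).logpdf(x)

        log_ev = {"all consistent": log_evidence(base_cov)}
        for j in range(n):
            cov_j = base_cov.copy()
            cov_j[j, j] += beta**2             # extra variance from the bias prior
            log_ev[f"point {j} biased"] = log_evidence(cov_j)

        # Posterior probabilities under equal prior odds:
        logs = np.array(list(log_ev.values()))
        probs = np.exp(logs - logs.max()); probs /= probs.sum()
        for name, p in zip(log_ev, probs):
            print(f"{name}: {p:.2f}")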

  5. Abstract

    Motivated by the desire for wide-field images with well-defined statistical properties for 21 cm cosmology, we implement an optimal mapping pipeline that computes a maximum likelihood estimator for the sky using the interferometric measurement equation. We demonstrate this “direct optimal mapping” with data from the Hydrogen Epoch of Reionization Array (HERA) Phase I observations. After validating the pipeline with simulated data, we develop a maximum likelihood figure of merit for comparing four sky models at 166 MHz with a bandwidth of 100 kHz. The HERA data agree with the GLEAM catalogs to <10%. After subtracting the GLEAM point sources, the HERA data discriminate between the different continuum sky models, providing most support for the model of Byrne et al. We report the computational cost of mapping the HERA Phase I data and project the cost for the full 320-antenna HERA array; both are feasible with a modern server. The algorithm is broadly applicable to other interferometers and is valid for wide-field and noncoplanar arrays.
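
    Schematically, with the measurement equation d = A m + n (visibilities d, sky pixels m, noise covariance N), the maximum likelihood map is m̂ = (AᴴN⁻¹A)⁻¹ AᴴN⁻¹ d. A toy version with a random stand-in for the real instrument response matrix (sizes and values are illustrative, not the paper's pipeline):

        import numpy as np

        rng = np.random.default_rng(0)
        nvis, npix, noise_var = 200, 20, 0.1
        A = rng.normal(size=(nvis, npix)) + 1j * rng.normal(size=(nvis, npix))
        m_true = rng.normal(size=npix)
        noise = np.sqrt(noise_var / 2) * (rng.normal(size=nvis)
                                          + 1j * rng.normal(size=nvis))
        d = A @ m_true + noise                 # measurement equation d = Am + n

        # Maximum likelihood map: m_hat = (A^H N^-1 A)^-1 A^H N^-1 d.
        Ninv = np.eye(nvis) / noise_var        # diagonal noise covariance
        m_hat = np.linalg.solve(A.conj().T @ Ninv @ A,
                                A.conj().T @ Ninv @ d).real
        print("rms map error:", np.sqrt(np.mean((m_hat - m_true) ** 2)))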

  6. Abstract

    We report upper limits on the Epoch of Reionization 21 cm power spectrum at redshifts 7.9 and 10.4 with 18 nights of data (∼36 hr of integration) from Phase I of the Hydrogen Epoch of Reionization Array (HERA). The Phase I data show evidence for systematics that can be largely suppressed with systematic models down to a dynamic range of ∼10⁹ with respect to the peak foreground power. This yields a 95% confidence upper limit on the 21 cm power spectrum of Δ₂₁² ≤ (30.76)² mK² at k = 0.192 h Mpc⁻¹ at z = 7.9, and also Δ₂₁² ≤ (95.74)² mK² at k = 0.256 h Mpc⁻¹ at z = 10.4. At z = 7.9, these limits are the most sensitive to date by over an order of magnitude. While we find evidence for residual systematics at low line-of-sight Fourier k∥ modes, at high k∥ modes we find our data to be largely consistent with thermal noise, an indicator that the system could benefit from deeper integrations. The observed systematics could be due to radio frequency interference, cable subreflections, or residual instrumental cross-coupling, and warrant further study. This analysis emphasizes algorithms that have minimal inherent signal loss, although we do perform a careful accounting in a companion paper of the small forms of loss or bias associated with the pipeline. Overall, these results are a promising first step in the development of a tuned, instrument-specific analysis pipeline for HERA, particularly as Phase II construction is completed en route to reaching the full sensitivity of the experiment.
  7. Abstract

    We describe the validation of the HERA Phase I software pipeline by a series of modular tests, building up to an end-to-end simulation. The philosophy of this approach is to validate the software and algorithms used in the Phase I upper-limit analysis on wholly synthetic data satisfying the assumptions of that analysis, not addressing whether the actual data meet these assumptions. We discuss the organization of this validation approach, the specific modular tests performed, and the construction of the end-to-end simulations. We explicitly discuss the limitations in scope of the current simulation effort. With mock visibility data generated from a known analytic power spectrum and a wide range of realistic instrumental effects and foregrounds, we demonstrate that the current pipeline produces power spectrum estimates that are consistent with known analytic inputs to within thermal noise levels (at the 2σ level) for k > 0.2 h Mpc⁻¹ for both bands and fields considered. Our input spectrum is intentionally amplified to enable a strong “detection” at k ∼ 0.2 h Mpc⁻¹, at the level of ∼25σ, with foregrounds dominating on larger scales and thermal noise dominating at smaller scales. Our pipeline is able to detect this amplified input signal after suppressing foregrounds with a dynamic range (foreground-to-noise ratio) of ≳10⁷. Our validation test suite uncovered several sources of scale-independent signal loss throughout the pipeline, whose amplitude is well characterized and accounted for in the final estimates. We conclude with a discussion of the steps required for the next round of data analysis.
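
    A toy version of the simplest such modular test (illustrative only, not the validation suite itself): delay-transform noise-only mock visibilities and check that the recovered power matches the analytic thermal-noise expectation, confirming the estimator is unbiased at the noise floor:

        import numpy as np

        rng = np.random.default_rng(1)
        nchan, ntrials, sigma = 128, 4000, 2.0      # channels, mocks, noise rms

        vis = (rng.normal(scale=sigma / np.sqrt(2), size=(ntrials, nchan))
               + 1j * rng.normal(scale=sigma / np.sqrt(2), size=(ntrials, nchan)))

        vtilde = np.fft.fft(vis, axis=1)            # delay transform
        p_hat = np.mean(np.abs(vtilde)**2, axis=0)  # power per delay mode
        print("measured / expected:", p_hat.mean() / (nchan * sigma**2))  # ~1.0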