
Search for: All records

Creators/Authors contains: "Thyagarajan, Nithyanandan"



  1. Abstract

    Next-generation aperture arrays are expected to consist of hundreds to thousands of antenna elements with substantial digital signal processing to handle large operating bandwidths of a few tens to hundreds of MHz. Conventionally, FX correlators are used as the primary signal processing unit of the interferometer. These correlators have computational costs that scale as $\mathcal{O}(N^2)$ for large arrays. An alternative imaging approach is implemented in the E-field Parallel Imaging Correlator (EPIC) that was recently deployed on the Long Wavelength Array station at the Sevilleta National Wildlife Refuge (LWA-SV) in New Mexico. EPIC uses a novel architecture that produces electric field or intensity images of the sky at the angular resolution of the array with full or partial polarization and the full spectral resolution of the channelizer. By eliminating the intermediate cross-correlation data products, the computational costs can be significantly lowered in comparison to a conventional FX or XF correlator from $\mathcal{O}(N^2)$ to $\mathcal{O}(N \log N)$ for dense (but otherwise arbitrary) array layouts. EPIC can also lower the output data rates by directly yielding polarimetric image products for science analysis. We have optimized EPIC and have now commissioned it at LWA-SV as a commensal all-sky imaging back-end that can potentially detect and localize sources of impulsive radio emission on millisecond timescales. In this article, we review the architecture of EPIC, describe code optimizations that improve performance, and present initial validations from commissioning observations. Comparisons between EPIC measurements and simultaneous beam-formed observations of bright sources show spectral-temporal structures in good agreement.

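The gridding-plus-FFT imaging that EPIC performs can be illustrated with a toy 1D sketch (hypothetical array layout and numpy only, not the LWA-SV implementation): gridding the antenna voltages and FFTing each time sample yields, after squaring and averaging, the same intensity image an FX correlator would produce from the full O(N²) visibility matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
ngrid, ntime = 16, 200
ants = np.array([0, 1, 3, 5, 6, 10])          # antenna grid positions (1D toy)
e = rng.normal(size=(ants.size, ntime)) + 1j * rng.normal(size=(ants.size, ntime))

# EPIC-style direct imaging: grid E-fields, FFT, square, time-average
g = np.zeros((ngrid, ntime), complex)
g[ants] = e                                   # O(N) gridding per time sample
efield_img = np.fft.fft(g, axis=0)            # O(Ng log Ng) per time sample
I_epic = (np.abs(efield_img) ** 2).mean(axis=1)

# FX-style: correlate all pairs O(N^2), grid visibilities, FFT once
V = e @ e.conj().T / ntime                    # visibility matrix, incl. autos
vis_grid = np.zeros(ngrid, complex)
for a in range(ants.size):
    for b in range(ants.size):
        vis_grid[(ants[a] - ants[b]) % ngrid] += V[a, b]
I_fx = np.fft.fft(vis_grid).real

print(np.allclose(I_epic, I_fx))              # True: the two paths agree
```

The two estimates are mathematically identical; the saving comes from replacing the pairwise correlation with a gridding step and a single FFT per time sample.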

  2. Abstract

    Combining the visibilities measured by an interferometer to form a cosmological power spectrum is a complicated process. In a delay-based analysis, the mapping between instrumental and cosmological space is not a one-to-one relation. Instead, neighbouring modes contribute to the power measured at one point, with their respective contributions encoded in the window functions. To better understand the power measured by an interferometer, we assess the impact of instrument characteristics and analysis choices on these window functions. Focusing on the Hydrogen Epoch of Reionization Array (HERA) as a case study, we find that long-baseline observations correspond to enhanced low-k tails of the window functions, which facilitate foreground leakage, whilst an informed choice of bandwidth and frequency taper can reduce said tails. With simple test cases and realistic simulations, we show that, apart from tracing mode mixing, the window functions help accurately reconstruct the power spectrum estimator of simulated visibilities. The window functions depend strongly on the beam chromaticity and less on its spatial structure – a Gaussian approximation, ignoring side lobes, is sufficient. Finally, we investigate the potential of asymmetric window functions, down-weighting the contribution of low-k power to avoid foreground leakage. The window functions presented here correspond to the latest HERA upper limits for the full Phase I data. They allow an accurate reconstruction of the power spectrum measured by the instrument and will be used in future analyses to confront theoretical models and data directly in cylindrical space.

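The taper's effect on the window-function tails can be sketched generically (a signal-processing toy, with numpy's Blackman taper standing in for the Blackman–Harris used in HERA analyses): the delay-space window of a band is set by the Fourier transform of the frequency taper, and a smooth taper buys much steeper tails, i.e. less foreground leakage into high-delay modes.

```python
import numpy as np

nfreq = 128
tapers = {"tophat": np.ones(nfreq), "blackman": np.blackman(nfreq)}

windows = {}
for name, t in tapers.items():
    # oversampled power response in delay space, peak-normalized
    w = np.abs(np.fft.fftshift(np.fft.fft(t, n=8 * nfreq))) ** 2
    windows[name] = w / w.max()

# compare leakage well away from the central delay mode
centre = windows["tophat"].size // 2
tail = slice(centre + 100, centre + 200)
leak_tophat = windows["tophat"][tail].max()
leak_blackman = windows["blackman"][tail].max()
print(leak_blackman < leak_tophat)     # True: smooth taper suppresses tails
```

The trade-off, as in any windowed spectral estimate, is a modestly wider main lobe in exchange for sidelobes that are orders of magnitude lower.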
  3. Abstract

    We report the most sensitive upper limits to date on the 21 cm epoch of reionization power spectrum using 94 nights of observing with Phase I of the Hydrogen Epoch of Reionization Array (HERA). Using similar analysis techniques as in previously reported limits, we find at 95% confidence that Δ²(k = 0.34 h Mpc⁻¹) ≤ 457 mK² at z = 7.9 and that Δ²(k = 0.36 h Mpc⁻¹) ≤ 3496 mK² at z = 10.4, an improvement by a factor of 2.1 and 2.6, respectively. These limits are mostly consistent with thermal noise over a wide range of k after our data quality cuts, despite performing a relatively conservative analysis designed to minimize signal loss. Our results are validated with both statistical tests on the data and end-to-end pipeline simulations. We also report updated constraints on the astrophysics of reionization and the cosmic dawn. Using multiple independent modeling and inference techniques previously employed by the HERA Collaboration, we find that the intergalactic medium must have been heated above the adiabatic cooling limit at least as early as z = 10.4, ruling out a broad set of so-called “cold reionization” scenarios. If this heating is due to high-mass X-ray binaries during the cosmic dawn, as is generally believed, our result’s 99% credible interval excludes the local relationship between soft X-ray luminosity and star formation and thus requires heating driven by evolved low-metallicity stars.

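As a unit-conversion aside (a standard cosmology convention, not part of the paper's pipeline): the dimensionless Δ²(k) quoted above relates to the dimensionful power spectrum by Δ²(k) = k³P(k)/2π², so the z = 7.9 limit can be restated in P(k) units:

```python
import numpy as np

# standard convention: Delta^2(k) = k^3 P(k) / (2 pi^2)
def delta2_to_pk(delta2_mK2, k_hMpc):
    """Convert a dimensionless Delta^2 limit [mK^2] at k [h Mpc^-1]
    into P(k) in mK^2 (h^-1 Mpc)^3."""
    return 2 * np.pi**2 * delta2_mK2 / k_hMpc**3

pk_limit = delta2_to_pk(457.0, 0.34)
print(round(pk_limit))                 # roughly 2.3e5 mK^2 (h^-1 Mpc)^3
```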

  4. Abstract

    We present a Bayesian jackknife test for assessing the probability that a data set contains biased subsets, and, if so, which of the subsets are likely to be biased. The test can be used to assess the presence and likely source of statistical tension between different measurements of the same quantities in an automated manner. Under certain broadly applicable assumptions, the test is analytically tractable. We also provide an open-source code, chiborg, that performs both analytic and numerical computations of the test on general Gaussian-distributed data. After exploring the information-theoretic aspects of the test and its performance with an array of simulations, we apply it to data from the Hydrogen Epoch of Reionization Array (HERA) to assess whether different sub-seasons of observing can justifiably be combined to produce a deeper 21 cm power spectrum upper limit. We find that, with a handful of exceptions, the HERA data in question are statistically consistent and this decision is justified. We conclude by pointing out the wide applicability of this test, including to CMB experiments and the H₀ tension.
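A heavily simplified sketch of the idea (a hypothetical model with a known true value and a Gaussian bias prior, not the chiborg code itself): each hypothesis H_j says subset j carries an extra Gaussian bias, which, marginalized analytically, simply inflates that subset's variance; comparing the resulting evidences yields a posterior over which subset, if any, is biased.

```python
import numpy as np

def log_norm(x, var):
    """Log-density of a zero-mean Gaussian with variance `var`."""
    return -0.5 * (np.log(2 * np.pi * var) + x**2 / var)

def jackknife_posteriors(x, sigma=1.0, sigma_b=5.0):
    """Posterior over {H0: no bias, H_j: subset j biased} for data x
    with known noise sigma and bias prior width sigma_b."""
    n = x.size
    logev = [log_norm(x, sigma**2).sum()]          # H0: all consistent
    for j in range(n):                             # H_j: inflate variance of x[j]
        var = np.full(n, sigma**2)
        var[j] += sigma_b**2
        logev.append(log_norm(x, var).sum())
    logev = np.array(logev)
    p = np.exp(logev - logev.max())
    return p / p.sum()

# hypothetical measurements of the same quantity; entry 2 is biased by +8 sigma
x = np.array([0.3, -0.5, 8.0, 0.1, -0.2, 0.4])
post = jackknife_posteriors(x)
print(post.argmax())                               # 3, i.e. hypothesis H_2
```

The actual test generalizes this to arbitrary Gaussian data with correlated errors and multiple simultaneously biased subsets.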

  5. Abstract

    The recent demonstration of a real-time direct imaging radio interferometry correlator represents a new capability in radio astronomy. However, wide-field imaging with this method is challenging since wide-field effects and array non-coplanarity degrade image quality if not compensated for. Here, we present an alternative direct imaging correlation strategy using a direct Fourier transform (DFT), modelled as a linear operator facilitating a matrix multiplication between the DFT matrix and a vector of the electric fields from each antenna. This offers perfect correction for wide field and non-coplanarity effects. When implemented with data from the Long Wavelength Array (LWA), it offers comparable computational performance to previously demonstrated direct imaging techniques, despite having a theoretically higher floating point cost. It also has additional benefits, such as imaging sparse arrays and control over which sky coordinates are imaged, allowing variable pixel placement across an image. It is in practice a highly flexible and efficient method of direct radio imaging when implemented on suitable arrays. A functioning electric field direct imaging architecture using the DFT is presented, alongside an exploration of techniques for wide-field imaging similar to those in visibility-based imaging, and an explanation of why they are not well suited to imaging directly with the digitized electric field data. The DFT imaging method is demonstrated on real data from the LWA telescope, alongside a detailed performance analysis, as well as an exploration of its applicability to other arrays.
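The operator view can be sketched in a few lines (toy numbers and a hypothetical array, not the LWA pipeline): build a DFT matrix from the antenna positions, including the w·(n−1) phase term for exact non-coplanar correction, and image each time sample with one matrix-vector product; pixels may be placed at arbitrary direction cosines.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ant, n_t = 32, 500
xyz = rng.uniform(-20, 20, size=(n_ant, 3))    # positions in wavelengths
xyz[:, 2] = rng.uniform(0, 2, n_ant)           # a non-coplanar array

# arbitrary pixel placement: (l, m) need not lie on a regular grid
l = np.array([0.0, 0.3, -0.5])
m = np.array([0.0, 0.1, 0.2])
n = np.sqrt(1 - l**2 - m**2)

# DFT matrix: phase for pixel p, antenna a, with exact w-term
phase = (l[:, None] * xyz[:, 0] + m[:, None] * xyz[:, 1]
         + (n[:, None] - 1) * xyz[:, 2])
A = np.exp(2j * np.pi * phase)                 # shape (n_pix, n_ant)

# simulate a unit point source at pixel 1 plus receiver noise
src = np.exp(2j * np.pi * rng.uniform(size=n_t))
e = (src * np.exp(-2j * np.pi * phase[1])[:, None]
     + 0.1 * (rng.normal(size=(n_ant, n_t)) + 1j * rng.normal(size=(n_ant, n_t))))

# one matrix-vector product per time sample, then square and average
img = (np.abs(A @ e) ** 2).mean(axis=1) / n_ant**2
print(img.argmax())                            # brightest pixel is 1
```

The cost per time sample is O(N_pix · N_ant) rather than O(N_g log N_g), which is why this pays off for sparse arrays or when only selected sky coordinates are needed.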
  6. Abstract

    Motivated by the desire for wide-field images with well-defined statistical properties for 21 cm cosmology, we implement an optimal mapping pipeline that computes a maximum likelihood estimator for the sky using the interferometric measurement equation. We demonstrate this “direct optimal mapping” with data from the Hydrogen Epoch of Reionization Array (HERA) Phase I observations. After validating the pipeline with simulated data, we develop a maximum likelihood figure-of-merit for comparing four sky models at 166 MHz with a bandwidth of 100 kHz. The HERA data agree with the GLEAM catalogs to < 10%. After subtracting the GLEAM point sources, the HERA data discriminate between the different continuum sky models, providing most support for the model of Byrne et al. We report the computational cost for mapping the HERA Phase I data and project the cost for the HERA 320-antenna data; both are feasible with a modern server. The algorithm is broadly applicable to other interferometers and is valid for wide-field and noncoplanar arrays.
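A minimal sketch of the estimator (toy dimensions and a random operator, not the HERA pipeline): visibilities obey the measurement equation d = A s + n, and the maximum-likelihood sky is ŝ = (AᴴN⁻¹A)⁻¹AᴴN⁻¹d, whose covariance (AᴴN⁻¹A)⁻¹ is what gives the map its well-defined statistical properties.

```python
import numpy as np

rng = np.random.default_rng(0)
n_vis, n_pix = 200, 20

A = rng.normal(size=(n_vis, n_pix)) + 1j * rng.normal(size=(n_vis, n_pix))
s_true = rng.normal(size=n_pix)                    # toy sky
noise_var = np.full(n_vis, 0.01)
N_inv = np.diag(1.0 / noise_var)                   # inverse noise covariance

# simulated visibilities: d = A s + n
d = A @ s_true + np.sqrt(noise_var) * (
    rng.normal(size=n_vis) + 1j * rng.normal(size=n_vis)) / np.sqrt(2)

lhs = A.conj().T @ N_inv @ A                       # A^H N^-1 A
rhs = A.conj().T @ N_inv @ d                       # A^H N^-1 d
s_hat = np.linalg.solve(lhs, rhs).real             # ML sky estimate

print(np.max(np.abs(s_hat - s_true)))              # small: sky is recovered
```

In the real problem A encodes the primary beam and fringe phases, so lhs is large and structured; the paper's computational-cost accounting is essentially the cost of forming and solving this system.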

  7. Abstract

    We report upper limits on the Epoch of Reionization 21 cm power spectrum at redshifts 7.9 and 10.4 with 18 nights of data (∼36 hr of integration) from Phase I of the Hydrogen Epoch of Reionization Array (HERA). The Phase I data show evidence for systematics that can be largely suppressed with systematic models down to a dynamic range of ∼10⁹ with respect to the peak foreground power. This yields a 95% confidence upper limit on the 21 cm power spectrum of Δ²₂₁ ≤ (30.76)² mK² at k = 0.192 h Mpc⁻¹ at z = 7.9, and also Δ²₂₁ ≤ (95.74)² mK² at k = 0.256 h Mpc⁻¹ at z = 10.4. At z = 7.9, these limits are the most sensitive to date by over an order of magnitude. While we find evidence for residual systematics at low line-of-sight Fourier k∥ modes, at high k∥ modes we find our data to be largely consistent with thermal noise, an indicator that the system could benefit from deeper integrations. The observed systematics could be due to radio frequency interference, cable subreflections, or residual instrumental cross-coupling, and warrant further study. This analysis emphasizes algorithms that have minimal inherent signal loss, although we do perform a careful accounting in a companion paper of the small forms of loss or bias associated with the pipeline. Overall, these results are a promising first step in the development of a tuned, instrument-specific analysis pipeline for HERA, particularly as Phase II construction is completed en route to reaching the full sensitivity of the experiment.