
Title: SYMBA: An end-to-end VLBI synthetic data generation pipeline: Simulating Event Horizon Telescope observations of M 87
Context. Realistic synthetic observations of theoretical source models are essential for our understanding of real observational data. In using synthetic data, one can verify the extent to which source parameters can be recovered and evaluate how various data corruption effects can be calibrated. These studies are particularly important when proposing observations of new sources, when characterizing the capabilities of new or upgraded instruments, and when verifying model-based theoretical predictions in a direct comparison with observational data.

Aims. We present the SYnthetic Measurement creator for long Baseline Arrays (SYMBA), a novel synthetic data generation pipeline for Very Long Baseline Interferometry (VLBI) observations. SYMBA takes into account several realistic atmospheric, instrumental, and calibration effects.

Methods. We used SYMBA to create synthetic observations for the Event Horizon Telescope (EHT), a millimetre VLBI array, which has recently captured the first image of a black hole shadow. After testing SYMBA with simple source and corruption models, we studied the importance of including all corruption and calibration effects, compared to the addition of thermal noise only. Using synthetic data based on two example general relativistic magnetohydrodynamics (GRMHD) model images of M 87, we performed case studies to assess the image quality that can be obtained with the current and future EHT array for different weather conditions.

Results. Our synthetic observations show that the effects of atmospheric and instrumental corruptions on the measured visibilities are significant. Despite these effects, we demonstrate how the overall structure of our GRMHD source models can be recovered robustly with the EHT2017 array after performing calibration steps, which include fringe fitting, a priori amplitude and network calibration, and self-calibration. With the planned addition of new stations to the EHT array in the coming years, images could be reconstructed with higher angular resolution and dynamic range. In our case study, these improvements allowed for a distinction between a thermal and a non-thermal GRMHD model based on salient features in reconstructed images.
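As a concrete illustration of the corrupt-then-calibrate workflow summarized above, the following minimal Python sketch uses the open-source eht-imaging (ehtim) library rather than SYMBA itself; the file names and observing parameters are illustrative assumptions.

    import ehtim as eh

    # Load a source model image and an EHT station table (illustrative file names).
    im = eh.image.load_fits('grmhd_m87_model.fits')
    arr = eh.array.load_txt('EHT2017.txt')

    # Generate synthetic visibilities with thermal noise plus uncalibrated
    # station gains and phases, a simplified stand-in for the atmospheric and
    # instrumental corruptions that SYMBA applies before calibration.
    obs = im.observe(arr,
                     tint=10,             # integration time per sample [s]
                     tadv=600,            # time between samples [s]
                     tstart=0, tstop=24,  # observation window [h]
                     bw=2e9,              # bandwidth [Hz]
                     ampcal=False,        # leave amplitude gains uncalibrated
                     phasecal=False,      # leave station phases uncalibrated
                     add_th_noise=True)   # add baseline-dependent thermal noise

    # Fringe fitting, a priori/network calibration, and self-calibration would
    # then be applied to `obs` before imaging, as described in the paper.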
Authors:
Award ID(s):
1903847 1852617 1248097 1816420 1716327 1743747
Publication Date:
NSF-PAR ID:
10144536
Journal Name:
Astronomy & Astrophysics
Volume:
636
Page Range or eLocation-ID:
A5
ISSN:
0004-6361
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract The black hole images obtained with the Event Horizon Telescope (EHT) are expected to be variable at the dynamical timescale near their horizons. For the black hole at the center of the M87 galaxy, this timescale (5–61 days) is comparable to the 6 day extent of the 2017 EHT observations. Closure phases along baseline triangles are robust interferometric observables that are sensitive to the expected structural changes of the images but are free of station-based atmospheric and instrumental errors. We explored the day-to-day variability in closure-phase measurements on all six linearly independent nontrivial baseline triangles that can be formed from the 2017 observations. We showed that three triangles exhibit very low day-to-day variability, with a dispersion of ∼3°–5°. The only triangles that exhibit substantially higher variability (∼90°–180°) are the ones with baselines that cross the visibility amplitude minima on the u–v plane, as expected from theoretical modeling. We used two sets of general relativistic magnetohydrodynamic simulations to explore the dependence of the predicted variability on various black hole and accretion-flow parameters. We found that changing the magnetic field configuration, electron temperature model, or black hole spin has a marginal effect on the model consistency with the observed level of variability. On the other hand, the most discriminating image characteristic of models is the fractional width of the bright ring of emission. Models that best reproduce the observed small level of variability are characterized by thin ring-like images with structures dominated by gravitational lensing effects and thus least affected by turbulence in the accreting plasmas.
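To make the cancellation property concrete: a measured baseline phase between stations i and j picks up the station error difference e_i − e_j, so the three errors drop out of the sum around a triangle. The toy numbers below are illustrative only.

    import numpy as np

    rng = np.random.default_rng(0)

    # True visibility phases on the three baselines of a triangle (1-2, 2-3, 3-1)
    # and station-based phase errors at stations 1, 2, 3 (all made-up numbers).
    phi12, phi23, phi31 = rng.uniform(-np.pi, np.pi, 3)
    e1, e2, e3 = rng.uniform(-np.pi, np.pi, 3)

    # Each measured baseline phase picks up the difference of two station errors.
    m12 = phi12 + (e1 - e2)
    m23 = phi23 + (e2 - e3)
    m31 = phi31 + (e3 - e1)

    # In the sum around the triangle the station terms cancel exactly.
    true_cp = np.angle(np.exp(1j * (phi12 + phi23 + phi31)))
    meas_cp = np.angle(np.exp(1j * (m12 + m23 + m31)))
    print(np.isclose(true_cp, meas_cp))  # True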
  2. ABSTRACT Recent observations with mm very long baseline interferometry (mm-VLBI) and near-infrared (NIR) interferometry provide mm images and NIR centroid proper motion for Sgr A*. Of particular interest are the NIR flares that have more than an order of magnitude higher flux density than the quiescent state. Here, we model the flares using time-dependent, axisymmetric, general relativistic magnetohydrodynamic (GRMHD) simulations with an electron distribution function that includes a small, variable, non-thermal component motivated by magnetic reconnection models. The models simultaneously match the observed mm mean flux density, mm image size, NIR quiescent flux density, NIR flare flux density, and NIR spectral slope. They also provide a better fit to the observed NIR flux density probability density function than previously reported models by reproducing the power-law tail at high flux density, though with some discrepancy at low flux density. Further, our modelled NIR image centroid shows very little movement: centroid excursions of more than 10 μas (the resolution of GRAVITY) are rare and uncorrelated with flux.
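A schematic version of such a "thermal core plus small non-thermal tail" electron distribution function can be written down directly; the temperature, tail fraction, power-law index, and cutoffs below are illustrative assumptions, not the values used in the paper.

    import numpy as np
    from scipy.special import kn  # modified Bessel function of the second kind

    def maxwell_juttner(gamma, theta_e):
        """Relativistic thermal eDF dn/dgamma, normalized to unit density."""
        beta = np.sqrt(1.0 - 1.0 / gamma**2)
        return gamma**2 * beta * np.exp(-gamma / theta_e) / (theta_e * kn(2, 1.0 / theta_e))

    def hybrid_edf(gamma, theta_e=10.0, eta=0.01, p=3.5,
                   gamma_min=25.0, gamma_max=1e5):
        """Thermal core plus a power-law tail holding a fraction eta of electrons."""
        tail = np.where((gamma >= gamma_min) & (gamma <= gamma_max),
                        gamma**(-p), 0.0)
        # Analytic normalization of the truncated power law to unit number.
        tail_norm = (gamma_min**(1.0 - p) - gamma_max**(1.0 - p)) / (p - 1.0)
        return (1.0 - eta) * maxwell_juttner(gamma, theta_e) + eta * tail / tail_norm

    gamma = np.logspace(0.01, 5, 500)
    f = hybrid_edf(gamma)  # dn/dgamma, ready for synchrotron emissivity integrals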
  3. Abstract We introduce a new Markov Chain Monte Carlo (MCMC) algorithm with parallel tempering for fitting theoretical models of horizon-scale images of black holes to the interferometric data from the Event Horizon Telescope (EHT). The algorithm implements forms of the noise distribution in the data that are accurate for all signal-to-noise ratios. In addition to being trivially parallelizable, the algorithm is optimized for high performance, achieving 1 million MCMC chain steps in under 20 s on a single processor. We use synthetic data for the 2017 EHT coverage of M87 that are generated based on analytic as well as General Relativistic Magnetohydrodynamic (GRMHD) model images to explore several potential sources of biases in fitting models to sparse interferometric data. We demonstrate that a very small number of data points that lie near salient features of the interferometric data exert disproportionate influence on the inferred model parameters. We also show that the preferred orientations of the EHT baselines introduce significant biases in the inference of the orientation of the model images. Finally, we discuss strategies that help identify the presence and severity of such biases in realistic applications.
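The parallel-tempering idea can be sketched in a few lines of generic Python: chains at several inverse temperatures take Metropolis steps and occasionally swap states, letting hot chains carry walkers across posterior modes. The toy bimodal target below stands in for a model-vs-data likelihood; this is not the paper's optimized implementation.

    import numpy as np

    rng = np.random.default_rng(1)

    def log_post(x):
        # Toy bimodal log-posterior with modes at x = -3 and x = +3.
        return np.logaddexp(-0.5 * (x - 3.0)**2, -0.5 * (x + 3.0)**2)

    betas = np.array([1.0, 0.5, 0.25, 0.1])  # inverse temperatures
    x = np.zeros(len(betas))                 # one walker per temperature
    samples = []

    for step in range(20000):
        # Metropolis update within each tempered chain.
        for i, b in enumerate(betas):
            prop = x[i] + rng.normal(scale=1.0)
            if np.log(rng.random()) < b * (log_post(prop) - log_post(x[i])):
                x[i] = prop
        # Propose a swap between a random pair of adjacent temperatures.
        i = rng.integers(len(betas) - 1)
        dlog = (betas[i] - betas[i + 1]) * (log_post(x[i + 1]) - log_post(x[i]))
        if np.log(rng.random()) < dlog:
            x[i], x[i + 1] = x[i + 1], x[i]
        samples.append(x[0])  # keep only the beta = 1 (untempered) chain

    samples = np.asarray(samples)  # posterior draws exploring both modes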
  4. ABSTRACT

    Machine learning is becoming a popular tool to quantify galaxy morphologies and identify mergers. However, this technique relies on using an appropriate set of training data to be successful. By combining hydrodynamical simulations, synthetic observations, and convolutional neural networks (CNNs), we quantitatively assess how realistic simulated galaxy images must be in order to reliably classify mergers. Specifically, we compare the performance of CNNs trained with two types of galaxy images, stellar maps and dust-inclusive radiatively transferred images, each with three levels of observational realism: (1) no observational effects (idealized images), (2) realistic sky and point spread function (semirealistic images), and (3) insertion into a real sky image (fully realistic images). We find that networks trained on either idealized or semireal images have poor performance when applied to survey-realistic images. In contrast, networks trained on fully realistic images achieve 87.1 per cent classification performance. Importantly, the level of realism in the training images is much more important than whether the images included radiative transfer, or simply used the stellar maps (87.1 per cent compared to 79.6 per cent accuracy, respectively). Therefore, one can avoid the large computational and storage cost of running radiative transfer with a relatively modest compromise in classification performance. Making photometry-based networks insensitive to colour incurs a very mild penalty to performance with survey-realistic data (86.0 per cent with r-only compared to 87.1 per cent with gri). This result demonstrates that while colour can be exploited by colour-sensitive networks, it is not necessary to achieve high accuracy and so can be avoided if desired. We provide the public release of our statistical observational realism suite, RealSim, as a companion to this paper.

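For orientation, a merger classifier of the general kind described above can be as simple as the following PyTorch sketch; the architecture, image size, and channel counts are illustrative assumptions and do not reproduce the paper's networks.

    import torch
    import torch.nn as nn

    class MergerCNN(nn.Module):
        def __init__(self, in_channels=3):  # 3 for gri images, 1 for r-only
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, 2)  # merger vs. non-merger logits

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    # Training on "fully realistic" images (simulated galaxies inserted into
    # real sky cutouts) is what the study finds matters most for performance.
    model = MergerCNN(in_channels=3)
    logits = model(torch.randn(8, 3, 128, 128))  # batch of 8 mock gri cutouts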
  5. The imaging fidelity of the Event Horizon Telescope (EHT) is currently determined by its sparse baseline coverage. In particular, EHT coverage is dominated by long baselines, and is highly sensitive to atmospheric conditions and loss of sites between experiments. The limited short/mid-range baselines especially affect the imaging process, hindering the recovery of more extended features in the image. We present an algorithmic contingency for the absence of well-constrained short baselines in the imaging of compact sources, such as the supermassive black holes observed with the EHT. This technique enforces a specific second moment on the reconstructed image in the form of a size constraint, which corresponds to the curvature of the measured visibility function at zero baseline. The method enables the recovery of information lost in gaps of the baseline coverage on short baselines and allows correction of systematic amplitude offsets for the stations providing the short-baseline measurements in the observation. The regularization can use historical source size measurements to constrain the second moment of the reconstructed image to match the observed size. We additionally show that a characteristic size can be derived from available short-baseline measurements, extrapolated from other wavelengths, or estimated without complementary size constraints through parameter searches. We demonstrate the capabilities of this method for both static and movie reconstructions of variable sources.
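The quantity being constrained is the 2×2 second central moment of the image, which (via the Fourier relation) sets the curvature of the visibility amplitude at zero baseline. A minimal sketch of measuring it and penalizing deviations from a target size, on an assumed pixel grid, is given below.

    import numpy as np

    def second_moment(img, fov_uas=200.0):
        """2x2 second central moment matrix of a square image (units: uas^2)."""
        n = img.shape[0]
        coords = np.linspace(-fov_uas / 2, fov_uas / 2, n)
        xx, yy = np.meshgrid(coords, coords)
        flux = img.sum()
        cx, cy = (img * xx).sum() / flux, (img * yy).sum() / flux  # centroid
        dx, dy = xx - cx, yy - cy
        return np.array([[(img * dx * dx).sum(), (img * dx * dy).sum()],
                         [(img * dx * dy).sum(), (img * dy * dy).sum()]]) / flux

    def moment_penalty(img, target):
        """Quadratic penalty pushing the image second moment toward `target`."""
        return np.sum((second_moment(img) - target)**2)

    # Example: a Gaussian blob whose measured moment matches its construction.
    n, fov = 128, 200.0
    c = np.linspace(-fov / 2, fov / 2, n)
    xx, yy = np.meshgrid(c, c)
    img = np.exp(-(xx**2 + yy**2) / (2 * 20.0**2))  # sigma = 20 uas
    print(second_moment(img, fov))  # ~ diag(400, 400) uas^2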