


Title: STRIDES: automated uniform models for 30 quadruply imaged quasars
ABSTRACT

Gravitational time delays provide a powerful one-step measurement of H0, independent of all other probes. One key ingredient in time-delay cosmography is a set of high-accuracy lens models. These are currently expensive to obtain, in terms of both computing and investigator time (10^5–10^6 CPU hours and ~0.5–1 yr, respectively). Major improvements in modelling speed are therefore necessary to exploit the large number of lenses forecast to be discovered over the current decade. To bypass this roadblock, we develop an automated modelling pipeline and apply it to a sample of 31 lens systems observed by the Hubble Space Telescope in multiple bands. Our automated pipeline can derive models for 30/31 lenses with a few hours of human time and <100 CPU hours of computing time for a typical system. For each lens, we provide measurements of key parameters and predictions of magnifications as well as time delays for the multiple images. We characterize the cosmography-readiness of our models using the stability of differences in the Fermat potential (proportional to the time delay) with respect to modelling choices. We find that for 10/30 lenses our models are cosmography grade or nearly cosmography grade (<3 per cent and 3–5 per cent variations, respectively). For a further 6/30 lenses, the models are close to cosmography grade (5–10 per cent). These results rely on informative priors and will need to be confirmed by further analysis; however, they are also likely to improve as the pipeline's modelling sequence and options are extended. In conclusion, we show that uniform cosmography-grade modelling of large strong-lens samples is within reach.
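For context, the Fermat potential mentioned above enters the standard time-delay relation of strong-lensing cosmography (standard notation, not taken from this abstract):

```latex
% Time delay between images i and j of a source at angular position \beta:
\Delta t_{ij} \;=\; \frac{D_{\Delta t}}{c}\,
    \left[\phi(\theta_i,\beta) - \phi(\theta_j,\beta)\right],
\qquad
\phi(\theta,\beta) \;=\; \frac{(\theta-\beta)^2}{2} \;-\; \psi(\theta),
% where \psi(\theta) is the lensing potential and the time-delay distance
% D_{\Delta t} = (1+z_d)\,\frac{D_d D_s}{D_{ds}} \;\propto\; 1/H_0 .
```

Because the time-delay distance scales as 1/H0, the stability of Fermat-potential differences under modelling choices directly controls how well H0 can be inferred from measured delays.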

Award ID(s):
1906976 1907396
PAR ID:
10473661
Author(s) / Creator(s):
Publisher / Repository:
MNRAS
Journal Name:
Monthly Notices of the Royal Astronomical Society
Volume:
518
Issue:
1
ISSN:
0035-8711
Page Range / eLocation ID:
1260 to 1300
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    ABSTRACT In recent years, breakthroughs in methods and data have enabled gravitational time delays to emerge as a very powerful tool to measure the Hubble constant H0. However, published state-of-the-art analyses require of order 1 yr of expert investigator time and up to a million hours of computing time per system. Furthermore, as precision improves, it is crucial to identify and mitigate systematic uncertainties. With this time-delay lens modelling challenge, we aim to assess the level of precision and accuracy of the modelling techniques that are currently fast enough to handle of order 50 lenses, via the blind analysis of simulated data sets. The results in Rungs 1 and 2 show that methods using only the point-source positions tend to have lower precision (10–20 per cent) while remaining accurate. In Rung 2, the methods that exploit the full information of the imaging and kinematic data sets can recover H0 within the target accuracy (|A| < 2 per cent) and precision (<6 per cent per system), even in the presence of a poorly known point spread function and complex source morphology. A post-unblinding analysis of Rung 3 showed the numerical precision of the ray-traced cosmological simulations to be insufficient to test lens modelling methodology at the per cent level, making the results difficult to interpret. A new challenge with improved simulations is needed to make further progress in the investigation of systematic uncertainties. For completeness, we present the Rung 3 results in an appendix and use them to discuss various approaches to mitigating similar subtle data-generation effects in future blind challenges.
  2. When strong gravitational lenses are to be used as an astrophysical or cosmological probe, models of their mass distributions are often needed. We present a new, time-efficient automation code for the uniform modeling of strongly lensed quasars with GLEE, a lens-modeling software for multiband data. By using the observed positions of the lensed quasars and the spatially extended surface brightness distribution of the host galaxy of the lensed quasar, we obtain a model of the mass distribution of the lens galaxy. We applied this uniform modeling pipeline to a sample of nine strongly lensed quasars for which images were obtained with the Wide Field Camera 3 of the Hubble Space Telescope. The models show well-reconstructed light components and a good alignment between mass and light centroids in most cases. We find that the automated modeling code significantly reduces the user's input time during the modeling process. The time for preparing the required input files is reduced by a factor of 3, from ~3 h to about one hour. The active input time during the modeling process is reduced by a factor of 10, from ~10 h to about one hour per lens system. This automated uniform modeling pipeline can efficiently produce uniform models of extensive lens-system samples for further cosmological analysis. A blind test that compared our results with those of an independent automated modeling pipeline based on the modeling software Lenstronomy revealed important lessons. Quantities such as the Einstein radius, astrometry, mass flattening, and position angle are generally robustly determined. Other quantities, such as the radial slope of the mass density profile and the predicted time delays, depend crucially on the quality of the data and on the accuracy with which the point spread function is reconstructed. Better data and/or a more detailed analysis are necessary to elevate our automated models to cosmography grade.
Nevertheless, our pipeline enables the quick selection of lenses for follow-up and further modeling, which significantly speeds up the construction of cosmography-grade models. This important step forward will help us take advantage of the increase in the number of known lenses, by several orders of magnitude, that is expected in the coming decade.

  3. ABSTRACT

    The time delay between appearances of multiple images of a gravitationally lensed supernova (glSN) is sensitive to the Hubble constant, H0. In addition to time delays, a lensed host galaxy is needed to enable precise inference of H0. In this work, we investigate the connection between discoverable lensed transients and their host galaxies. We find that the Legacy Survey of Space and Time (LSST) will discover at least 90 glSNe per year, of which 54 per cent will also have a strongly lensed host. The rates are uncertain by approximately 30 per cent, depending primarily on the choice of the unlensed SN population and on uncertainties in the redshift evolution of the deflector population, but the fraction of glSNe with a lensed host is consistently around a half. LSST will discover around 20 glSNe per year in systems that could plausibly have been identified by Euclid as galaxy–galaxy lenses before the discovery of the glSN. Such systems have preferentially longer time delays and are therefore well suited for cosmography. We define a golden sample of glSNe Ia with time delays over 10 d, image separations greater than 0.8 arcsec, and a multiply imaged host. For this golden sample, we find that 91 per cent occur in systems that should already be discoverable as galaxy–galaxy lenses in Euclid. For cosmology with glSNe, monitoring Euclid lenses is a plausible alternative to searching the entire LSST alert stream.
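The golden-sample cuts quoted above are simple enough to express as a selection filter. A minimal sketch, with hypothetical dictionary keys for a candidate record (the field names are illustrative, not from any real catalogue):

```python
def is_golden_sample(candidate):
    """Apply the golden-sample cuts from the abstract above: time delay
    over 10 d, image separation greater than 0.8 arcsec, and a multiply
    imaged host galaxy. Keys are hypothetical placeholders."""
    return (candidate["max_time_delay_days"] > 10.0
            and candidate["image_separation_arcsec"] > 0.8
            and candidate["host_multiply_imaged"])

# Example: a long-delay, wide-separation glSN Ia with a lensed host passes.
print(is_golden_sample({"max_time_delay_days": 25.0,
                        "image_separation_arcsec": 1.1,
                        "host_multiply_imaged": True}))  # → True
```

All three cuts must pass together; failing any one (for example, a short 5-day delay) excludes the candidate from the golden sample.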

  4. Time delay cosmography uses the arrival time delays between images in strong gravitational lenses to measure cosmological parameters, in particular the Hubble constant H0. The lens models used in time delay cosmography omit dark matter subhalos and line-of-sight halos because their effects are assumed to be negligible. We explicitly quantify this assumption by analyzing mock lens systems that include full populations of dark matter subhalos and line-of-sight halos, applying the same modeling assumptions used in the literature to infer H0. We base the mock lenses on six quadruply imaged quasars that have delivered measurements of the Hubble constant, and quantify the additional uncertainties and/or bias on a lens-by-lens basis. We show that omitting dark substructure does not bias inferences of H0. However, perturbations from substructure contribute an additional source of random uncertainty in the inferred value of H0 that scales as the square root of the lensing volume divided by the longest time delay. This additional source of uncertainty, for which we provide a fitting function, ranges from 0.7–2.4%. It may need to be incorporated in the error budget as the precision of cosmographic inferences from single lenses improves, and it sets a precision limit on inferences from single lenses.
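The scaling stated above (extra random H0 uncertainty proportional to the square root of the lensing volume divided by the longest time delay) can be sketched numerically. The prefactor `k` below is a purely illustrative placeholder, not the fitting function the paper provides:

```python
import math

def substructure_h0_scatter(lensing_volume, longest_delay_days, k=1.0):
    """Illustrative scaling of the extra random H0 uncertainty from dark
    substructure: grows as sqrt(lensing volume), shrinks as 1 over the
    longest time delay. The normalization k is a hypothetical placeholder."""
    return k * math.sqrt(lensing_volume) / longest_delay_days

# Doubling the longest time delay halves the substructure-induced scatter,
# while quadrupling the lensing volume only doubles it.
base = substructure_h0_scatter(4.0, 20.0)
print(substructure_h0_scatter(4.0, 40.0) / base)   # → 0.5
print(substructure_h0_scatter(16.0, 20.0) / base)  # → 2.0
```

This is why lenses with long time delays are the most robust against substructure perturbations: the same population of halos produces a smaller fractional effect on the inferred H0.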
  5. Time delay cosmography uses the arrival time delays between images in strong gravitational lenses to measure cosmological parameters, in particular the Hubble constant H0. The lens models used in time delay cosmography omit dark matter subhalos and line-of-sight halos because their effects are assumed to be negligible. We explicitly quantify this assumption by analyzing realistic mock lens systems that include full populations of dark matter subhalos and line-of-sight halos, applying the same modeling assumptions used in the literature to infer H0. We base the mock lenses on six quadruply imaged quasars that have delivered measurements of the Hubble constant, and quantify the additional uncertainties and/or bias on a lens-by-lens basis. We show that omitting dark substructure does not bias inferences of H0. However, perturbations from substructure contribute an additional source of random uncertainty in the inferred value of H0 that scales as the square root of the lensing volume divided by the longest time delay. This additional source of uncertainty, for which we provide a fitting function, ranges from 0.6–2.4%. It may need to be incorporated in the error budget as the precision of cosmographic inferences from single lenses improves, and it sets a precision limit on inferences from single lenses.