
Award ID contains: 1906976


  1. Free, publicly-accessible full text available June 1, 2023
  2. ABSTRACT Strongly lensed quadruply imaged quasars (quads) are extraordinary objects. They are very rare in the sky and yet they provide unique information about a wide range of topics, including the expansion history and the composition of the Universe, the distribution of stars and dark matter in galaxies, the host galaxies of quasars, and the stellar initial mass function. Finding them in astronomical images is a classic ‘needle in a haystack’ problem, as they are outnumbered by other (contaminant) sources by many orders of magnitude. To solve this problem, we develop state-of-the-art deep learning methods and train them on realistic simulated quads based on real images of galaxies taken from the Dark Energy Survey, with realistic source and deflector models, including the chromatic effects of microlensing. The performance of the best methods on a mixture of simulated and real objects is excellent, yielding area under the receiver operating characteristic curve in the range of 0.86–0.89. Recall is close to 100 per cent down to total magnitude i ∼ 21, indicating high completeness, while precision declines from 85 per cent to 70 per cent in the range i ∼ 17–21. The methods are extremely fast: training on 2 million samples takes 20 h on a GPU machine, and 10^8 multiband cut-outs can be evaluated per GPU-hour. The speed and performance of the method pave the way to apply it to large samples of astronomical sources, bypassing the need for photometric pre-selection that is likely to be a major cause of incompleteness in current samples of known quads.
    Free, publicly-accessible full text available May 5, 2023
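The classifier metrics quoted in the abstract above (ROC area, precision, recall) can be illustrated with a minimal sketch. The labels and scores below are invented toy values, not Dark Energy Survey data, and this is a generic metric computation, not the authors' deep-learning pipeline.

```python
# Toy illustration of ROC AUC, precision, and recall for a binary
# classifier. Labels/scores are made-up values, not survey data.

def precision_recall(labels, scores, threshold):
    """Precision and recall when classifying score >= threshold as positive."""
    tp = sum(1 for y, s in zip(labels, scores) if s >= threshold and y == 1)
    fp = sum(1 for y, s in zip(labels, scores) if s >= threshold and y == 0)
    fn = sum(1 for y, s in zip(labels, scores) if s < threshold and y == 1)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    return prec, rec

def roc_auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0, 1, 0]
scores = [0.9, 0.8, 0.4, 0.3, 0.2, 0.6, 0.7, 0.1]
auc = roc_auc(labels, scores)
prec, rec = precision_recall(labels, scores, threshold=0.5)
```

The rank-sum form of the AUC avoids explicitly tracing the ROC curve: it is the probability that a random positive outscores a random negative.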
  3. ABSTRACT Strongly lensed quasars can provide measurements of the Hubble constant (H0) independent of any other methods. One of the key ingredients is exquisite high-resolution imaging data, such as Hubble Space Telescope (HST) imaging and adaptive-optics (AO) imaging from ground-based telescopes, which provide strong constraints on the mass distribution of the lensing galaxy. In this work, we expand on the previous analysis of three time-delay lenses with AO imaging (RX J1131−1231, HE 0435−1223, and PG 1115+080), and perform a joint analysis of J0924+0219 by using AO imaging from the Keck telescope, obtained as part of the Strong lensing at High Angular Resolution Program (SHARP) AO effort, with HST imaging to constrain the mass distribution of the lensing galaxy. Under the assumption of a flat Λ cold dark matter (ΛCDM) model with fixed Ωm = 0.3, we show that by marginalizing over two different kinds of mass models (power-law and composite models) and their transformed mass profiles via a mass-sheet transformation, we obtain $\Delta t_{\rm BA}=6.89\substack{+0.8\\-0.7}\, h^{-1}\hat{\sigma }_{v}^{2}$ d, $\Delta t_{\rm CA}=10.7\substack{+1.6\\-1.2}\, h^{-1}\hat{\sigma }_{v}^{2}$ d, and $\Delta t_{\rm DA}=7.70\substack{+1.0\\-0.9}\, h^{-1}\hat{\sigma }_{v}^{2}$ d, where $h=H_{0}/100\,\rm km\, s^{-1}\, Mpc^{-1}$ is the dimensionless Hubble constant and $\hat{\sigma }_{v}=\sigma ^{\rm ob}_{v}/(280\,\rm km\, s^{-1})$ is the scaled dimensionless velocity dispersion. Future measurements of time delays with 10 per cent uncertainty and velocity dispersion with 5 per cent uncertainty would yield an H0 constraint of ∼15 per cent precision.
    Free, publicly-accessible full text available May 5, 2023
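The scaled delay predictions above can be inverted for H0 once a delay is actually measured: with Δt_model = C h⁻¹ σ̂_v² days, one gets h = C σ̂_v² / Δt_obs. A minimal worked sketch, with an invented observed delay and velocity dispersion (the coefficient 6.89 d is the Δt_BA value quoted in the abstract):

```python
# Toy inversion of a scaled delay prediction into an H0 estimate.
# Delta t_model = coeff * h^-1 * sigma_hat_v^2 days  =>
# h = coeff * sigma_hat_v^2 / Delta t_obs,  H0 = 100 h km/s/Mpc.
# The observed delay and dispersion below are invented numbers.

def hubble_constant(coeff_days, dt_observed_days, sigma_v_obs_kms):
    """H0 in km/s/Mpc from a scaled time-delay prediction."""
    sigma_hat_v = sigma_v_obs_kms / 280.0   # scaled dimensionless dispersion
    h = coeff_days * sigma_hat_v ** 2 / dt_observed_days
    return 100.0 * h

H0 = hubble_constant(coeff_days=6.89,        # predicted Delta t_BA coefficient
                     dt_observed_days=10.0,  # hypothetical measured delay
                     sigma_v_obs_kms=280.0)  # hypothetical dispersion
```

Because the prediction scales as σ̂_v², the ∼15 per cent H0 forecast follows directly from 10 per cent delay and 5 per cent dispersion uncertainties added in quadrature (the dispersion entering twice).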
  4. Abstract Identifying multiply imaged quasars is challenging owing to their low density in the sky and the limited angular resolution of wide-field surveys. We show that multiply imaged quasars can be identified using unresolved light curves, without assuming a light-curve template or any prior information. After describing our method, we show, using simulations, that it can attain high precision and recall when we consider high-quality data with negligible noise well below the variability of the light curves. As the noise level increases to that of the Zwicky Transient Facility telescope, we find that precision can remain close to 100% while recall drops to ∼60%. We also consider some examples from Time Delay Challenge 1 and demonstrate that the time delays can be accurately recovered from the joint light-curve data in realistic observational scenarios. We further demonstrate our method by applying it to publicly available COSMOGRAIL data of the observed lensed quasar SDSS J1226−0006. We identify the system as a lensed quasar based on the unresolved light curve and estimate a time delay in good agreement with the one measured by COSMOGRAIL using the individual image light curves. The technique shows great potential to identify lensed quasars in wide-field imaging surveys, especially the soon-to-be-commissioned Vera Rubin Observatory.
    Free, publicly-accessible full text available March 1, 2023
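To make "recovering a time delay" concrete, here is a generic shift-and-compare estimator on toy, evenly sampled light curves. This is not the template-free unresolved-curve method of the abstract above (which works on the summed flux); it is only an illustration of the underlying idea, with invented curves.

```python
# Generic delay estimate: slide one light curve against the other and
# keep the integer-sample shift that minimizes the RMS difference.
# Toy curves only; NOT the authors' unresolved-light-curve method.
import math

def delay_estimate(curve_a, curve_b, max_shift):
    """Shift of curve_b (in samples) that best matches curve_a."""
    best_shift, best_rms = 0, math.inf
    for shift in range(-max_shift, max_shift + 1):
        pairs = [(curve_a[i], curve_b[i + shift])
                 for i in range(len(curve_a))
                 if 0 <= i + shift < len(curve_b)]
        rms = math.sqrt(sum((a - b) ** 2 for a, b in pairs) / len(pairs))
        if rms < best_rms:
            best_shift, best_rms = shift, rms
    return best_shift

# Build two images of the same invented source, B lagging A by 3 samples.
signal = [math.sin(0.3 * t) for t in range(60)]
curve_a = signal[3:53]   # image A
curve_b = signal[0:50]   # image B, delayed copy
shift = delay_estimate(curve_a, curve_b, max_shift=5)
```

Real measurements must additionally handle irregular sampling, microlensing, and noise, which is where dedicated methods such as those tested in the Time Delay Challenge come in.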
  5. ABSTRACT Astrometric precision and knowledge of the point spread function are key ingredients for a wide range of astrophysical studies including time-delay cosmography in which strongly lensed quasar systems are used to determine the Hubble constant and other cosmological parameters. Astrometric uncertainty on the positions of the multiply-imaged point sources contributes to the overall uncertainty in inferred distances and therefore the Hubble constant. Similarly, knowledge of the wings of the point spread function is necessary to disentangle light from the background sources and the foreground deflector. We analyse adaptive optics (AO) images of the strong lens system J 0659+1629 obtained with the W. M. Keck Observatory using the laser guide star AO system. We show that by using a reconstructed point spread function we can (i) obtain astrometric precision of <1 mas, which is more than sufficient for time-delay cosmography; and (ii) subtract all point-like images resulting in residuals consistent with the noise level. The method we have developed is not limited to strong lensing, and is generally applicable to a wide range of scientific cases that have multiple point sources nearby.
    Free, publicly-accessible full text available September 28, 2022
  6. ABSTRACT Strongly lensed explosive transients such as supernovae, gamma-ray bursts, fast radio bursts, and gravitational waves are very promising tools to determine the Hubble constant (H0) in the near future in addition to strongly lensed quasars. In this work, we show that the transient nature of the point source provides an advantage over quasars: The lensed host galaxy can be observed before or after the transient’s appearance. Therefore, the lens model can be derived from images free of contamination from bright point sources. We quantify this advantage by comparing the precision of a lens model obtained from the same lenses with and without point sources. Based on Hubble Space Telescope (HST) Wide Field Camera 3 (WFC3) observations with the same sets of lensing parameters, we simulate realistic mock data sets of 48 quasar lensing systems (i.e. adding AGN in the galaxy centre) and 48 galaxy–galaxy lensing systems (assuming the transient source is not visible but the time delay and image positions have been or will be measured). We then model the images and compare the inferences of the lens model parameters and H0. We find that the precision of the lens models (in terms of the deflector mass slope) is better by a factor of 4.1 for the sample without lensed point sources, resulting in an increase of H0 precision by a factor of 2.9. The opportunity to observe the lens systems without the transient point sources provides an additional advantage for time-delay cosmography over lensed quasars. It facilitates the determination of higher signal-to-noise stellar kinematics of the main deflector, and thus its mass density profile, which, in turn, plays a key role in breaking the mass-sheet degeneracy and constraining H0.
  7. Strong lensing time delays can measure the Hubble constant H0 independently of any other probe. Assuming commonly used forms for the radial mass density profile of the lenses, a 2% precision has been achieved with seven Time-Delay Cosmography (TDCOSMO) lenses, in tension with the H0 from the cosmic microwave background. However, without assumptions on the radial mass density profile – and relying exclusively on stellar kinematics to break the mass-sheet degeneracy – the precision drops to 8% with the current data obtained using the seven TDCOSMO lenses, which is insufficient to resolve the H0 tension. With the addition of external information from 33 Sloan Lens ACS (SLACS) lenses, the precision improves to 5% if the deflectors of TDCOSMO and SLACS lenses are drawn from the same population. We investigate the prospect of improving the precision of time-delay cosmography without relying on mass profile assumptions to break the mass-sheet degeneracy. Our forecasts are based on a previously published hierarchical framework. With existing samples and technology, 3.3% precision on H0 can be reached by adding spatially resolved kinematics of the seven TDCOSMO lenses. The precision improves to 2.5% with the further addition of kinematics for 50 nontime-delay lenses from SLACS and the Strong Lensing Legacy Survey. Expanding the samples to 40 time-delay and 200 nontime-delay lenses will improve the precision to 1.5% and 1.2%, respectively. Time-delay cosmography can reach sufficient precision to resolve the Hubble tension at 3–5σ, without assumptions on the radial mass profile of lens galaxies. By obtaining this precision with and without external datasets, we will test the consistency of the samples and enable further improvements based on even larger future samples of time-delay and nontime-delay lenses (e.g., from the Rubin, Euclid, and Roman Observatories).
  8. ABSTRACT We investigate the internal structure of elliptical galaxies at z ∼ 0.2 from a joint lensing–dynamics analysis. We model Hubble Space Telescope images of a sample of 23 galaxy–galaxy lenses selected from the Sloan Lens ACS (SLACS) survey. Whereas the original SLACS analysis estimated the logarithmic slopes by combining the kinematics with the imaging data, we estimate the logarithmic slopes only from the imaging data. We find that the distribution of the lensing-only logarithmic slopes has a median 2.08 ± 0.03 and intrinsic scatter 0.13 ± 0.02, consistent with the original SLACS analysis. We combine the lensing constraints with the stellar kinematics and weak lensing measurements, and constrain the amount of adiabatic contraction in the dark matter (DM) haloes. We find that the DM haloes are well described by a standard Navarro–Frenk–White halo with no contraction on average for both a constant stellar mass-to-light ratio (M/L) model and a stellar M/L gradient model. For the M/L gradient model, we find that most galaxies are consistent with no M/L gradient. Comparison of our inferred stellar masses with those obtained from the stellar population synthesis method supports a heavy initial mass function (IMF) such as the Salpeter IMF. We discuss our results in the context of previous observations and simulations, and argue that our result is consistent with a scenario in which active galactic nucleus feedback counteracts the baryonic-cooling-driven contraction in the DM haloes.
  9. ABSTRACT We report upon 3 years of follow-up and confirmation of doubly imaged quasar lenses through imaging campaigns from 2016 to 2018 with the Near-Infrared Camera 2 (NIRC2) on the W. M. Keck Observatory. A sample of 57 quasar lens candidates is imaged in adaptive-optics-assisted or seeing-limited K′-band observations. Out of these 57 candidates, 15 are confirmed as lenses. We form a sample of 20 lenses adding in a number of previously known lenses that were imaged with NIRC2 in 2013–14 as part of a pilot study. By modelling these 20 lenses, we obtain K′-band relative photometry and astrometry of the quasar images and the lens galaxy. We also provide the lens properties and predicted time delays to aid planning of follow-up observations necessary for various astrophysical applications, e.g. spectroscopic follow-up to obtain the deflector redshifts for the newly confirmed systems. We compare the departure of the observed flux ratios from the smooth-model predictions between doubly and quadruply imaged quasar systems. We find that the departure is consistent between these two types of lenses if the modelling uncertainty is comparable.
  10. ABSTRACT In recent years, breakthroughs in methods and data have enabled gravitational time delays to emerge as a very powerful tool to measure the Hubble constant H0. However, published state-of-the-art analyses require of order 1 yr of expert investigator time and up to a million hours of computing time per system. Furthermore, as precision improves, it is crucial to identify and mitigate systematic uncertainties. With this time delay lens modelling challenge, we aim to assess the level of precision and accuracy of the modelling techniques that are currently fast enough to handle of order 50 lenses, via the blind analysis of simulated data sets. The results in Rungs 1 and 2 show that methods that use only the point source positions tend to have lower precision (10–20 per cent) while remaining accurate. In Rung 2, the methods that exploit the full information of the imaging and kinematic data sets can recover H0 within the target accuracy (|A| < 2 per cent) and precision (<6 per cent per system), even in the presence of a poorly known point spread function and complex source morphology. A post-unblinding analysis of Rung 3 showed the numerical precision of the ray-traced cosmological simulations to be insufficient to test lens modelling methodology at the percent level, making the results difficult to interpret. A new challenge with improved simulations is needed to make further progress in the investigation of systematic uncertainties. For completeness, we present the Rung 3 results in an appendix and use them to discuss various approaches to mitigating against similar subtle data generation effects in future blind challenges.
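Accuracy and precision targets of the kind quoted above can be sketched as simple fractional-error bookkeeping over a set of blind H0 submissions. The definitions below are simplified stand-ins (mean fractional bias and mean fractional reported uncertainty), not the challenge's exact metric definitions, and all numbers are invented.

```python
# Simplified stand-ins for blind-challenge metrics: fractional bias
# ("accuracy" A) and mean per-system fractional uncertainty ("precision")
# of H0 estimates against a known truth. Invented toy values throughout.

def accuracy_and_precision(h0_estimates, h0_errors, h0_true):
    """Mean fractional bias and mean fractional reported error."""
    n = len(h0_estimates)
    bias = sum((h - h0_true) / h0_true for h in h0_estimates) / n
    precision = sum(e / h0_true for e in h0_errors) / n
    return bias, precision

estimates = [72.1, 73.4, 71.0, 74.2]   # km/s/Mpc, invented submissions
errors = [2.9, 3.6, 2.2, 4.0]          # 1-sigma uncertainties, invented
bias, precision = accuracy_and_precision(estimates, errors, h0_true=72.5)
```

A submission would pass the targets quoted above if |bias| stays below 2 per cent while each system's fractional uncertainty stays below 6 per cent.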