

Title: Optimizing machine learning methods to discover strong gravitational lenses in the Deep Lens Survey
ABSTRACT

Machine learning models can greatly improve the search for strong gravitational lenses in imaging surveys by reducing the amount of human inspection required. In this work, we test the performance of supervised, semi-supervised, and unsupervised learning algorithms trained with the ResNetV2 neural network architecture on their ability to efficiently find strong gravitational lenses in the Deep Lens Survey (DLS). We use galaxy images from the survey, combined with simulated lensed sources, as labeled data in our training data sets. We find that models using semi-supervised learning along with data augmentations (transformations applied to an image during training, e.g. rotation) and Generative Adversarial Network (GAN) generated images yield the best performance. They offer 5–10 times better precision across all recall values compared to supervised algorithms. Applying the best-performing models to the full 20 deg² DLS survey, we find 3 Grade-A lens candidates within the top 17 image predictions from the model. This increases to 9 Grade-A and 13 Grade-B candidates when 1 per cent (∼2500 images) of the model predictions are visually inspected. This is ≳10× the sky density of lens candidates compared to current shallower wide-area surveys (such as the Dark Energy Survey), indicating a trove of lenses awaiting discovery in upcoming deeper all-sky surveys. These results suggest that pipelines tasked with finding strong lens systems can be highly efficient, minimizing human effort. We additionally report spectroscopic confirmation of the lensing nature of two Grade-A candidates identified by our model, further validating our methods.
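The data-augmentation step mentioned in the abstract can be illustrated with a minimal sketch: random rotations and flips applied on the fly to a candidate cutout during training. The snippet below uses torchvision for concreteness; the file name, augmentation choices, and parameters are placeholders, not the authors' actual pipeline.

```python
# Illustrative only: on-the-fly augmentations (rotations/flips) of a lens cutout.
# The dataset path and the specific transforms are placeholders, not from the paper.
import torch
from torchvision import transforms
from PIL import Image

augment = transforms.Compose([
    transforms.RandomRotation(degrees=180),   # lenses have no preferred orientation
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.RandomVerticalFlip(p=0.5),
    transforms.ToTensor(),
])

cutout = Image.open("candidate_cutout.png")   # hypothetical survey cutout
batch = torch.stack([augment(cutout) for _ in range(8)])  # 8 augmented views
print(batch.shape)
```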

Award ID(s): 2108515
NSF-PAR ID: 10439790
Publisher / Repository: Oxford University Press
Journal Name: Monthly Notices of the Royal Astronomical Society
Volume: 524
Issue: 4
ISSN: 0035-8711
Page Range / eLocation ID: p. 5368-5390
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

    We perform a search for galaxy–galaxy strong lens systems using a convolutional neural network (CNN) applied to imaging data from the first public data release of the DECam Local Volume Exploration Survey, which contains ∼520 million astronomical sources covering ∼4000 deg² of the southern sky to a 5σ point-source depth of g = 24.3, r = 23.9, i = 23.3, and z = 22.8 mag. Following the methodology of similar searches using Dark Energy Camera data, we apply color and magnitude cuts to select a catalog of ∼11 million extended astronomical sources. After scoring with our CNN, the highest-scoring 50,000 images were visually inspected and assigned a score on a scale from 0 (not a lens) to 3 (very probable lens). We present a list of 581 strong lens candidates, 562 of which are previously unreported. We categorize our candidates using their human-assigned scores, resulting in 55 Grade A candidates, 149 Grade B candidates, and 377 Grade C candidates. We additionally highlight eight potential quadruply lensed quasars from this sample. Due to the location of our search footprint in the northern Galactic cap (b > 10 deg) and southern celestial hemisphere (decl. < 0 deg), our candidate list has little overlap with other existing ground-based searches. Where our search footprint does overlap with other searches, we find a significant number of high-quality candidates that were previously unidentified, indicating a degree of orthogonality in our methodology. We report properties of our candidates including apparent magnitude and Einstein radius estimated from the image separation.
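    The color and magnitude preselection described above amounts to simple catalog cuts. The sketch below shows the general pattern with a hypothetical catalog file, column names, and placeholder thresholds; the actual cut values used in the search are not reproduced here.

```python
# Illustrative catalog preselection; column names and thresholds are placeholders.
import pandas as pd

cat = pd.read_csv("delve_dr1_sources.csv")        # hypothetical catalog export
gr = cat["mag_g"] - cat["mag_r"]
ri = cat["mag_r"] - cat["mag_i"]

sel = (
    (cat["extended_class"] >= 2)                  # keep extended sources only
    & (cat["mag_g"] < 24.3)                       # brighter than the 5-sigma depth
    & gr.between(0.0, 3.0)                        # placeholder color window
    & ri.between(0.0, 2.0)
)
candidates = cat[sel]
print(f"{len(candidates)} sources pass the cuts")
```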

  2. ABSTRACT

    Gravitational time delays provide a powerful one-step measurement of H0, independent of all other probes. One key ingredient in time-delay cosmography is high-accuracy lens models. These are currently expensive to obtain, both in terms of computing and investigator time (10⁵–10⁶ CPU hours and ∼0.5–1 yr, respectively). Major improvements in modelling speed are therefore necessary to exploit the large number of lenses that are forecast to be discovered over the current decade. In order to bypass this roadblock, we develop an automated modelling pipeline and apply it to a sample of 31 lens systems, observed by the Hubble Space Telescope in multiple bands. Our automated pipeline can derive models for 30/31 lenses with a few hours of human time and <100 CPU hours of computing time for a typical system. For each lens, we provide measurements of key parameters and predictions of magnification as well as time delays for the multiple images. We characterize the cosmography-readiness of our models using the stability of differences in the Fermat potential (proportional to time delay) with respect to modelling choices. We find that for 10/30 lenses, our models are cosmography grade or nearly cosmography grade (<3 per cent and 3–5 per cent variations, respectively). For 6/30 lenses, the models are close to cosmography grade (5–10 per cent). These results utilize informative priors and will need to be confirmed by further analysis. However, they are also likely to improve as the pipeline's modelling sequence and options are extended. In conclusion, we show that uniform cosmography-grade modelling of large strong-lens samples is within reach.
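    For reference, a Fermat-potential difference Δφ converts to a time delay via Δt = (1 + z_d) D_d D_s / (c D_ds) × Δφ, which is why its stability is the relevant cosmography metric. The sketch below evaluates this standard relation with astropy for placeholder redshifts, cosmology, and Δφ; none of the numbers are taken from the paper.

```python
# Illustrative conversion of a Fermat potential difference (arcsec^2) to a time
# delay, Delta_t = (1 + z_d) * D_d * D_s / (c * D_ds) * Delta_phi.
# Redshifts, cosmology, and Delta_phi are placeholders, not values from the paper.
import astropy.units as u
from astropy import constants as const
from astropy.cosmology import FlatLambdaCDM

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)
z_d, z_s = 0.5, 2.0                               # lens and source redshifts (placeholders)

D_d = cosmo.angular_diameter_distance(z_d)
D_s = cosmo.angular_diameter_distance(z_s)
D_ds = cosmo.angular_diameter_distance_z1z2(z_d, z_s)
D_dt = (1 + z_d) * D_d * D_s / D_ds               # time-delay distance

dphi = 0.1 * u.arcsec**2                          # Fermat potential difference (placeholder)
dt = (D_dt / const.c * dphi.to(u.rad**2).value).to(u.day)
print(f"predicted time delay: {dt:.1f}")
```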

  3. ABSTRACT

    Despite the success of galaxy-scale strong gravitational lens studies with Hubble-quality imaging, the number of well-studied strong lenses remains small. As a result, robust comparisons of lens models to theoretical predictions are difficult. This motivates our application of automated Bayesian lens-modelling methods to observations from public data releases of overlapping large ground-based imaging and spectroscopic surveys: the Kilo-Degree Survey (KiDS) and Galaxy and Mass Assembly (GAMA), respectively. We use the open-source lens-modelling software pyautolens to perform our analysis. We demonstrate the feasibility of strong lens modelling with large-survey data at lower resolution as a complementary avenue to studies that utilize more time-consuming and expensive observations of individual lenses at higher resolution. We discuss advantages and challenges, with special consideration given to determining background source redshifts from single-aperture spectra and to disentangling foreground lens and background source light. The large uncertainties in the best-fitting model parameters, driven by the limited optical resolution of ground-based observatories and by the small sample size, can be reduced in future studies. We give broadly applicable recommendations for future efforts; with proper application, this approach could yield measurements of the quantities needed for robust statistical inference.
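    One way the spectroscopy enters such modelling is through the lens velocity dispersion and the source redshift, which together set the Einstein radius of a singular isothermal sphere, θ_E = 4π (σ/c)² D_ds/D_s. The sketch below evaluates this textbook relation as a sanity check for placeholder values; it is not part of the pyautolens analysis described above.

```python
# Standard SIS Einstein-radius relation, theta_E = 4*pi*(sigma/c)^2 * D_ds/D_s.
# Velocity dispersion, redshifts, and cosmology are placeholders.
import numpy as np
import astropy.units as u
from astropy import constants as const
from astropy.cosmology import FlatLambdaCDM

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)
z_lens, z_src = 0.3, 0.8                          # placeholder redshifts
sigma = 250 * u.km / u.s                          # placeholder velocity dispersion

D_s = cosmo.angular_diameter_distance(z_src)
D_ds = cosmo.angular_diameter_distance_z1z2(z_lens, z_src)

theta_e = (4 * np.pi * (sigma / const.c) ** 2 * D_ds / D_s) * u.rad
print(f"SIS Einstein radius: {theta_e.to(u.arcsec):.2f}")
```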

  4. When strong gravitational lenses are to be used as an astrophysical or cosmological probe, models of their mass distributions are often needed. We present a new, time-efficient automation code for the uniform modeling of strongly lensed quasars with GLEE, a lens-modeling software for multiband data. By using the observed positions of the lensed quasars and the spatially extended surface brightness distribution of the host galaxy of the lensed quasar, we obtain a model of the mass distribution of the lens galaxy. We applied this uniform modeling pipeline to a sample of nine strongly lensed quasars for which images were obtained with the Wide Field Camera 3 of the Hubble Space Telescope. The models show well-reconstructed light components and a good alignment between mass and light centroids in most cases. We find that the automated modeling code significantly reduces the input time during the modeling process for the user. The time for preparing the required input files is reduced by a factor of 3, from ~3 h to about one hour. The active input time during the modeling process for the user is reduced by a factor of 10, from ~10 h to about one hour per lens system. This automated uniform modeling pipeline can efficiently produce uniform models of extensive lens-system samples that can be used for further cosmological analysis. A blind test that compared our results with those of an independent automated modeling pipeline based on the modeling software Lenstronomy revealed important lessons. Quantities such as Einstein radius, astrometry, mass flattening, and position angle are generally robustly determined. Other quantities, such as the radial slope of the mass density profile and predicted time delays, depend crucially on the quality of the data and on the accuracy with which the point spread function is reconstructed. Better data and/or a more detailed analysis are necessary to elevate our automated models to cosmography grade. Nevertheless, our pipeline enables the quick selection of lenses for follow-up and further modeling, which significantly speeds up the construction of cosmography-grade models. This important step forward will help us to take advantage of the increase in the number of lenses that is expected in the coming decade, which is an increase of several orders of magnitude.
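    When comparing the "mass flattening and position angle" between two modelling codes, one practical detail is that these are often parametrized either as an axis ratio q and angle φ or as ellipticity components (e1, e2). The sketch below shows one common convention for converting between the two, using e = (1 − q)/(1 + q); it is generic notation, not output from GLEE or Lenstronomy, and the values are placeholders.

```python
# One common convention for converting (axis ratio q, position angle phi) to
# ellipticity components (e1, e2) and back, with e = (1 - q) / (1 + q).
import numpy as np

def q_phi_to_e1e2(q, phi_rad):
    e = (1.0 - q) / (1.0 + q)
    return e * np.cos(2 * phi_rad), e * np.sin(2 * phi_rad)

def e1e2_to_q_phi(e1, e2):
    e = np.hypot(e1, e2)
    q = (1.0 - e) / (1.0 + e)
    phi_rad = 0.5 * np.arctan2(e2, e1)
    return q, phi_rad

e1, e2 = q_phi_to_e1e2(q=0.8, phi_rad=np.deg2rad(45.0))
print(e1e2_to_q_phi(e1, e2))   # recovers q = 0.8, phi = pi/4
```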

  5. Low surface brightness galaxies (LSBGs), galaxies that are fainter than the dark night sky, are famously difficult to detect. Nonetheless, studies of these galaxies are essential to improve our understanding of the formation and evolution of low-mass galaxies. In this work, we train a deep learning model using the Mask R-CNN framework on a set of simulated LSBGs inserted into images from the Dark Energy Survey (DES) Data Release 2 (DR2). This deep learning model is combined with several conventional image pre-processing steps to develop a pipeline for the detection of LSBGs. We apply this pipeline to the full DES DR2 coadd image dataset, and preliminary results show the detection of 22 large, high-quality LSBG candidates that went undetected by conventional algorithms. Furthermore, we find that the performance of our algorithm is greatly improved by including examples of false positives as an additional class during training. 
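    The simulated-LSBG injection used to build the training set can be sketched as adding a faint, extended Sérsic profile to a real coadd cutout. The example below uses astropy's Sersic2D with placeholder parameters and a hypothetical cutout file; it is a schematic of the approach, not the authors' injection code.

```python
# Schematic injection of a simulated low-surface-brightness galaxy (a faint
# Sersic profile) into a real image cutout; all parameters are placeholders.
import numpy as np
from astropy.io import fits
from astropy.modeling.models import Sersic2D

cutout = fits.getdata("des_dr2_cutout.fits")      # hypothetical DES coadd cutout
ny, nx = cutout.shape
y, x = np.mgrid[0:ny, 0:nx]

lsbg = Sersic2D(amplitude=0.05, r_eff=25.0, n=1.0,  # faint, extended, disc-like
                x_0=nx / 2, y_0=ny / 2, ellip=0.3, theta=0.5)
injected = cutout + lsbg(x, y)                    # training image with a known LSBG

fits.writeto("injected_cutout.fits", injected, overwrite=True)
```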