Search results for: Creators/Authors contains "Despali, G"

  1. Abstract: We present an analysis of seven strongly gravitationally lensed quasars and the corresponding constraints on the properties of dark matter. Our results are derived by modelling the lensed image positions and flux ratios using a combination of smooth macro models and a population of low-mass haloes within the mass range $10^6$ to $10^9\,{\rm M_\odot}$. Our lens models explicitly include higher-order complexity in the form of stellar discs and luminous satellites, as well as low-mass haloes located along the observed lines of sight, for the first time. Assuming a Cold Dark Matter (CDM) cosmology, we infer an average total mass fraction in substructure of $f_{\rm sub} = 0.012^{+0.007}_{-0.004}$ (68 per cent confidence limits), which is in agreement with the predictions from CDM hydrodynamical simulations to within 1σ. This result is closer to the predictions than those from previous studies that did not include line-of-sight haloes. Under the assumption of a thermal relic dark matter model, we derive a lower limit on the particle relic mass of $m_{\rm th} > 5.58$ keV (95 per cent confidence limits), which is consistent with a value of $m_{\rm th} > 5.3$ keV from the recent analysis of the Lyα forest. We also identify two main sources of possible systematic error and conclude that deeper investigations into the complex structure of lens galaxies, as well as into the size of the background sources, should be a priority for this field.
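The average substructure fraction quoted above, $f_{\rm sub} \approx 0.012$, is the total mass in low-mass haloes between $10^6$ and $10^9\,{\rm M_\odot}$ relative to the lens mass. As a rough illustration only, the sketch below draws a subhalo population from a power-law mass function ${\rm d}N/{\rm d}m \propto m^{-\alpha}$ with $\alpha = 1.9$ (a commonly assumed CDM slope, not taken from the abstract) until the summed mass reaches a target fraction; the host mass and the random draws are purely hypothetical.

```python
import numpy as np

def draw_subhalo_masses(f_sub, m_host, m_min=1e6, m_max=1e9, alpha=1.9, rng=None):
    """Draw subhalo masses (in M_sun) from dN/dm ∝ m^-alpha on [m_min, m_max]
    until their summed mass reaches f_sub * m_host (it slightly overshoots)."""
    rng = np.random.default_rng() if rng is None else rng
    target, total, masses = f_sub * m_host, 0.0, []
    a = 1.0 - alpha  # exponent of the cumulative distribution (alpha != 1 assumed)
    while total < target:
        u = rng.uniform()
        # Inverse-transform sample of the truncated power law.
        m = (m_min**a + u * (m_max**a - m_min**a)) ** (1.0 / a)
        masses.append(m)
        total += m
    return np.array(masses)

# Hypothetical example: f_sub = 0.012 within a 1e11 M_sun lensing region.
subs = draw_subhalo_masses(0.012, 1e11, rng=np.random.default_rng(42))
print(f"{subs.size} subhaloes, realised f_sub = {subs.sum() / 1e11:.4f}")
```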
  2. Abstract: In recent years, breakthroughs in methods and data have enabled gravitational time delays to emerge as a very powerful tool to measure the Hubble constant $H_0$. However, published state-of-the-art analyses require of order 1 yr of expert investigator time and up to a million hours of computing time per system. Furthermore, as precision improves, it is crucial to identify and mitigate systematic uncertainties. With this time delay lens modelling challenge, we aim to assess the level of precision and accuracy of the modelling techniques that are currently fast enough to handle of order 50 lenses, via the blind analysis of simulated data sets. The results in Rungs 1 and 2 show that methods that use only the point source positions tend to have lower precision (10–20 per cent) while remaining accurate. In Rung 2, the methods that exploit the full information of the imaging and kinematic data sets can recover $H_0$ within the target accuracy (|A| < 2 per cent) and precision (<6 per cent per system), even in the presence of a poorly known point spread function and complex source morphology. A post-unblinding analysis of Rung 3 showed the numerical precision of the ray-traced cosmological simulations to be insufficient to test lens modelling methodology at the per cent level, making the results difficult to interpret. A new challenge with improved simulations is needed to make further progress in the investigation of systematic uncertainties. For completeness, we present the Rung 3 results in an appendix and use them to discuss various approaches to mitigating similar subtle data-generation effects in future blind challenges.
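The quoted targets (accuracy |A| < 2 per cent and precision < 6 per cent per system) are fractional quantities relative to the true $H_0$. The sketch below assumes the conventional fractional definitions, the signed bias averaged over systems for the accuracy and the mean reported fractional uncertainty for the per-system precision; the exact challenge metrics are defined in the TDLMC papers, and all numbers here are invented for illustration.

```python
import numpy as np

def h0_accuracy_precision(h0_est, h0_err, h0_true):
    """Fractional accuracy (mean signed bias) and precision (mean reported
    uncertainty), both relative to the true H0, over a set of lens systems."""
    h0_est = np.asarray(h0_est, dtype=float)
    h0_err = np.asarray(h0_err, dtype=float)
    accuracy = np.mean((h0_est - h0_true) / h0_true)   # signed bias A
    precision = np.mean(h0_err / h0_true)              # average per-system precision
    return accuracy, precision

# Invented submission: five systems, true H0 = 70 km/s/Mpc.
est = [71.0, 69.2, 70.8, 68.9, 70.5]
err = [2.8, 3.1, 2.5, 3.4, 2.9]
A, P = h0_accuracy_precision(est, err, h0_true=70.0)
print(f"|A| = {abs(A):.1%} (target < 2%), precision = {P:.1%} (target < 6%)")
```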