

Title: TD-CARMA: Painless, Accurate, and Scalable Estimates of Gravitational Lens Time Delays with Flexible CARMA Processes
Abstract

Cosmological parameters encoding our understanding of the expansion history of the universe can be constrained by the accurate estimation of time delays arising in gravitationally lensed systems. We propose TD-CARMA, a Bayesian method to estimate cosmological time delays by modeling observed and irregularly sampled light curves as realizations of a continuous auto-regressive moving average (CARMA) process. Our model accounts for heteroskedastic measurement errors and microlensing, an additional source of independent extrinsic long-term variability in the source brightness. The semiseparable structure of the CARMA covariance matrix allows for fast and scalable likelihood computation using Gaussian process modeling. We obtain a sample from the joint posterior distribution of the model parameters using a nested sampling approach. This allows for “painless” Bayesian computation, dealing with the expected multimodality of the posterior distribution in a straightforward manner and not requiring the specification of starting values or an initial guess for the time delay, unlike existing methods. In addition, the proposed sampling procedure automatically evaluates the Bayesian evidence, allowing us to perform principled Bayesian model selection. TD-CARMA is parsimonious, and typically includes no more than a dozen unknown parameters. We apply TD-CARMA to the six doubly lensed quasars HS2209+1914, SDSS J1001+5027, SDSS J1206+4332, SDSS J1515+1511, SDSS J1455+1447, and SDSS J1349+1227, estimating their time delays as −21.96 ± 1.448, 120.93 ± 1.015, 111.51 ± 1.452, 210.80 ± 2.18, 45.36 ± 1.93, and 432.05 ± 1.950 days, respectively. These estimates are consistent with those derived in the relevant literature, but are typically two to four times more precise.
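As a hedged illustration of the modeling idea described above (not the authors' TD-CARMA implementation), the sketch below fits the simplest CARMA process, a CARMA(1,0) or damped random walk, to a pair of toy light curves and profiles the Gaussian process likelihood over trial time delays. All function names, kernel settings, and simulated data are assumptions made for the example; the paper's method additionally uses higher-order CARMA models, microlensing terms, fast semiseparable solvers, and nested sampling rather than a grid scan.

```python
# Hedged sketch, not the authors' TD-CARMA code: fit the simplest CARMA process,
# a CARMA(1,0) / damped random walk, to two toy light curves and profile the
# Gaussian-process likelihood over trial time delays. Kernel settings, the toy
# "intrinsic" signal, and all names here are assumptions made for illustration.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)

def drw_cov(t, sigma, tau):
    """Exponential (Ornstein-Uhlenbeck) covariance of a CARMA(1,0) process."""
    dt = np.abs(t[:, None] - t[None, :])
    return sigma**2 * np.exp(-dt / tau)

def log_likelihood(delay, tA, yA, eA, tB, yB, eB, sigma=0.3, tau=150.0):
    """GP log-likelihood of both images merged onto one time axis at a trial delay."""
    t = np.concatenate([tA, tB - delay])                     # shift image B back by the delay
    y = np.concatenate([yA, yB - (yB.mean() - yA.mean())])   # crude magnitude-offset removal
    e = np.concatenate([eA, eB])
    order = np.argsort(t)
    t, y, e = t[order], y[order], e[order]
    C = drw_cov(t, sigma, tau) + np.diag(e**2)               # heteroskedastic errors on the diagonal
    cho = cho_factor(C, lower=True)
    r = y - y.mean()
    logdet = 2.0 * np.sum(np.log(np.diag(cho[0])))
    return -0.5 * (r @ cho_solve(cho, r) + logdet + r.size * np.log(2.0 * np.pi))

def intrinsic(t):
    """Toy smooth source variability standing in for a CARMA realization."""
    return 0.3 * np.sin(2 * np.pi * t / 300.0) + 0.2 * np.sin(2 * np.pi * t / 90.0 + 1.0)

true_delay = 50.0                                            # days, invented for the example
tA = np.sort(rng.uniform(0, 1000, 120)); eA = np.full(tA.size, 0.02)
tB = np.sort(rng.uniform(0, 1000, 120)); eB = np.full(tB.size, 0.02)
yA = intrinsic(tA) + rng.normal(0.0, eA)
yB = intrinsic(tB - true_delay) + 0.5 + rng.normal(0.0, eB)  # delayed, offset image

delays = np.linspace(0.0, 100.0, 101)
loglikes = [log_likelihood(d, tA, yA, eA, tB, yB, eB) for d in delays]
print("best trial delay (days):", delays[int(np.argmax(loglikes))])
```

In the paper, the grid scan and fixed kernel above are replaced by nested sampling over the full CARMA and delay parameters, which handles multimodal posteriors and also yields the Bayesian evidence used for model selection.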

 
NSF-PAR ID: 10421088
Publisher / Repository: DOI prefix 10.3847
Journal Name: The Astrophysical Journal
Volume: 950
Issue: 1
ISSN: 0004-637X
Format: Medium X; Article No. 37
Sponsoring Org: National Science Foundation
More Like this
  1. When strong gravitational lenses are to be used as an astrophysical or cosmological probe, models of their mass distributions are often needed. We present a new, time-efficient automation code for the uniform modeling of strongly lensed quasars with GLEE, a lens-modeling software for multiband data. By using the observed positions of the lensed quasars and the spatially extended surface brightness distribution of the host galaxy of the lensed quasar, we obtain a model of the mass distribution of the lens galaxy. We applied this uniform modeling pipeline to a sample of nine strongly lensed quasars for which images were obtained with the Wide Field Camera 3 of the Hubble Space Telescope. The models show well-reconstructed light components and a good alignment between mass and light centroids in most cases. We find that the automated modeling code significantly reduces the input time during the modeling process for the user. The time for preparing the required input files is reduced by a factor of 3, from ~3 h to about one hour. The active input time during the modeling process for the user is reduced by a factor of 10, from ~10 h to about one hour per lens system. This automated uniform modeling pipeline can efficiently produce uniform models of extensive lens-system samples that can be used for further cosmological analysis. A blind test that compared our results with those of an independent automated modeling pipeline based on the modeling software Lenstronomy revealed important lessons. Quantities such as the Einstein radius, astrometry, mass flattening, and position angle are generally robustly determined. Other quantities, such as the radial slope of the mass density profile and the predicted time delays, depend crucially on the quality of the data and on the accuracy with which the point spread function is reconstructed. Better data and/or a more detailed analysis are necessary to elevate our automated models to cosmography grade. Nevertheless, our pipeline enables the quick selection of lenses for follow-up and further modeling, which significantly speeds up the construction of cosmography-grade models. This important step forward will help us take advantage of the several-orders-of-magnitude increase in the number of known lenses expected in the coming decade.
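For orientation, here is a small, hedged worked example (not taken from the paper) of one of the robustly determined quantities mentioned above, the Einstein radius, evaluated for a singular isothermal sphere lens via θ_E = 4π(σ_v/c)² D_ls/D_s. The cosmology, redshifts, and velocity dispersion are invented example values.

```python
# Hedged worked example, not taken from the paper: the Einstein radius of a
# singular isothermal sphere lens, theta_E = 4*pi*(sigma_v/c)**2 * D_ls/D_s,
# one of the quantities the blind test found to be robustly determined.
# The cosmology, redshifts, and velocity dispersion are invented example values.
import numpy as np
from astropy import units as u
from astropy.constants import c
from astropy.cosmology import FlatLambdaCDM

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)        # assumed flat LCDM cosmology
z_lens, z_src = 0.5, 2.0                     # assumed lens and source redshifts
sigma_v = 250 * u.km / u.s                   # assumed lens velocity dispersion

D_s = cosmo.angular_diameter_distance(z_src)
D_ls = cosmo.angular_diameter_distance_z1z2(z_lens, z_src)

theta_E_rad = 4.0 * np.pi * float((sigma_v / c).decompose())**2 * float((D_ls / D_s).decompose())
print(f"Einstein radius: {np.degrees(theta_E_rad) * 3600:.2f} arcsec")  # roughly 1 arcsec
```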

     
  2. Abstract. Mesoscale dynamics in the mesosphere and lower thermosphere (MLT) region have been difficult to study from either ground- or satellite-based observations. For understanding atmospheric coupling processes, important spatial scales at these altitudes range between tens and hundreds of kilometers in the horizontal plane. To date, this scale range has been challenging to observe, so structures are usually parameterized in global circulation models. The advent of multistatic specular meteor radar networks allows exploration of MLT mesoscale dynamics on these scales using an increased number of detections and a diversity of viewing angles inherent to multistatic networks. In this work, we introduce a four-dimensional wind field inversion method that makes use of Gaussian process regression (GPR), which is a nonparametric and Bayesian approach. The method takes measured projected wind velocities and prior distributions of the wind velocity as a function of space and time, specified by the user or estimated from the data, and produces posterior distributions for the wind velocity. The predictive posterior distribution is computed at user-chosen points of interest, which need not be regularly sampled. The main benefits of the GPR method include this non-gridded sampling, the built-in statistical uncertainty estimates, and the ability to horizontally resolve winds on relatively small scales. The performance of the GPR implementation has been evaluated on Monte Carlo simulations with known distributions using the same spatial and temporal sampling as 1 d of real meteor measurements. Based on the simulation results, we find that the GPR implementation is robust, providing wind fields that are statistically unbiased with statistical variances that depend on the geometry and are proportional to the prior velocity variances. A conservative and fast approach can be straightforwardly implemented by employing overestimated prior variances and distances, while a more robust but computationally intensive approach can be implemented by employing training and fitting of model hyperparameters. The latter GPR approach has been applied to a 24 h dataset and shown to compare well to previously used homogeneous and gradient methods. Small-scale features have reasonably low statistical uncertainties, implying geophysical wind-field structures on horizontal scales as small as 20–50 km. We suggest that this GPR approach forms a suitable method for MLT regional and weather studies.
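The core computation described above is standard Gaussian process regression. The minimal one-dimensional sketch below (illustrative only, not the paper's four-dimensional wind-field code) shows how measured values, per-measurement error variances, and a prior covariance yield a posterior mean and a built-in uncertainty estimate at non-gridded query points; the kernel, its hyperparameters, and the toy data are assumptions.

```python
# Minimal Gaussian process regression sketch (illustrative; not the paper's
# 4D wind-field code): given noisy measurements y at locations x, a prior
# covariance k, and per-measurement error variances, compute the posterior
# mean and standard deviation at arbitrary (non-gridded) query points.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def sq_exp(x1, x2, amp=10.0, length=30.0):
    """Squared-exponential prior covariance (amp in m/s, length in km); assumed values."""
    return amp**2 * np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / length**2)

rng = np.random.default_rng(1)
x = rng.uniform(0, 200, 40)                                  # measurement locations (km)
noise = np.full(x.size, 3.0)                                 # per-measurement error (m/s)
y = 15 * np.sin(2 * np.pi * x / 120) + rng.normal(0.0, noise)  # toy wind component

x_star = np.linspace(0, 200, 201)                            # query points need not be gridded
K = sq_exp(x, x) + np.diag(noise**2)                         # prior + heteroskedastic noise
Ks = sq_exp(x_star, x)
Kss = sq_exp(x_star, x_star)

cho = cho_factor(K, lower=True)
mean = Ks @ cho_solve(cho, y)                                # posterior mean
cov = Kss - Ks @ cho_solve(cho, Ks.T)                        # posterior covariance
std = np.sqrt(np.diag(cov))                                  # built-in uncertainty estimate
print(mean[:5], std[:5])
```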
  3. Abstract

    Parameters in climate models are usually calibrated manually, exploiting only small subsets of the available data. This precludes both optimal calibration and quantification of uncertainties. Traditional Bayesian calibration methods that allow uncertainty quantification are too expensive for climate models; they are also not robust in the presence of internal climate variability. For example, Markov chain Monte Carlo (MCMC) methods typically require a very large number of model runs and are sensitive to internal variability noise, rendering them infeasible for climate models. Here we demonstrate an approach to model calibration and uncertainty quantification that requires only a small number of model runs and can accommodate internal climate variability. The approach consists of three stages: (a) a calibration stage uses variants of ensemble Kalman inversion to calibrate a model by minimizing mismatches between model and data statistics; (b) an emulation stage emulates the parameter-to-data map with Gaussian processes (GP), using the model runs in the calibration stage for training; (c) a sampling stage approximates the Bayesian posterior distributions by sampling the GP emulator with MCMC. We demonstrate the feasibility and computational efficiency of this calibrate-emulate-sample (CES) approach in a perfect-model setting. Using an idealized general circulation model, we estimate parameters in a simple convection scheme from synthetic data generated with the model. The CES approach generates probability distributions of the parameters that are good approximations of the Bayesian posteriors, at a fraction of the computational cost usually required to obtain them. Sampling from this approximate posterior allows the generation of climate predictions with quantified parametric uncertainties.
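As a hedged sketch of stage (a) only, the snippet below runs a basic ensemble Kalman inversion loop on a toy forward model; the emulation and MCMC stages are omitted, and the forward map, ensemble size, and noise level are invented for the example rather than taken from the paper.

```python
# Sketch of stage (a) only: a basic ensemble Kalman inversion (EKI) update loop
# on a toy forward model. The GP emulation (b) and MCMC sampling (c) stages are
# omitted; the forward map, ensemble size, and settings are illustrative.
import numpy as np

rng = np.random.default_rng(2)

def forward(theta):
    """Toy parameter-to-data map G(theta): two summary statistics."""
    return np.array([theta[0] + theta[1]**2, 3.0 * theta[0] * theta[1]])

theta_true = np.array([1.0, 2.0])
Gamma = 0.1**2 * np.eye(2)                               # observational noise covariance
y_obs = forward(theta_true) + rng.multivariate_normal(np.zeros(2), Gamma)

J = 50                                                   # ensemble size
ensemble = rng.normal(0.0, 2.0, size=(J, 2))             # prior ensemble of parameters

for _ in range(20):                                      # EKI iterations
    G = np.array([forward(th) for th in ensemble])       # forward runs (later GP training data)
    th_mean, G_mean = ensemble.mean(0), G.mean(0)
    C_tg = (ensemble - th_mean).T @ (G - G_mean) / J     # parameter-data covariance
    C_gg = (G - G_mean).T @ (G - G_mean) / J             # data-data covariance
    K = C_tg @ np.linalg.inv(C_gg + Gamma)               # Kalman-type gain
    perturbed = y_obs + rng.multivariate_normal(np.zeros(2), Gamma, size=J)
    ensemble = ensemble + (perturbed - G) @ K.T          # update each ensemble member

print("EKI estimate:", ensemble.mean(0), "truth:", theta_true)
```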

     
  4. ABSTRACT Strongly lensed quasi-stellar objects (QSOs) are valuable probes of the Universe in numerous aspects. Two of these applications, reverberation mapping and measuring time delays for determining cosmological parameters, require the source QSOs to be variable with sufficient amplitude. In this paper, we forecast the number of strongly lensed QSOs with sufficient variability to be detected by the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST). The damped random walk model is employed to model the variability amplitude of lensed QSOs taken from a mock catalogue by Oguri & Marshall (2010). We expect 30–40 per cent of the mock lensed QSO sample, corresponding to ∼1000 systems, to exhibit variability detectable with LSST. A smaller subsample of 250 lensed QSOs will show larger variability of >0.15 mag for bright lensed images with i < 21 mag, allowing for monitoring with smaller telescopes. We discuss systematic uncertainties in the prediction by considering alternative prescriptions for the variability and the mock lens catalogue with respect to our fiducial model. Our study shows that a large-scale survey of lensed QSOs can be conducted for reverberation mapping and time delay measurements as a follow-up to LSST.
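The damped random walk mentioned above is an Ornstein-Uhlenbeck process that can be simulated exactly at irregular epochs. The sketch below (illustrative, not the paper's pipeline) draws one DRW light curve over an LSST-like baseline and checks whether its variability exceeds the 0.15 mag threshold discussed above; the structure-function amplitude and timescale are made-up example values, not taken from the mock catalogue.

```python
# Illustrative sketch (not the paper's code): simulate a damped-random-walk
# (DRW / Ornstein-Uhlenbeck) light curve at irregular epochs and check whether
# its peak-to-peak variability exceeds a 0.15 mag threshold. The SF_inf and tau
# values are invented examples.
import numpy as np

rng = np.random.default_rng(3)

def simulate_drw(t, sf_inf=0.2, tau=200.0, mean_mag=20.0):
    """Exact DRW simulation at irregular times t (days)."""
    sigma2 = sf_inf**2 / 2.0                       # asymptotic variance of the process
    mag = np.empty(t.size)
    mag[0] = mean_mag + rng.normal(0.0, np.sqrt(sigma2))
    for i in range(1, t.size):
        dt = t[i] - t[i - 1]
        decay = np.exp(-dt / tau)
        var = sigma2 * (1.0 - decay**2)            # conditional variance over the gap dt
        mag[i] = mean_mag + (mag[i - 1] - mean_mag) * decay + rng.normal(0.0, np.sqrt(var))
    return mag

t = np.sort(rng.uniform(0, 10 * 365.25, 900))      # ~10 yr of irregular visits
mag = simulate_drw(t)
amplitude = mag.max() - mag.min()
print(f"peak-to-peak variability: {amplitude:.2f} mag,",
      "detectable" if amplitude > 0.15 else "below threshold")
```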
  5. Abstract In the problem of spotlight mode airborne synthetic aperture radar (SAR) image formation, it is well known that data collected over a wide azimuthal angle violate the isotropic scattering property typically assumed. Many techniques have been proposed to account for this issue, including both full-aperture and sub-aperture methods based on filtering, regularized least squares, and Bayesian methods. A full-aperture method that uses a hierarchical Bayesian prior to incorporate appropriate speckle modeling and reduction was recently introduced to produce samples of the posterior density rather than a single image estimate. This uncertainty quantification information is more robust, as it can generate a variety of statistics for the scene. As proposed, however, the method was not well suited for large problems because the sampling was inefficient. Moreover, the method was not explicitly designed to mitigate the effects of the faulty isotropic scattering assumption. In this work we therefore propose a new sub-aperture SAR imaging method that uses a sparse Bayesian learning-type algorithm to more efficiently produce approximate posterior densities for each sub-aperture window. These estimates may be useful in and of themselves, or, when of interest, the statistics from these distributions can be combined to form a composite image. Furthermore, unlike the often-employed ℓp-regularized least squares methods, no user-defined parameters are required. Application-specific adjustments are made to reduce the typically burdensome runtime and storage requirements so that appropriately large images can be generated. Finally, this paper focuses on incorporating these techniques into the SAR image formation process, that is, starting from SAR phase history data, so that no additional processing errors are incurred. The advantages over existing SAR image formation methods are clearly demonstrated with numerical experiments using real-world data.
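For concreteness, the snippet below is a minimal sparse Bayesian learning iteration (Tipping-style evidence updates) on a toy linear model y = Φw + noise. It is only a generic SBL illustration, not the paper's SAR imaging algorithm, and the problem sizes, initializations, and thresholds are assumptions.

```python
# Minimal sparse Bayesian learning (SBL) sketch on a toy linear model
# y = Phi @ w + noise (illustrative; not the SAR imaging code from the paper).
import numpy as np

rng = np.random.default_rng(4)
N, M, K = 80, 120, 5                        # measurements, dictionary size, nonzeros (assumed)
Phi = rng.normal(size=(N, M)) / np.sqrt(N)
w_true = np.zeros(M)
w_true[rng.choice(M, K, replace=False)] = rng.normal(0.0, 3.0, K)
y = Phi @ w_true + rng.normal(0.0, 0.01, N)

alpha = np.ones(M)                          # per-coefficient precisions (sparsity prior)
beta = 100.0                                # noise precision, re-estimated below
for _ in range(200):
    Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))   # posterior covariance
    mu = beta * Sigma @ Phi.T @ y                                # posterior mean
    gamma = 1.0 - alpha * np.diag(Sigma)                         # "well-determined" measure
    alpha = np.minimum(gamma / (mu**2 + 1e-12), 1e10)            # re-estimate precisions
    beta = (N - gamma.sum()) / (np.linalg.norm(y - Phi @ mu)**2 + 1e-12)

print("recovered support:", np.sort(np.where(mu**2 > 1e-3)[0]))
print("true support:     ", np.sort(np.where(w_true != 0)[0]))
```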