Search for: All records

Creators/Authors contains: "Hartley, W G"


  1. ABSTRACT

    In this work, we explore the possibility of applying machine learning methods designed for 1D problems to the task of galaxy image classification. The algorithms typically used for image classification rely on multiple costly steps, such as point spread function deconvolution and the training and application of complex convolutional neural networks with thousands or even millions of parameters. In our approach, we extract features from the galaxy images by analysing the elliptical isophotes in their light distribution and collect the information in a sequence. The sequences obtained with this method present definite features that allow a direct distinction between galaxy types. We then train on and classify the sequences with machine learning algorithms designed through the Modulos AutoML platform. As a demonstration of this method, we use the second public data release of the Dark Energy Survey (DES DR2). We show that we are able to successfully distinguish between early-type and late-type galaxies for images with signal-to-noise ratio greater than 300. This yields an accuracy of 86 per cent for the early-type galaxies and 93 per cent for the late-type galaxies, which is on par with most contemporary automated image classification approaches. The data dimensionality reduction of our novel method significantly lowers the computational cost of classification. Looking ahead to future data sets from e.g. Euclid and the Vera Rubin Observatory, this work represents a path towards using a well-tested and widely used industry platform to efficiently tackle galaxy classification problems at the petabyte scale.
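The isophote-to-sequence idea can be illustrated with a minimal sketch (a toy stand-in, not the paper's pipeline): reduce a galaxy image to a 1D sequence of mean intensities in successive elliptical annuli of fixed centre, ellipticity, and orientation. The synthetic exponential-disc image, the ring count, and all parameter values below are illustrative assumptions.

```python
import numpy as np

def isophote_sequence(image, x0, y0, ellipticity, theta, n_rings=20, max_sma=30.0):
    """Reduce a galaxy image to a 1D sequence: the mean intensity inside
    successive elliptical annuli with fixed centre, shape, and orientation."""
    ny, nx = image.shape
    y, x = np.mgrid[0:ny, 0:nx]
    dx, dy = x - x0, y - y0
    # Rotate pixel offsets into the ellipse frame
    xr = dx * np.cos(theta) + dy * np.sin(theta)
    yr = -dx * np.sin(theta) + dy * np.cos(theta)
    # Elliptical radius in semi-major-axis units
    r = np.sqrt(xr**2 + (yr / (1.0 - ellipticity))**2)
    edges = np.linspace(0.0, max_sma, n_rings + 1)
    return np.array([image[(r >= lo) & (r < hi)].mean()
                     for lo, hi in zip(edges[:-1], edges[1:])])

# Synthetic exponential-disc "galaxy" as a stand-in for a survey cutout
ny = nx = 64
y, x = np.mgrid[0:ny, 0:nx]
radius = np.hypot(x - 32, y - 32)
image = np.exp(-radius / 5.0)

seq = isophote_sequence(image, 32.0, 32.0, ellipticity=0.0, theta=0.0)
# For this smooth profile the sequence declines monotonically outward;
# real early- vs late-type galaxies would imprint different shapes on it.
```

The resulting 20-element sequence, rather than the 64 × 64 image, would then be fed to a 1D classifier, which is the dimensionality reduction the abstract refers to.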

  2. We measure the impact of source galaxy clustering on higher-order summary statistics of weak gravitational lensing data. By comparing simulated data with galaxies that either trace or do not trace the underlying density field, we show that this effect can exceed measurement uncertainties for common higher-order statistics for certain analysis choices. We evaluate the impact on different weak lensing observables, finding that third moments and wavelet phase harmonics are more affected than peak count statistics. Using Dark Energy Survey (DES) Year 3 (Y3) data, we construct null tests for the source-clustering-free case, finding a p-value of p = 4 × 10⁻³ (2.6σ) using third-order map moments and p = 3 × 10⁻¹¹ (6.5σ) using wavelet phase harmonics. The impact of source clustering on cosmological inference can be either included in the model or minimized through ad hoc procedures (e.g. scale cuts). We verify that the procedures adopted in existing DES Y3 cosmological analyses were sufficient to render this effect negligible. Failing to account for source clustering can significantly impact cosmological inference from higher-order gravitational lensing statistics, e.g. higher-order N-point functions, wavelet-moment observables, and deep learning or field-level summary statistics of weak lensing maps.
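The shape of such a null test can be sketched in a toy form (not the DES Y3 measurement): compute a map's third central moment, build the statistic's null distribution from noise-only realizations, and quote the exceedance fraction as a p-value. The map sizes, the injected non-Gaussian component, and the realization count are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def third_moment(kappa):
    """Third central moment of a map: <(kappa - <kappa>)^3>."""
    d = kappa - kappa.mean()
    return np.mean(d**3)

# Toy "data" map with a deliberately skewed (non-Gaussian) component
g = rng.normal(0.0, 1.0, (128, 128))
h = rng.normal(0.0, 1.0, (128, 128))
data_map = g + 0.5 * h**2
data_stat = third_moment(data_map)

# Null distribution: the same statistic on Gaussian-noise-only maps
null_stats = np.array([third_moment(rng.normal(0.0, 1.0, (128, 128)))
                       for _ in range(200)])

# Two-sided p-value: fraction of null realizations at least as extreme
p = np.mean(np.abs(null_stats) >= np.abs(data_stat))
```

A small p indicates the data are inconsistent with the source-clustering-free null, which is the logic behind the quoted 2.6σ and 6.5σ results.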
  3. ABSTRACT

    Wide-field surveys probe clustered scalar fields – such as galaxy counts, lensing potential, etc. – which are sensitive to different cosmological and astrophysical processes. Constraining such processes depends on the statistics that summarize the field. We explore the cumulative distribution function (CDF) as a summary of the galaxy lensing convergence field. Using a suite of N-body light-cone simulations, we show the CDFs' constraining power is modestly better than that of the second and third moments, as CDFs approximately capture information from all moments. We study the practical aspects of applying CDFs to data, using the Dark Energy Survey (DES Y3) data as an example, and compute the impact of different systematics on the CDFs. The contributions from the point spread function and the reduced shear approximation are ≲1 per cent of the total signal. Source clustering effects and baryon imprints contribute 1–10 per cent. Enforcing scale cuts to limit systematics-driven biases in parameter constraints degrades these constraints by a noticeable amount, and this degradation is similar for the CDFs and the moments. We detect correlations between the observed convergence field and the shape noise field at 13σ. The non-Gaussian correlations in the noise field must be modelled accurately in order to use the CDFs, or other statistics sensitive to all moments, as a rigorous cosmology tool.
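The CDF summary itself is simple to state: for each of a set of thresholds, record the fraction of map pixels below that threshold. A minimal sketch, with a toy Gaussian map standing in for a real convergence field and illustrative threshold values:

```python
import numpy as np

def map_cdf(kappa, thresholds):
    """CDF summary of a convergence map: for each threshold t, the
    fraction of pixels with kappa < t."""
    kappa = kappa.ravel()
    return np.array([np.mean(kappa < t) for t in thresholds])

rng = np.random.default_rng(1)
kappa = rng.normal(0.0, 0.02, (256, 256))      # toy Gaussian convergence map
thresholds = np.linspace(-0.05, 0.05, 11)
cdf = map_cdf(kappa, thresholds)
# cdf rises monotonically from near 0 to near 1 across the thresholds
```

In practice this vector would be measured at several smoothing scales and compared to simulated predictions; because the CDF is sensitive to all moments, it inherits the noise-modelling requirement discussed above.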

  4. ABSTRACT

    We present an alternative calibration of the MagLim lens sample redshift distributions from the Dark Energy Survey (DES) first three years of data (Y3). The new calibration is based on a combination of a self-organizing-map-based scheme and clustering redshifts to estimate redshift distributions and their inherent uncertainties, and is expected to be more accurate than the original DES Y3 redshift calibration of the lens sample. We describe the methodology in detail, validate it on simulations, and discuss the main effects dominating our error budget. The new calibration is in fair agreement with the fiducial DES Y3 n(z) calibration, with only mild differences (<3σ) in the means and widths of the distributions. We study the impact of this new calibration on cosmological constraints, analysing DES Y3 galaxy clustering and galaxy–galaxy lensing measurements and assuming a Lambda cold dark matter cosmology. We obtain Ωm = 0.30 ± 0.04, σ8 = 0.81 ± 0.07, and S8 = 0.81 ± 0.04, which implies a ∼0.4σ shift in the Ωm–S8 plane compared to the fiducial DES Y3 results, highlighting the importance of the redshift calibration of the lens sample in multiprobe cosmological analyses.
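The self-organizing-map (SOM) step can be sketched in miniature. This is a generic SOM training loop, not the DES calibration code; the grid size, learning-rate schedule, and toy two-feature "photometry" are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def train_som(data, grid=(8, 8), n_iter=2000, lr0=0.5, sigma0=3.0):
    """Minimal one-sample-at-a-time SOM: map galaxies' photometric
    features onto a 2D grid of cells so each cell groups similar galaxies."""
    n_cells = grid[0] * grid[1]
    weights = rng.normal(size=(n_cells, data.shape[1]))
    gy, gx = np.divmod(np.arange(n_cells), grid[1])
    coords = np.stack([gy, gx], axis=1).astype(float)   # cell positions on the grid
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(np.sum((weights - x) ** 2, axis=1))  # best-matching cell
        frac = t / n_iter
        lr = lr0 * (1.0 - frac)                  # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5      # shrinking neighbourhood
        dist2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        h = np.exp(-dist2 / (2.0 * sigma ** 2))  # neighbourhood kernel
        weights += lr * h[:, None] * (x - weights)
    return weights

# Toy "photometry": two features for 1000 galaxies
data = rng.normal(size=(1000, 2))
weights = train_som(data)
# Assign each galaxy to its cell; in a calibration scheme, per-cell
# redshift information would then be stacked, weighted by the lens
# sample's cell occupancy, to estimate n(z).
cells = np.argmin(((data[:, None, :] - weights[None, :, :]) ** 2).sum(-1), axis=1)
```

The key property exploited by such schemes is that galaxies landing in the same cell have similar photometry, so redshift information from a calibration sample in that cell can be transferred to the full lens sample.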

  5. ABSTRACT

    We present a method for mapping variations between probability distribution functions and apply it to measuring galaxy redshift distributions from imaging survey data. The method, which we name PITPZ for the probability integral transformations it relies on, uses the difference between distribution-function curves in an ensemble as a transformation to apply to another distribution function, thus transferring the variation in the ensemble to the latter. This procedure is broadly applicable to the problem of uncertainty propagation. In the context of redshift distributions, for example, the uncertainty contribution due to certain effects can be studied effectively only in simulations, necessitating a transfer of the variation measured in simulations to the redshift distributions measured from data. We illustrate the use of PITPZ by propagating photometric calibration uncertainty to the redshift distributions of the Dark Energy Survey Year 3 weak lensing source galaxies. For this test case, we find that PITPZ yields a lensing amplitude uncertainty estimate due to photometric calibration error within 1 per cent of the truth, compared to as much as a 30 per cent underestimate when using traditional methods.
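The core idea can be sketched with samples (a simplified toy, not the paper's PITPZ implementation): measure the quantile-function difference between a fiducial and a variant distribution in the simulated ensemble, then add that difference to each target sample at the sample's own probability-integral-transform (PIT) value. All distributions and shift sizes below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def pit_transfer(target, fid, variant, n_q=101):
    """Transfer the variation between two distributions (fiducial vs.
    variant, given as samples) onto a target sample set: shift each
    target sample by the quantile-function difference evaluated at that
    sample's PIT value under the target's own empirical CDF."""
    q = np.linspace(0.005, 0.995, n_q)
    shift = np.quantile(variant, q) - np.quantile(fid, q)
    pit = (np.argsort(np.argsort(target)) + 0.5) / len(target)  # empirical PIT
    return target + np.interp(pit, q, shift)

# Toy ensemble: the simulated variant shows a +0.01 shift in the mean.
fid = rng.normal(0.60, 0.10, 5000)
variant = fid + 0.01
# Target: redshift samples "measured from data", with a different shape.
target = rng.normal(0.55, 0.12, 5000)
shifted = pit_transfer(target, fid, variant)
# shifted inherits the ensemble's +0.01 mean shift while keeping the
# target's own distribution shape.
```

Because the shift is applied quantile by quantile rather than as a single global offset, more general ensemble variations (width changes, asymmetries) transfer to the target as well, which is what makes the construction useful for uncertainty propagation.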
