
This content will become publicly available on March 15, 2023

Title: Dark Matter In Extreme Astrophysical Environments
Exploring dark matter via observations of extreme astrophysical environments -- defined here as heavy compact objects such as white dwarfs, neutron stars, and black holes, as well as supernovae and compact object merger events -- has been a major field of growth since the last Snowmass process. Theoretical work has highlighted the utility of current and near-future observatories to constrain novel dark matter parameter space across the full mass range. This includes gravitational wave instruments and observatories spanning the electromagnetic spectrum, from radio to gamma-rays. While recent searches already provide leading sensitivity to various dark matter models, this work also highlights the need for theoretical astrophysics research to better constrain the properties of these extreme astrophysical systems. The unique potential of these search signatures to probe dark matter adds motivation to proposed next-generation astronomical and gravitational wave instruments. Note: Contribution to Snowmass 2021 -- CF3. Dark Matter: Cosmic Probes
Authors:
; ; ; ; ; ; ; ; ; ; ; ; ; ; ; ;
Award ID(s):
2014215
Publication Date:
NSF-PAR ID:
10352562
Journal Name:
arXiv.org
Volume:
arXiv:2203.07984
ISSN:
2331-8422
Sponsoring Org:
National Science Foundation
More Like this
  1. ABSTRACT Joint multimessenger observations with gravitational-wave and electromagnetic (EM) data offer new insights into the astrophysical study of compact objects. The third Advanced LIGO and Advanced Virgo observing run began on 2019 April 1; during its 11 months of observation, there were 14 compact binary candidates for which at least one component is potentially a neutron star. Although intensive follow-up campaigns involving tens of ground- and space-based observatories searched for counterparts, no EM counterpart has been detected. Following on a previous study of the first six months of the campaign, we present in this paper the next five months, from 2019 October to 2020 March. We highlight two neutron star–black hole candidates (S191205ah and S200105ae), two binary neutron star candidates (S191213g and S200213t), and a binary merger with a possible neutron star and a ‘MassGap’ component, S200115j. Assuming that the gravitational-wave (GW) candidates are of astrophysical origin and that their locations were covered by optical telescopes, we derive possible constraints on the matter ejected during the events based on the non-detection of counterparts. We find that the follow-up observations during the second half of the third observing run did not reach the sensitivity necessary to constrain the source properties of the potential GW candidates. Consequently, we suggest that different strategies are needed to make better use of the available telescope time. We examine different choices for follow-up surveys, trading sky-localization coverage against observational depth, to understand the likelihood of counterpart detection.
  2. Abstract

    The two interferometric LIGO gravitational-wave observatories provide the most sensitive data to date for studying the gravitational-wave universe. As part of a global network, they have completed their third observing run, in which they observed many tens of signals from merging compact binary systems. It has long been known that a limiting factor in identifying transient gravitational-wave signals is the presence of transient non-Gaussian noise, which reduces the ability of astrophysical searches to detect signals confidently. Significant efforts are made to identify and mitigate this noise at its source, but its presence persists, leading to the need for software solutions. Taking a set of transient noise artefacts categorised by the GravitySpy software during the O3a observing era, we produce parameterised population models of the noise projected into the space of astrophysical model parameters of merging binary systems. We compare the inferred population properties of transient noise artefacts with observed astrophysical systems from the GWTC-2.1 catalogue. We find that while the population of astrophysical systems tends to have near-equal masses and moderate spins, transient noise artefacts are typically characterised by extreme mass ratios and large spins. This work provides a new method to calculate the consistency of an observed candidate with a given class of noise artefacts. This approach could be used in assessing the consistency of candidates found by astrophysical searches (i.e. determining whether they are consistent with a known glitch class). Furthermore, the approach could be incorporated into astrophysical searches directly, potentially improving the reach of the detectors, though only a detailed study would verify this.

  3. ABSTRACT

    Primordial black holes (PBHs) are dark matter candidates spanning broad mass ranges from ∼10⁻¹⁷ M⊙ to ∼100 M⊙. We show that the stochastic gravitational wave background can be a powerful window for the detection of subsolar-mass PBHs and can shed light on their formation channel via third-generation gravitational wave detectors such as Cosmic Explorer and the Einstein Telescope. By using the mass distribution of the compact objects and the redshift evolution of the merger rates, we can distinguish astrophysical sources from PBHs and will be able to constrain the fraction of subsolar-mass PBHs (≤1 M⊙) in the form of dark matter to f_PBH ≤ 1 per cent at 68 per cent C.L., even for a pessimistic value of the binary suppression factor. In the absence of any suppression of the merger rate, the constraint on f_PBH tightens to below 0.001 per cent. Furthermore, we will be able to measure the redshift evolution of the PBH merger rate with about 1 per cent accuracy, making it possible to uniquely distinguish between the Poisson and clustered PBH scenarios.
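
    To see why the redshift evolution of the merger rate is a useful discriminant: in the standard early-universe (Poisson) PBH scenario, the merger rate scales with cosmic time roughly as t^(−34/37) and so rises monotonically with redshift, whereas astrophysical binaries broadly track the star-formation history, which peaks near z ≈ 2. The following minimal sketch contrasts these two shapes under a toy flat ΛCDM cosmology; the parameter values and the Madau–Dickinson proxy are illustrative assumptions, not the analysis of the paper.

    ```python
    import numpy as np
    from scipy.integrate import quad

    # Toy flat LCDM cosmology (illustrative parameter values only)
    H0 = 70.0 / 3.086e19          # Hubble constant in s^-1 (70 km/s/Mpc)
    Om, OL = 0.3, 0.7             # matter and dark-energy density fractions

    def hubble(z):
        return H0 * np.sqrt(Om * (1 + z) ** 3 + OL)

    def cosmic_time(z):
        """Age of the universe at redshift z (seconds), by numerical integration."""
        t, _ = quad(lambda zp: 1.0 / ((1 + zp) * hubble(zp)), z, np.inf)
        return t

    def rate_pbh(z):
        """Early-universe PBH binaries: R(t) ~ t^(-34/37) in the standard Poisson scenario."""
        return cosmic_time(z) ** (-34.0 / 37.0)

    def rate_astro(z):
        """Astrophysical proxy: Madau-Dickinson star-formation history (no delay times)."""
        return (1 + z) ** 2.7 / (1 + ((1 + z) / 2.9) ** 5.6)

    # Compare the two shapes, normalised to their local (z = 0) values:
    # the PBH rate keeps rising with z, the astrophysical rate peaks near z ~ 2.
    for z in (0, 2, 6, 10):
        print(z, rate_pbh(z) / rate_pbh(0), rate_astro(z) / rate_astro(0))
    ```

    In this toy comparison the PBH rate is still growing at redshifts where the star-formation-tracking rate has already turned over, which is the qualitative handle the abstract refers to.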

  4. Abstract

    Gravitational-wave (GW) radiation from a coalescing compact binary is a standard siren, as the luminosity distance of each event can be directly measured from the amplitude of the signal. One possibility to constrain cosmology using the GW siren is to perform statistical inference on a population of binary black hole (BBH) events. In essence, this statistical method can be viewed as follows. We can modify the shape of the distribution of observed BBH events by changing the cosmological parameters until it eventually matches the distribution constructed from an astrophysical population model, thereby allowing us to determine the cosmological parameters. In this work, we derive the Cramér–Rao bound for both cosmological parameters and those governing the astrophysical population model from this statistical dark siren method by examining the Fisher information contained in the event distribution. Our study provides analytical insights and enables fast yet accurate estimations of the statistical accuracy of dark siren cosmology. Furthermore, we consider the bias in cosmology due to unmodeled substructures in the merger rate and mass distribution. We find that a 1% deviation in the astrophysical model can lead to a more than 1% error in the Hubble constant. This could limit the accuracy of dark siren cosmology when there are more than 10⁴ BBH events detected.
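
    As a toy illustration of the Fisher-information route to a Cramér–Rao bound, consider a deliberately simplified one-parameter problem: estimating H0 from N low-redshift sirens whose luminosity distances d_L ≈ cz/H0 carry Gaussian errors. All numbers below are illustrative assumptions; the paper's calculation covers the full cosmological and population parameter space, not this reduced model.

    ```python
    import numpy as np

    # One-parameter Cramér-Rao sketch: N sirens at known low redshifts z_i,
    # with measured distances d_i = c*z_i/H0 + Gaussian noise of width sigma_d.
    # (Illustrative numbers; known-redshift sirens are "bright", not "dark".)
    c = 299792.458            # speed of light, km/s
    H0_true = 70.0            # fiducial Hubble constant, km/s/Mpc
    sigma_d = 100.0           # per-event distance uncertainty, Mpc

    rng = np.random.default_rng(0)
    z = rng.uniform(0.01, 0.1, size=1000)   # 1000 toy events

    # Fisher information for Gaussian errors: I(H0) = sum_i (dd_L/dH0)^2 / sigma_d^2,
    # with d_L = c*z/H0  =>  dd_L/dH0 = -c*z/H0^2.
    dddH = -c * z / H0_true**2
    fisher = np.sum(dddH**2) / sigma_d**2

    # Cramér-Rao lower bound on the std of any unbiased H0 estimator.
    crlb = 1.0 / np.sqrt(fisher)
    print(f"Cramér-Rao bound on H0: {crlb:.3f} km/s/Mpc")
    ```

    The bound shrinks as 1/sqrt(N) with the number of events, which is why forecasts of this kind improve steadily as catalogues grow; the event-distribution Fisher matrix in the paper generalises this single derivative to the full set of cosmological and population parameters.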

  5. Doglioni, C. ; Kim, D. ; Stewart, G.A. ; Silvestris, L. ; Jackson, P. ; Kamleh, W. (Ed.)
    For many scientific projects, data management is an increasingly complicated challenge. The number of data-intensive instruments generating unprecedented volumes of data is growing, and their accompanying workflows are becoming more complex. Their storage and computing resources are heterogeneous and are distributed at numerous geographical locations belonging to different administrative domains and organisations. These locations do not necessarily coincide with the places where data is produced, nor where it is stored, analysed by researchers, or archived for safe long-term storage. To fulfil these needs, the data management system Rucio has been developed to allow the high-energy physics experiment ATLAS at the LHC to manage its large volumes of data in an efficient and scalable way. But ATLAS is not alone, and several diverse scientific projects have started evaluating, adopting, and adapting the Rucio system for their own needs. As the Rucio community has grown, many improvements have been introduced, customisations have been added, and many bugs have been fixed. Additionally, new dataflows have been investigated and operational experiences have been documented. In this article we collect and compare the common successes, pitfalls, and oddities that arose in the evaluation efforts of multiple diverse experiments, and compare them with the ATLAS experience. This includes the high-energy physics experiments Belle II and CMS, the neutrino experiment DUNE, the scattering radar experiment EISCAT3D, the gravitational wave observatories LIGO and VIRGO, the SKA radio telescope, and the dark matter search experiment XENON.