

Title: A Fisher matrix for gravitational-wave population inference
ABSTRACT

We derive a Fisher matrix for the parameters characterizing a population of gravitational-wave events. This provides a guide to the precision with which population parameters can be estimated with multiple observations, which becomes increasingly accurate as the number of events and the signal-to-noise ratio of the sampled events increase. The formalism takes into account individual event measurement uncertainties and selection effects, and can be applied to arbitrary population models. We illustrate the framework with two examples: an analytical calculation of the Fisher matrix for the mean and variance of a Gaussian model describing a population affected by selection effects, and an estimation of the precision with which the slope of a power-law distribution of supermassive black hole masses can be measured using extreme-mass-ratio inspiral observations. We compare the Fisher predictions to results from Monte Carlo analyses, finding very good agreement.
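The Gaussian-with-selection example from the abstract can be sketched numerically. The snippet below is an illustrative toy of our own construction (not the paper's code): it computes the per-event Fisher matrix for the mean and standard deviation of a Gaussian population that is observed only above a hard detection threshold, a stand-in for selection effects.

```python
import numpy as np

def _trapz(y, x):
    # simple trapezoid rule (avoids NumPy 1.x/2.x naming differences)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def truncated_gaussian_fisher(mu, sigma, x_th, n_grid=20001, eps=1e-5):
    """Per-event Fisher matrix for (mu, sigma) of a Gaussian population
    observed only above a hard threshold x_th (a toy selection effect)."""
    x = np.linspace(x_th, mu + 12.0 * sigma, n_grid)

    def log_pdet(theta):
        m, s = theta
        lp = -0.5 * ((x - m) / s) ** 2 - np.log(s)
        return lp - np.log(_trapz(np.exp(lp), x))  # normalise over x > x_th

    theta0 = np.array([mu, sigma])
    scores = []
    for i in range(2):
        tp, tm = theta0.copy(), theta0.copy()
        tp[i] += eps
        tm[i] -= eps
        scores.append((log_pdet(tp) - log_pdet(tm)) / (2.0 * eps))

    # Fisher matrix = expectation of the score outer product under the
    # selection-truncated density
    w = np.exp(log_pdet(theta0))
    return np.array([[_trapz(scores[i] * scores[j] * w, x) for j in range(2)]
                     for i in range(2)])
```

In the weak-selection limit (threshold far below the mean) this recovers the textbook values F = diag(1/σ², 2/σ²), and the forecast covariance for a catalogue of N events is F⁻¹/N. Moving the threshold up toward the mean visibly degrades the forecast precision, which is the qualitative effect the paper quantifies.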

 
Award ID(s): 2207502
NSF-PAR ID: 10390221
Publisher / Repository: Oxford University Press
Journal Name: Monthly Notices of the Royal Astronomical Society
Volume: 519
Issue: 2
ISSN: 0035-8711
Page Range / eLocation ID: p. 2736-2753
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

    Ecological analyses typically involve many interacting variables. Ecologists often specify lagged interactions in community dynamics (i.e. vector‐autoregressive models) or simultaneous interactions (e.g. structural equation models), but there is less familiarity with dynamic structural equation models (DSEM) that can include any simultaneous or lagged effect in multivariate time‐series analysis.

    We propose a novel approach to parameter estimation for DSEM, which involves constructing a Gaussian Markov random field (GMRF) representing simultaneous and lagged path coefficients, and then fitting this as a generalized linear mixed model (GLMM) to data that may be missing and/or non‐normal. We provide a new R package, dsem, which extends the ‘arrow interface’ from path analysis to represent user‐specified lags when constructing the GMRF. We also outline how the resulting nonseparable precision matrix can generalize existing separable models, for example, for time‐series and species interactions in a vector‐autoregressive model.

    We first demonstrate dsem by simulating a two‐species vector‐autoregressive model based on wolf–moose interactions on Isle Royale. We show that DSEM has improved precision when data are missing, relative to a conventional dynamic linear model. We then demonstrate DSEM via two contrasting case studies. The first identifies a trophic cascade in the California Current from 1999 to 2018, where a decline in sunflower starfish has increased urchin densities and decreased kelp densities, while sea otters have a simultaneous positive effect on kelp. The second estimates how declining sea ice has decreased cold‐water habitats, driving a decrease in the density of fall copepod prey and inhibiting early‐life survival for Alaska pollock from 1963 to 2023.

    We conclude that DSEM can be fitted efficiently as a GLMM involving missing data, while allowing users to specify both simultaneous and lagged effects in a time‐series structural model. DSEM then allows conceptual models (developed with stakeholder input or from ecological expertise) to be fitted to incomplete time series, and provides a simple interface for granular control over the number of estimated time‐series parameters. Finally, the computational methods are sufficiently simple that DSEM can be embedded as a component within larger (e.g. integrated population) models. We therefore recommend greater exploration and performance testing for DSEM relative to familiar time‐series forecasting methods.
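As a minimal illustration of the lagged-interaction idea (a numpy sketch, not the dsem package itself), one can simulate a two-species VAR(1) process loosely inspired by the wolf–moose example and recover its path coefficients by least squares; the interaction matrix below is invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical VAR(1) path-coefficient matrix (made up for the sketch):
# rows give the lagged effects on [wolf, moose] densities.
B = np.array([[0.6, 0.3],     # wolves persist; more moose -> more wolves
              [-0.2, 0.8]])   # more wolves -> fewer moose; moose persist
T = 2000
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = B @ x[t - 1] + rng.normal(0.0, 0.1, size=2)

# Recover the lagged path coefficients by least squares: x_t ≈ B x_{t-1}
X, Y = x[:-1], x[1:]
B_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(np.round(B_hat, 2))
```

A real DSEM additionally handles missing observations, simultaneous (lag-0) paths, and non-normal responses through the GMRF/GLMM machinery described above, which plain least squares cannot.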

     
  2. ABSTRACT

    Line intensity mapping (LIM) experiments probing the nearby Universe can expect a considerable amount of cosmic infrared background (CIB) continuum emission from near- and far-infrared galaxies. For the purpose of using LIM to constrain the star formation rate (SFR), we argue that the CIB continuum – traditionally treated as contamination – can be combined with the LIM signal to enhance the achievable SFR constraints. We first present a power spectrum model that combines continuum and line emissions assuming a common SFR model. We subsequently analyse the effectiveness of the joint model in the context of the EXperiment for Cryogenic Large-Aperture Intensity Mapping (EXCLAIM), which utilizes the $[{\rm C\, \small {II}}]$ emission line to study the SFR. We numerically compute the theoretical power spectra according to our model and the EXCLAIM survey specifics, and perform a Fisher analysis to forecast the SFR constraints. We find that although the joint model has no considerable advantage over LIM alone at the current survey level of EXCLAIM, its effects become significant when we consider the more optimistic values of survey resolution and angular span expected of future LIM experiments. We show that the CIB is not only an additional SFR-sensitive signal, but also serves to break the SFR parameter degeneracy that naturally emerges from the $[{\rm C\, \small {II}}]$ Fisher matrix. For this reason, addition of the CIB will allow improvements in the survey parameters to be better reflected in the SFR constraints, and can be effectively utilized by future LIM experiments.
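The degeneracy-breaking argument can be made concrete with a deliberately simple toy Fisher calculation (ours, far simpler than the paper's model): suppose the line power constrains only the product of two SFR parameters, while the continuum constrains one of them directly.

```python
import numpy as np

# Toy SFR parameters (amplitude a, slope b) and fractional noise levels
# -- all invented for illustration.
a0, b0 = 1.0, 1.0
sig_line, sig_cib = 0.05, 0.10

# Line power depends only on the product a*b; CIB continuum depends on a.
# Fisher matrices are outer products of model gradients, weighted 1/sigma^2.
d_line = np.array([b0, a0]) / sig_line   # gradient of a*b w.r.t. (a, b)
d_cib = np.array([1.0, 0.0]) / sig_cib   # gradient of a   w.r.t. (a, b)

F_line = np.outer(d_line, d_line)          # rank 1: a and b fully degenerate
F_joint = F_line + np.outer(d_cib, d_cib)  # invertible: degeneracy broken

sigma_b = np.sqrt(np.linalg.inv(F_joint)[1, 1])
print(np.linalg.matrix_rank(F_line), round(float(sigma_b), 3))
```

Alone, the line Fisher matrix is singular, so neither parameter is constrained individually; adding the continuum information makes the joint matrix invertible, which is the mechanism by which the CIB lets survey improvements propagate into the SFR constraints.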

     
  3. Abstract

    A new inductively coupled plasma atomic emission spectrometry (ICP‐AES) method is presented for rapid and routine analysis of Sr/Ca molar ratios in seawater, with a long‐term precision of < 0.2%. It is an adaptation of a method widely employed for the analysis of coral aragonite Sr/Ca ratios in marine paleothermometry studies, which is based on the assumption that the seawater Sr/Ca ratio is constant in space and time. While prior studies have shown variations of up to 1% with depth, smaller variations at the ocean surface are generally accounted for via empirical, species‐specific calibrations of coral Sr/Ca vs. temperature. We found Sr/Ca variations in some coastal waters to be even larger, with distinct periodicity, complicating this approach. Although the high precision necessary for measurements of seawater Sr/Ca has previously relied on advanced mass spectrometry, long analysis times, and expensive isotopic spikes, our method uses more accessible instrumentation and is both time‐ and cost‐saving. The intricate composition of seawater, relative to coral aragonite solutions, requires an intensity ratio calibration technique combined with rigorous normalization to a suitable seawater standard. Key aspects of our method are discussed, including the choice of wavelengths, instrument parameters, accuracy, precision, and matrix effects. Special attention is given to the need for a certified seawater Sr/Ca reference standard, which does not presently exist. Analytical validation is provided by concurrent sharp gradients in Sr/Ca and δ¹⁸O, coinciding with the Florida landfall of Hurricane Irma, as recorded at near‐daily resolution in a continuous seawater sample collected with an osmotic pump.
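One common way to implement the drift normalization described above is to bracket each sample with measurements of a consistency standard; the arithmetic reduces to a ratio-of-ratios. The sketch below uses entirely invented numbers and a hypothetical in-house standard value, purely to illustrate the bookkeeping.

```python
# Bracketing-standard normalization for Sr/Ca intensity ratios.
# All numbers are invented for illustration; real standard values and
# emission wavelengths are instrument- and lab-specific.
STD_SR_CA = 8.60e-3   # assumed Sr/Ca (mol/mol) of a hypothetical standard

std_before, std_after = 0.4502, 0.4510   # standard Sr/Ca intensity ratios
sample_intensity_ratio = 0.4475          # sample Sr/Ca intensity ratio

# Divide by the bracketing-standard mean to remove instrumental drift,
# then scale by the standard's assigned Sr/Ca value.
drift_corrected = sample_intensity_ratio / ((std_before + std_after) / 2.0)
sr_ca = drift_corrected * STD_SR_CA      # mol/mol
print(f"{sr_ca * 1e3:.4f} mmol/mol")
```

Working in intensity *ratios* (Sr line over Ca line) rather than raw intensities is what suppresses matrix effects and plasma fluctuations common to both elements; the bracketing standard then removes slower drift.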

     
  4. Abstract

    Gravitational-wave (GW) radiation from a coalescing compact binary is a standard siren, as the luminosity distance of each event can be directly measured from the amplitude of the signal. One possibility to constrain cosmology using GW sirens is to perform statistical inference on a population of binary black hole (BBH) events. In essence, this statistical method can be viewed as follows. We can modify the shape of the distribution of observed BBH events by changing the cosmological parameters until it eventually matches the distribution constructed from an astrophysical population model, thereby allowing us to determine the cosmological parameters. In this work, we derive the Cramér–Rao bound for both the cosmological parameters and those governing the astrophysical population model from this statistical dark siren method by examining the Fisher information contained in the event distribution. Our study provides analytical insights and enables fast yet accurate estimations of the statistical accuracy of dark siren cosmology. Furthermore, we consider the bias in cosmology due to unmodeled substructures in the merger rate and mass distribution. We find that a 1% deviation in the astrophysical model can lead to a more than 1% error in the Hubble constant. This could limit the accuracy of dark siren cosmology when more than 10⁴ BBH events are detected.
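A stripped-down version of the per-event Fisher information for H0 can be computed numerically. This toy assumes (our choice, not the paper's population model) that redshifts follow a Gamma-like distribution and that each event yields only a luminosity distance D = cz/H0:

```python
import numpy as np

def _trapz(y, x):
    # simple trapezoid rule (avoids NumPy 1.x/2.x naming differences)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def h0_fisher_per_event(H0, z_star=0.5, k=3, c=299792.458):
    """Per-event Fisher information for H0 in a toy dark-siren setup:
    redshifts follow p(z) ~ z**(k-1) * exp(-z / z_star) (our assumption),
    and each event yields a luminosity distance D = c * z / H0, so the
    observed D distribution carries the H0 dependence."""
    scale = c * z_star / H0
    D = np.linspace(1e-6 * scale, 40.0 * scale, 40001)

    def logp(h):
        s = c * z_star / h
        lp = (k - 1) * np.log(D) - D / s
        lp -= lp.max()
        return lp - np.log(_trapz(np.exp(lp), D))   # normalise over D

    eps = 1e-4 * H0
    score = (logp(H0 + eps) - logp(H0 - eps)) / (2.0 * eps)
    w = np.exp(logp(H0))
    return _trapz(score**2 * w, D)
```

For this scale family the analytic answer is k/H0², so the statistical floor with N events is a relative precision of 1/√(kN); for k = 3 and N = 10⁴ that is roughly 0.6%, which is why a ~1% systematic bias of the kind quoted above can dominate the error budget.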

     
  5.
    Introduction: Alzheimer's disease (AD) causes progressive, irreversible cognitive decline and is the leading cause of dementia. A timely diagnosis is therefore imperative to maximize neurological preservation. However, current diagnostic methods are either too costly or limited in availability. In this project, we explored using retinal vasculature as a potential biomarker for early AD diagnosis. This project focuses on stage 3 of a three-stage modular machine learning pipeline consisting of image quality selection, vessel map generation, and classification [1]. The previous model used only a support vector machine (SVM) to classify AD labels, which limited its accuracy to 82%. In this project, random forest and gradient boosting were added and, along with SVM, combined into an ensemble classifier, raising the classification accuracy to 89%.

    Materials and Methods: Subjects classified as AD were those diagnosed with dementia in “Dementia Outcome: Alzheimer’s disease” from the UK Biobank Electronic Health Records. Five control groups were chosen with a 5:1 ratio of control to AD patients, where the control patients had the same age, gender, and eye side image as the AD patient. In total, 122 vessel images from each group (AD and control) were used. The vessel maps were then segmented from fundus images through U-Net. A t-test feature selection was first done on the training folds, and the selected features were fed into the classifiers with a p-value threshold of 0.01. Next, 20 repetitions of 5-fold cross-validation were performed, with the hyperparameters tuned solely on the training data. An ensemble classifier consisting of SVM, gradient boosting, and random forest was built, and the final prediction was made through majority voting and evaluated on the test set.

    Results and Discussion: Through ensemble classification, accuracy increased by 4-12% relative to the individual classifiers, precision by 9-15%, sensitivity by 2-9%, specificity by 9-16%, and F1 score by 7-12%.

    Conclusions: Overall, a relatively high classification accuracy was achieved using machine learning ensemble classification with SVM, random forest, and gradient boosting. Although the results are very promising, a limitation of this study is that the requirement for images of sufficient quality reduced the number of control parameters that could be implemented. Nevertheless, through retinal vasculature analysis, this project shows machine learning's high potential as an efficient, more cost-effective alternative for diagnosing Alzheimer's disease.

    Clinical Application: Using machine learning for AD diagnosis through retinal images will make screening available to a broader population by being more accessible and cost-efficient. Mobile-device-based screening can also be enabled at the primary screening level in resource-deprived regions. It can provide a pathway for future understanding of the association between biomarkers in the eye and brain.
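The ensemble step maps naturally onto scikit-learn's VotingClassifier. The sketch below is our own illustration: it uses synthetic features in place of the U-Net vessel-map features and default hyperparameters rather than the study's tuned ones.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, VotingClassifier)
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for the vessel-map feature vectors (the real
# features, sample sizes, and hyperparameters are the study's, not ours).
X, y = make_classification(n_samples=400, n_features=20, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Majority ("hard") voting over the three classifiers named in the text.
ensemble = VotingClassifier(
    estimators=[("svm", SVC(random_state=0)),
                ("rf", RandomForestClassifier(random_state=0)),
                ("gb", GradientBoostingClassifier(random_state=0))],
    voting="hard")
ensemble.fit(X_tr, y_tr)
print(f"test accuracy: {ensemble.score(X_te, y_te):.2f}")
```

Hard voting takes each base classifier's predicted label and returns the majority class, which matches the majority-voting scheme described in the Methods; soft voting (averaging predicted probabilities) is a common alternative when the base models are well calibrated.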