Title: The Planetary Nebula Luminosity Function in the Era of Precision Cosmology
One of the great surprises of the late 1980s was the discovery that the [O III] λ5007 planetary nebula luminosity function (PNLF) could be used as a precision extragalactic standard candle. Despite the lack of any robust theory for the phenomenon, the technique passed a myriad of internal and external tests, and became an extremely reliable tool for obtaining distances to large galaxies within ∼20 Mpc. But in more recent years, the use of the technique has declined, due in part to the changing landscape of cosmology. Here we review the history of the PNLF, the experiments that confirmed its utility, and the reasons why interest in the method faded at the turn of the millennium. We also describe how and why the PNLF is making a comeback, and present some of the method's recent results. Finally, we discuss how the PNLF must be analyzed in the era of precision cosmology, and detail the issues that must be overcome in order to address the current tension between local measures of the Hubble constant and values derived from the microwave background. If these issues can be understood, then the PNLF can provide a useful cross-check on distance measurements out to ∼40 Mpc.
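
Although the abstract does not reproduce it, the empirical form behind the method is worth keeping in mind: the PNLF is usually modeled as a truncated exponential in absolute [O III] λ5007 magnitude, and the distance follows from matching the observed bright-end cutoff to its nearly invariant absolute magnitude M* ≈ −4.5. The following is a minimal illustrative sketch of that idea, not the authors' analysis code; the function names and the example cutoff magnitude are placeholders.

    # Minimal sketch of the empirical PNLF (Ciardullo et al. 1989 form) and the
    # cutoff-based distance estimate; illustrative only, not the authors' pipeline.
    import numpy as np

    M_STAR = -4.5   # approximate absolute magnitude of the [O III] 5007 cutoff

    def pnlf(M, M_star=M_STAR):
        """Relative number of planetary nebulae per unit absolute magnitude M:
        N(M) ~ exp(0.307 M) * (1 - exp(3 (M* - M))), zero brighter than the cutoff."""
        N = np.exp(0.307 * M) * (1.0 - np.exp(3.0 * (M_star - M)))
        return np.where(M >= M_star, N, 0.0)

    def distance_modulus(m_cutoff, M_star=M_STAR):
        """Distance modulus implied by the observed apparent magnitude of the cutoff."""
        return m_cutoff - M_star

    # Example: an observed cutoff at m* = 26.3 gives mu = 30.8,
    # i.e. a distance of 10**(mu/5 + 1) pc, roughly 14.5 Mpc.
    mu = distance_modulus(26.3)
    print(mu, 10 ** (mu / 5.0 + 1.0) / 1.0e6)

In practice the cutoff magnitude is obtained by fitting this truncated form to an extinction- and completeness-corrected sample of planetary nebula magnitudes, rather than by reading off the single brightest object.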
Award ID(s): 2206090
NSF-PAR ID: 10428498
Journal Name: Frontiers in Astronomy and Space Sciences
Volume: 9
ISSN: 2296-987X
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. ABSTRACT

    Feedback from active galactic nuclei and stellar processes changes the matter distribution on small scales, leading to significant systematic uncertainty in weak lensing constraints on cosmology. We investigate how the observable properties of group-scale haloes can constrain feedback's impact on the matter distribution using Cosmology and Astrophysics with MachinE Learning Simulations (CAMELS). Extending the results of previous work to smaller halo masses and higher wavenumber, k, we find that the baryon fraction in haloes contains significant information about the impact of feedback on the matter power spectrum. We explore how the thermal Sunyaev-Zel'dovich (tSZ) signal from group-scale haloes contains similar information. Using recent Dark Energy Survey weak lensing and Atacama Cosmology Telescope tSZ cross-correlation measurements and models trained on CAMELS, we obtain 10 per cent constraints on feedback effects on the power spectrum at $k \sim 5\, h\, {\rm Mpc}^{-1}$. We show that with future surveys, it will be possible to constrain baryonic effects on the power spectrum to better than 1 per cent at $k = 1\, h\, {\rm Mpc}^{-1}$ and to about 3 per cent at $k = 5\, h\, {\rm Mpc}^{-1}$ using the methods that we introduce here. Finally, we investigate the impact of feedback on the matter bispectrum, finding that tSZ observables are highly informative in this case.
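
    The headline numbers above refer to the fractional change that feedback induces in the matter power spectrum. As a point of reference only (the array names and toy spectra below are placeholders, not CAMELS data products), that quantity is simply the ratio of a hydrodynamical to a matching dark-matter-only power spectrum:

        # Illustrative only: fractional impact of baryonic feedback on the matter
        # power spectrum, Delta P / P = P_hydro(k)/P_dmo(k) - 1.
        import numpy as np

        def feedback_suppression(P_hydro, P_dmo):
            """Return P_hydro/P_dmo - 1 (negative where feedback suppresses power)."""
            return P_hydro / P_dmo - 1.0

        # Placeholder spectra on a k grid in h/Mpc; real inputs would be measured
        # from paired hydrodynamical and gravity-only simulations such as CAMELS.
        k = np.logspace(-1, 1, 50)
        P_dmo = k ** -1.5                                   # stand-in shape only
        P_hydro = P_dmo * (1.0 - 0.1 * np.tanh(k / 5.0))    # toy ~10 per cent suppression

        dP_over_P = feedback_suppression(P_hydro, P_dmo)
        print(np.interp(5.0, k, dP_over_P))                 # value near k = 5 h/Mpc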

     
  2. ABSTRACT

    Weak galaxy lensing surveys have consistently reported low values of the S8 parameter compared to the Planck lambda cold dark matter (ΛCDM) cosmology. Amon & Efstathiou used KiDS-1000 cosmic shear measurements to propose that this tension can be reconciled if the matter fluctuation spectrum is suppressed more strongly on non-linear scales than assumed in state-of-the-art hydrodynamical simulations. In this paper, we investigate cosmic shear data from the Dark Energy Survey (DES) Year 3. The non-linear suppression of the matter power spectrum required to resolve the S8 tension between DES and the Planck ΛCDM model is not as strong as inferred using KiDS data, but is still more extreme than predictions from recent numerical simulations. An alternative possibility is that non-standard dark matter contributes to the required suppression. We investigate the redshift and scale dependence of the suppression of the matter power spectrum. If our proposed explanation of the S8 tension is correct, the required suppression must extend into the mildly non-linear regime to wavenumbers $k\sim 0.2 \, h\, {\rm Mpc}^{-1}$. In addition, all measures of S8 using linear scales should agree with the Planck ΛCDM cosmology, an expectation that will be testable to high precision in the near future.
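
    For reference, the S8 parameter discussed above is conventionally defined as $S_8 \equiv \sigma_8\,(\Omega_{\rm m}/0.3)^{1/2}$, where $\sigma_8$ is the amplitude of matter fluctuations on $8\, h^{-1}$ Mpc scales and $\Omega_{\rm m}$ is the matter density parameter; this is the standard weak-lensing convention rather than a definition introduced in the paper summarized here.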

     
  3. ABSTRACT

    In 21-cm cosmology, precision calibration is key to separating the neutral hydrogen signal from very bright but spectrally smooth astrophysical foregrounds. The Hydrogen Epoch of Reionization Array (HERA), an interferometer specialized for 21-cm cosmology and now under construction in South Africa, was designed to be largely calibrated using the self-consistency of repeated measurements of the same interferometric modes. This technique, known as redundant-baseline calibration, resolves most of the internal degrees of freedom in the calibration problem. It relies, however, on antenna elements with identical primary beams placed precisely on a redundant grid. In this work, we review the detailed implementation of the algorithms enabling redundant-baseline calibration and report results with HERA data. We quantify the effects of real-world non-redundancy and how they compare to the idealized scenario in which redundant measurements differ only in their noise realizations. Finally, we study how non-redundancy can produce spurious temporal structure in our calibration solutions – both in data and in simulations – and present strategies for mitigating that structure.
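
    As an illustration of the underlying idea (not the HERA calibration pipeline itself), redundant-baseline calibration can be posed as a least-squares problem in which every measured visibility is modeled as the product of two antenna gains and a single "true" visibility shared by all baselines of the same length and orientation. The sketch below makes strong simplifying assumptions: a 1D antenna grid, amplitudes only (so the phase degeneracies of the full complex problem do not appear), and a single overall-amplitude constraint; all names are placeholders.

        # Toy redundant-baseline amplitude calibration (illustrative; not hera_cal).
        # Each measurement is V_ij = g_i * g_j * V_red(group) * (1 + noise), and we
        # solve the linearized system log V_ij = log g_i + log g_j + log V_red.
        import numpy as np

        rng = np.random.default_rng(0)
        n_ant = 10
        baselines = [(i, j) for i in range(n_ant) for j in range(i + 1, n_ant)]
        groups = sorted({j - i for i, j in baselines})   # redundant groups: same 1D separation

        true_gain = np.abs(1.0 + 0.1 * rng.standard_normal(n_ant))
        true_vis = np.abs(1.0 + rng.standard_normal(len(groups)))
        meas = np.array([true_gain[i] * true_gain[j] * true_vis[groups.index(j - i)]
                         * (1.0 + 0.01 * rng.standard_normal())
                         for i, j in baselines])

        # One row per baseline; unknowns are [log g_0 ... log g_{n-1}, log V_group_0 ...].
        A = np.zeros((len(baselines), n_ant + len(groups)))
        for row, (i, j) in enumerate(baselines):
            A[row, i] = 1.0
            A[row, j] = 1.0
            A[row, n_ant + groups.index(j - i)] = 1.0
        # Pin the mean log-gain to zero to remove the overall amplitude degeneracy.
        A = np.vstack([A, np.concatenate([np.ones(n_ant), np.zeros(len(groups))])])
        b = np.concatenate([np.log(meas), [0.0]])

        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        gains = np.exp(x[:n_ant])
        ratio = gains / true_gain
        print(np.max(np.abs(ratio / ratio.mean() - 1.0)))   # per-antenna residual (small)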
  4. Archaeomagnetic dating is a firmly established dating technique applicable to a wide variety of heat-treated anthropological materials and is advantageous for sites that lack materials suitable for radiocarbon dating. To correct recent misinterpretations of the method, we provide examples of how archaeomagnetic dating curves are calibrated and show how, in some instances, the technique can provide superior results. We emphasize that no single dating technique is capable of resolving the challenging chronology controversies in the Levant, and instead argue that multiple dating methods must be integrated in order to achieve the highest possible temporal resolution. 
  5. Abstract

    We present a computational framework that integrates forecasting, uncertainty quantification, and model predictive control (MPC) to benchmark the performance of deterministic and stochastic MPC. By means of a battery management case study, we illustrate how off-the-shelf deterministic MPC implementations can suffer significant performance losses and constraint violations due to their inability to handle disturbances that cannot be adequately represented by mean (most likely) forecasts. We also show that adding constraint back-off terms can help ameliorate these issues, but this approach is ad hoc and does not provide performance guarantees. Stochastic MPC provides a more systematic framework for handling these issues by directly capturing uncertainty descriptions of a wide range of disturbances.
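
    To make the back-off idea concrete, the sketch below shows a single deterministic MPC solve for a toy battery whose state of charge evolves as soc[t+1] = soc[t] + charge[t] - demand[t]; the controller plans against a mean demand forecast, and the back-off term simply tightens the state-of-charge bounds to leave margin for forecast error. The model, horizon, cost weights, and use of cvxpy are illustrative assumptions, not the framework described in the abstract.

        # Toy deterministic MPC with an optional (ad hoc) constraint back-off.
        import cvxpy as cp
        import numpy as np

        def mpc_step(soc0, demand_forecast, soc_min=0.1, soc_max=0.9,
                     charge_max=0.2, backoff=0.0):
            """Solve one horizon-length MPC problem and return the first charge move.

            `backoff` tightens the state-of-charge bounds to hedge against the
            (unmodeled) gap between the mean forecast and the realized demand.
            """
            T = len(demand_forecast)
            charge = cp.Variable(T)
            soc = cp.Variable(T + 1)

            constraints = [soc[0] == soc0,
                           cp.abs(charge) <= charge_max,
                           soc[1:] >= soc_min + backoff,
                           soc[1:] <= soc_max - backoff]
            for t in range(T):
                constraints.append(soc[t + 1] == soc[t] + charge[t] - demand_forecast[t])

            # Penalize charging effort and deviation from a mid-range state of charge.
            objective = cp.Minimize(cp.sum_squares(charge) + 0.1 * cp.sum_squares(soc - 0.5))
            cp.Problem(objective, constraints).solve()
            return float(charge.value[0])

        forecast = 0.05 * np.ones(12)                  # mean ("most likely") demand forecast
        print(mpc_step(0.5, forecast, backoff=0.0))    # nominal deterministic MPC
        print(mpc_step(0.5, forecast, backoff=0.05))   # tightened bounds: ad hoc robustness

    A stochastic MPC formulation would instead optimize over an ensemble of demand scenarios or chance constraints rather than a single tightened deterministic problem.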

     