Abstract Ground-based high-resolution cross-correlation spectroscopy (HRCCS; R ≳ 15,000) is a powerful complement to space-based studies of exoplanet atmospheres. By resolving individual spectral lines, HRCCS can precisely measure chemical abundance ratios, directly constrain atmospheric dynamics, and robustly probe multidimensional physics. But the subtleties of HRCCS data sets—e.g., the lack of exoplanetary spectra visible by eye and the statistically complex process of telluric removal—can make interpreting them difficult. In this work, we seek to clarify the uncertainty budget of HRCCS with a forward-modeling approach. We present an HRCCS observation simulator, scope (https://github.com/arjunsavel/scope), that incorporates spectral contributions from the exoplanet, star, tellurics, and instrument. This tool allows us to control the underlying data set, enabling controlled experimentation with complex HRCCS methods. Simulating a fiducial hot Jupiter data set (WASP-77Ab emission with IGRINS), we first confirm via multiple tests that the commonly used principal component analysis does not bias the planetary signal when few components are used. Furthermore, we demonstrate that mildly varying tellurics and moderate wavelength solution errors induce only mild decreases in HRCCS detection significance. However, limiting-case, strongly varying tellurics can bias the retrieved velocities and gas abundances. Additionally, in the low signal-to-noise ratio limit, constraints on gas abundances become highly non-Gaussian. Our investigation of the uncertainties and potential biases inherent in HRCCS data analysis enables greater confidence in scientific results from this maturing method.
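The principal-component detrending tested in this abstract can be sketched in a few lines of numpy: stack the spectra into an exposures-by-pixels matrix, take an SVD, and subtract the first few components. This is an illustrative toy example only; the matrix shape, the airmass scaling of the synthetic tellurics, and the choice of two components are assumptions, not the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_exp, n_pix = 40, 300  # exposures x spectral pixels (toy dimensions)

# Quasi-static telluric/stellar structure, common to all exposures,
# scaled by a slowly varying airmass term (hypothetical toy data).
template = 1.0 - 0.3 * np.exp(-0.5 * ((np.arange(n_pix) - 150) / 5.0) ** 2)
airmass = np.linspace(1.0, 1.5, n_exp)
data = airmass[:, None] * template[None, :] \
    + 1e-3 * rng.standard_normal((n_exp, n_pix))

def remove_components(data, k):
    """Subtract the first k principal components (SVD-based detrending)."""
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    s_lowrank = s.copy()
    s_lowrank[k:] = 0.0          # keep only the k strongest components
    model = (u * s_lowrank) @ vt  # low-rank telluric/stellar model
    return data - model

residual = remove_components(data, k=2)
```

Because the toy tellurics are essentially rank one, two components absorb nearly all of the common-mode structure, leaving residuals at the noise level; with strongly time-varying tellurics the required rank grows, which is the regime the abstract flags as biasing.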
OrbitN: A Symplectic Integrator for Planetary Systems Dominated by a Central Mass—Insight into Long-term Solar System Chaos
Abstract Reliable studies of the long-term dynamics of planetary systems require numerical integrators that are accurate and fast. The challenge is often formidable because the chaotic nature of many systems requires relative numerical error bounds at or close to machine precision (∼10⁻¹⁶, double-precision arithmetic); otherwise, numerical chaos may dominate over physical chaos. Currently, the speed/accuracy demands are usually only met by symplectic integrators. For example, the most up-to-date long-term astronomical solutions for the solar system in the past (widely used in, e.g., astrochronology and high-precision geological dating) have been obtained using symplectic integrators. However, the source codes of these integrators are unavailable. Here I present the symplectic integrator orbitN (lean version 1.0) with the primary goal of generating accurate and reproducible long-term orbital solutions for near-Keplerian planetary systems (here the solar system) with a dominant mass M0. Among other features, orbitN-1.0 includes M0’s quadrupole moment, a lunar contribution, and post-Newtonian corrections (1PN) due to M0 (fast symplectic implementation). To reduce numerical round-off errors, Kahan compensated summation was implemented. I use orbitN to provide insight into the effect of various processes on the long-term chaos in the solar system. Notably, 1PN corrections have the opposite effect on chaoticity/stability on a 100 Myr versus Gyr timescale. For the current application, orbitN is about as fast as or faster (by a factor of 1.15–2.6) than comparable integrators, depending on hardware. The orbitN source code (C) is available at http://github.com/rezeebe/orbitN.
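Kahan compensated summation, mentioned above as orbitN's round-off mitigation, is a short, standard algorithm: a running compensation term recovers the low-order bits lost in each floating-point addition. A minimal Python sketch (orbitN itself implements this in C):

```python
def kahan_sum(values):
    """Compensated (Kahan) summation of an iterable of floats."""
    total = 0.0
    c = 0.0  # running compensation for lost low-order bits
    for x in values:
        y = x - c            # apply the correction from the previous step
        t = total + y        # low-order bits of y may be lost here...
        c = (t - total) - y  # ...but are recovered algebraically into c
        total = t
    return total
```

Unlike naive accumulation, whose error grows with the number of terms, the compensated sum keeps the relative error near machine epsilon regardless of sequence length, which is why it matters over Gyr-scale integrations.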
- Award ID(s):
- 2001022
- PAR ID:
- 10420289
- Publisher / Repository:
- DOI PREFIX: 10.3847
- Date Published:
- Journal Name:
- The Astronomical Journal
- Volume:
- 166
- Issue:
- 1
- ISSN:
- 0004-6256
- Format(s):
- Medium: X Size: Article No. 1
- Size(s):
- Article No. 1
- Sponsoring Org:
- National Science Foundation
More Like this
-
-
Abstract High-fidelity simulators that connect theoretical models with observations are indispensable tools in many sciences. If the likelihood is known, inference can proceed using standard techniques. However, when the likelihood is intractable or unknown, a simulator makes it possible to infer the parameters of a theoretical model directly from real and simulated observations when coupled with machine learning. We introduce an extension of the recently proposed likelihood-free frequentist inference (LF2I) approach that makes it possible to construct confidence sets with the p-value function and to use the same function to check the coverage explicitly at any given parameter point. Like LF2I, this extension yields provably valid confidence sets in parameter inference problems for which a high-fidelity simulator is available. The utility of our algorithm is illustrated by applying it to three pedagogically interesting examples: the first is from cosmology, the second from high-energy physics and astronomy, both with tractable likelihoods, while the third, with an intractable likelihood, is from epidemiology. Code to reproduce all of our results is available at https://github.com/AliAlkadhim/ALFFI.
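The idea of building a confidence set from a p-value function can be illustrated on a toy problem with a tractable likelihood (a Gaussian mean with unit variance): simulate under each candidate parameter, compute a Monte Carlo p-value, and keep every parameter the test does not reject. The grid, test statistic, and sample size below are hypothetical stand-ins for the simulator-based machinery the paper describes, not its algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def p_value(theta, x_obs, n_sim=4000):
    """Monte Carlo p-value for H0: mu = theta, statistic |x - theta|."""
    sims = theta + rng.standard_normal(n_sim)  # draws from the "simulator"
    return np.mean(np.abs(sims - theta) >= abs(x_obs - theta))

def confidence_set(x_obs, grid, alpha=0.05):
    """Neyman inversion: keep every theta whose p-value exceeds alpha."""
    return np.array([t for t in grid if p_value(t, x_obs) > alpha])

grid = np.linspace(-4, 4, 161)
cs = confidence_set(x_obs=0.7, grid=grid)
```

For this Gaussian toy the inverted set approximates the familiar x ± 1.96 interval; the paper's contribution is making the same p-value function usable when the likelihood inside `p_value` is replaced by an intractable simulator plus a learned test statistic.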
-
Abstract In this work, we present classification results on early supernova light curves from SCONE, a photometric classifier that uses convolutional neural networks to categorize supernovae (SNe) by type using light-curve data. SCONE is able to identify SN types from light curves at any stage, from the night of initial alert to the end of their lifetimes. Simulated LSST SN light curves were truncated at 0, 5, 15, 25, and 50 days after the trigger date and used to train Gaussian processes in wavelength and time space to produce wavelength–time heatmaps. SCONE uses these heatmaps to perform six-way classification between SN types Ia, II, Ibc, Ia-91bg, Iax, and SLSN-I. SCONE is able to perform classification with or without redshift, but we show that incorporating redshift information improves performance at each epoch. SCONE achieved 75% overall accuracy at the date of trigger (60% without redshift), and 89% accuracy 50 days after trigger (82% without redshift). SCONE was also tested on bright subsets of SNe (r < 20 mag) and produced 91% accuracy at the date of trigger (83% without redshift) and 95% five days after trigger (94.7% without redshift). SCONE is the first application of convolutional neural networks to the early-time photometric transient classification problem. All of the data processing and model code developed for this paper can be found in the SCONE software package located at github.com/helenqu/scone (Qu 2021).
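The Gaussian-process step, fitting irregularly sampled light curves and evaluating them on a regular time grid to build a band-by-time heatmap, can be sketched with a hand-rolled RBF-kernel GP. The synthetic three-band data, kernel settings, and grid sizes below are assumptions for illustration, not SCONE's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(2)

def gp_predict(t_obs, y_obs, t_grid, length=10.0, amp=1.0, noise=0.1):
    """GP regression with an RBF kernel; returns the posterior mean on t_grid."""
    def rbf(a, b):
        return amp**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)
    K = rbf(t_obs, t_obs) + noise**2 * np.eye(len(t_obs))
    alpha = np.linalg.solve(K, y_obs)       # K^{-1} y
    return rbf(t_grid, t_obs) @ alpha       # posterior mean on the grid

# Toy multi-band light curve: flux observed at irregular times in 3 bands.
t_grid = np.linspace(0, 50, 100)
bands = []
for shift in (10.0, 15.0, 20.0):  # hypothetical per-band peak times
    t_obs = np.sort(rng.uniform(0, 50, 25))
    y_obs = np.exp(-0.5 * ((t_obs - shift) / 8.0)**2) \
        + 0.05 * rng.standard_normal(25)
    bands.append(gp_predict(t_obs, y_obs, t_grid))

heatmap = np.stack(bands)  # shape (n_bands, n_times): the band-time image
```

Stacking the per-band posterior means turns ragged, gappy photometry into a dense regular image, which is what makes a convolutional network applicable downstream.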
-
Abstract Precise and accurate predictions of the halo mass function for cluster mass scales in wνCDM cosmologies are crucial for extracting robust and unbiased cosmological information from upcoming galaxy cluster surveys. Here, we present a halo mass function emulator for cluster mass scales (≳ 10¹³ M⊙/h) up to redshift z = 2 with comprehensive support for the parameter space of wνCDM cosmologies allowed by current data. Based on the Aemulusν suite of simulations, the emulator marks a significant improvement in the precision of halo mass function predictions by incorporating both massive neutrinos and non-standard dark energy equation-of-state models. This allows for accurate modeling of the cosmology dependence in large-scale structure and galaxy cluster studies. We show that the emulator, designed using Gaussian Process Regression, has negligible theoretical uncertainties compared to dominant sources of error in future cluster abundance studies. Our emulator is publicly available (https://github.com/DelonShen/aemulusnu_hmf), providing the community with a crucial tool for upcoming cosmological surveys such as LSST and Euclid.
-
Abstract We present a scalable, cloud-based science platform solution designed to enable next-to-the-data analyses of terabyte-scale astronomical tabular data sets. The presented platform is built on Amazon Web Services (over Kubernetes and S3 abstraction layers), utilizes Apache Spark and the Astronomy eXtensions for Spark for parallel data analysis and manipulation, and provides the familiar JupyterHub web-accessible front end for user access. We outline the architecture of the analysis platform, provide implementation details and rationale for (and against) technology choices, verify scalability through strong and weak scaling tests, and demonstrate usability through an example science analysis of data from the Zwicky Transient Facility’s billion-plus light-curve catalog. Furthermore, we show how this system enables an end user to iteratively build analyses (in Python) that transparently scale processing with no need for end-user interaction. The system is designed to be deployable by astronomers with moderate cloud engineering knowledge, or (ideally) IT groups. Over the past 3 yr, it has been utilized to build science platforms for the DiRAC Institute, the ZTF partnership, the LSST Solar System Science Collaboration, and the LSST Interdisciplinary Network for Collaboration and Computing, as well as for numerous short-term events (with over 100 simultaneous users). A live demo instance, the deployment scripts, source code, and cost calculators are accessible at http://hub.astronomycommons.org/.