


Award ID contains: 1715122


  1. Abstract The accurate estimation of photometric redshifts is crucial to many upcoming galaxy surveys, for example, the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST). Almost all Rubin extragalactic and cosmological science requires accurate and precise calculation of photometric redshifts; many diverse approaches to this problem are currently being developed, validated, and tested. In this work, we use the photometric redshift code GPz to examine two realistically complex training-set imperfection scenarios for machine-learning-based photometric redshift calculation: (i) where the spectroscopic training set has a very different distribution in color–magnitude space from the test set, and (ii) where emission-line confusion causes a fraction of the training spectroscopic sample to not have the true redshift. By evaluating the sensitivity of GPz to a range of increasingly severe imperfections, with a range of metrics (both of photo-z point estimates and posterior probability distribution functions, PDFs), we quantify the degree to which predictions worsen with higher degrees of degradation. In particular, we find a substantial drop-off in photo-z quality when line confusion exceeds ∼1%, and when the training sample is incomplete below a redshift of 1.5, for an experimental setup using data from the Buzzard Flock synthetic sky catalogs.
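The photo-z point-estimate metrics mentioned above are commonly computed from the normalized residual Δz = (z_phot − z_spec)/(1 + z_spec). A minimal NumPy sketch follows, using the robust-scatter and outlier definitions that are conventional in the LSST photo-z literature; the function name and the 0.15 outlier threshold are illustrative, and the paper's exact metric definitions may differ:

```python
import numpy as np

def photoz_point_metrics(z_spec, z_phot, outlier_cut=0.15):
    """Conventional photo-z point-estimate metrics: median bias,
    robust scatter (1.4826 * MAD), and catastrophic-outlier fraction.
    `outlier_cut` is illustrative, not taken from the paper."""
    z_spec = np.asarray(z_spec, dtype=float)
    z_phot = np.asarray(z_phot, dtype=float)
    dz = (z_phot - z_spec) / (1.0 + z_spec)   # normalized residual
    bias = np.median(dz)
    sigma_mad = 1.4826 * np.median(np.abs(dz - bias))
    outlier_frac = np.mean(np.abs(dz) > outlier_cut)
    return bias, sigma_mad, outlier_frac

# Toy usage: photo-z estimates scattered around the true redshifts.
rng = np.random.default_rng(0)
z_spec = rng.uniform(0.1, 1.5, 5000)
z_phot = z_spec + 0.02 * (1.0 + z_spec) * rng.standard_normal(5000)
bias, sigma, f_out = photoz_point_metrics(z_spec, z_phot)
```

Evaluating such metrics on progressively degraded training sets is the kind of sensitivity analysis the abstract describes.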
  2. Abstract Vera C. Rubin Observatory is a ground-based astronomical facility under construction, a joint project of the National Science Foundation and the U.S. Department of Energy, designed to conduct a multipurpose 10 yr optical survey of the Southern Hemisphere sky: the Legacy Survey of Space and Time. Significant flexibility in survey strategy remains within the constraints imposed by the core science goals of probing dark energy and dark matter, cataloging the solar system, exploring the transient optical sky, and mapping the Milky Way. The survey’s massive data throughput will be transformational for many other astrophysics domains and Rubin’s data access policy sets the stage for a huge community of potential users. To ensure that the survey science potential is maximized while serving as broad a community as possible, Rubin Observatory has involved the scientific community at large in the process of setting and refining the details of the observing strategy. The motivation, history, and decision-making process of this strategy optimization are detailed in this paper, giving context to the science-driven proposals and recommendations for the survey strategy included in this Focus Issue. 
  3. Abstract Trans-Neptunian objects provide a window into the history of the solar system, but they can be challenging to observe due to their distance from the Sun and relatively low brightness. Here we report the detection of 75 moving objects that we could not link to any other known objects, the faintest of which has a VR magnitude of 25.02 ± 0.93 using the Kernel-Based Moving Object Detection (KBMOD) platform. We recover an additional 24 sources with previously known orbits. We place constraints on the barycentric distance, inclination, and longitude of ascending node of these objects. The unidentified objects have a median barycentric distance of 41.28 au, placing them in the outer solar system. The observed inclination and magnitude distribution of all detected objects is consistent with previously published KBO distributions. We describe extensions to KBMOD, including a robust percentile-based lightcurve filter, an in-line graphics-processing unit filter, new coadded stamp generation, and a convolutional neural network stamp filter, which allow KBMOD to take advantage of difference images. These enhancements mark a significant improvement in the readiness of KBMOD for deployment on future big data surveys such as LSST. 
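The "robust percentile-based lightcurve filter" mentioned above can be illustrated with a simple sketch: reject single-epoch fluxes that deviate from the median by many times the inter-percentile spread, which suppresses candidates driven by one bright artifact such as a cosmic ray. The function name, percentile choices, and threshold below are illustrative; KBMOD's actual implementation differs in detail:

```python
import numpy as np

def percentile_lightcurve_filter(fluxes, low=10.0, high=90.0, factor=5.0):
    """Return a boolean mask of epochs to keep: an epoch is rejected
    when its flux deviates from the lightcurve median by more than
    `factor` times the (high - low) inter-percentile spread.
    Parameters are illustrative, not KBMOD's actual defaults."""
    fluxes = np.asarray(fluxes, dtype=float)
    median = np.median(fluxes)
    p_lo, p_hi = np.percentile(fluxes, [low, high])
    spread = max(p_hi - p_lo, 1e-12)  # guard against constant lightcurves
    return np.abs(fluxes - median) <= factor * spread

# Toy usage: a steady ~10-count source with one cosmic-ray-like spike.
rng = np.random.default_rng(1)
fluxes = np.concatenate([10.0 + 0.5 * rng.standard_normal(20), [200.0]])
keep = percentile_lightcurve_filter(fluxes)
```

Percentiles and the median are used rather than the mean and standard deviation so that a single extreme epoch cannot inflate the rejection threshold enough to hide itself.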
  4. Deep learning (DL) models have achieved paradigm-changing performance in many fields with high-dimensional data, such as images, audio, and text. However, the black-box nature of deep neural networks is not only a barrier to adoption in applications such as medical diagnosis, where interpretability is essential; it also impedes diagnosis of underperforming models. The task of diagnosing or explaining DL models requires the computation of additional artifacts, such as activation values and gradients. These artifacts are large in volume, and their computation, storage, and querying raise significant data management challenges. In this paper, we develop a novel data sampling technique that produces approximate but accurate results for these model debugging queries. Our sampling technique utilizes the lower-dimensional representation learned by the DL model and focuses on model decision boundaries for the data in this lower-dimensional space.
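The sampling idea above can be sketched as follows: given each input's low-dimensional learned representation and its predicted class probability, draw the debugging sample preferentially from points near the decision boundary (probability ≈ 0.5). The function name and weighting scheme are illustrative, not the paper's actual algorithm:

```python
import numpy as np

def boundary_weighted_sample(embeddings, probs, k, seed=None):
    """Sample k row indices, weighting each point by its closeness to
    the decision boundary (predicted probability near 0.5) in the
    model's learned low-dimensional space. Illustrative sketch."""
    rng = np.random.default_rng(seed)
    probs = np.asarray(probs, dtype=float)
    weights = 1.0 - 2.0 * np.abs(probs - 0.5)  # 1 at the boundary, 0 far away
    weights = np.clip(weights, 1e-9, None)     # keep every point samplable
    weights /= weights.sum()
    return rng.choice(len(embeddings), size=k, replace=False, p=weights)

# Toy usage: 5 borderline points among 95 confidently classified ones.
embeddings = np.zeros((100, 2))  # stand-in for learned low-dim features
probs = np.concatenate([np.full(5, 0.5), np.full(95, 1.0)])
idx = boundary_weighted_sample(embeddings, probs, k=3, seed=0)
```

Concentrating the sample where the model is least certain keeps the stored artifacts small while preserving the regions most informative for debugging queries.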
  5. The goal of the SunPy project is to facilitate and promote the use and development of community-led, free, and open source data analysis software for solar physics based on the scientific Python environment. The project achieves this goal by developing and maintaining the sunpy core package and supporting an ecosystem of affiliated packages. This paper describes the first official stable release (version 1.0) of the core package, as well as the project organization and infrastructure. This paper concludes with a discussion of the future of the SunPy project. 