Title: Citizen science for IceCube: Name that Neutrino
Abstract: Name that Neutrino is a citizen science project in which volunteers help classify events for the IceCube Neutrino Observatory, an immense particle detector at the geographic South Pole. From March 2023 to September 2023, volunteers classified videos produced from simulated data of both neutrino signal and background interactions. Name that Neutrino obtained more than 128,000 classifications from over 1,800 registered volunteers, which were compared to results obtained by a deep neural network machine-learning algorithm. Possible improvements for both Name that Neutrino and the deep neural network are discussed.
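The core comparison the abstract describes, volunteer classifications versus a deep neural network, can be sketched as a simple agreement check. This is a minimal illustration, not the paper's method: the event categories ("track", "cascade", "skimming") and the vote data are hypothetical placeholders.

```python
from collections import Counter

def volunteer_consensus(votes):
    """Return the plurality label and its vote fraction for one event."""
    counts = Counter(votes)
    label, n = counts.most_common(1)[0]
    return label, n / len(votes)

def agreement_rate(events, dnn_labels):
    """Fraction of events where the volunteer plurality matches the DNN label."""
    matches = sum(
        1 for votes, dnn in zip(events, dnn_labels)
        if volunteer_consensus(votes)[0] == dnn
    )
    return matches / len(events)

# Hypothetical data: three events, five volunteer votes each.
events = [
    ["track", "track", "cascade", "track", "track"],
    ["cascade", "cascade", "cascade", "skimming", "cascade"],
    ["skimming", "track", "skimming", "skimming", "cascade"],
]
dnn_labels = ["track", "cascade", "track"]
print(agreement_rate(events, dnn_labels))  # 2 of 3 events agree
```

A real analysis would weight volunteers by experience and compare full confidence distributions rather than single labels, but the plurality-vote baseline above is the usual starting point.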
Award ID(s):
2310051 2209445 2310050 1847827
PAR ID:
10515657
Author(s) / Creator(s):
Publisher / Repository:
Springer Science + Business Media
Date Published:
Journal Name:
The European Physical Journal Plus
Volume:
139
Issue:
6
ISSN:
2190-5444
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. ABSTRACT We present Galaxy Zoo DECaLS: detailed visual morphological classifications for Dark Energy Camera Legacy Survey images of galaxies within the SDSS DR8 footprint. Deeper DECaLS images (r = 23.6 versus r = 22.2 from SDSS) reveal spiral arms, weak bars, and tidal features not previously visible in SDSS imaging. To best exploit the greater depth of DECaLS images, volunteers select from a new set of answers designed to improve our sensitivity to mergers and bars. Galaxy Zoo volunteers provide 7.5 million individual classifications over 314 000 galaxies. 140 000 galaxies receive at least 30 classifications, sufficient to accurately measure detailed morphology like bars, and the remainder receive approximately 5. All classifications are used to train an ensemble of Bayesian convolutional neural networks (a state-of-the-art deep learning method) to predict posteriors for the detailed morphology of all 314 000 galaxies. We use active learning to focus our volunteer effort on the galaxies which, if labelled, would be most informative for training our ensemble. When measured against confident volunteer classifications, the trained networks are approximately 99 per cent accurate on every question. Morphology is a fundamental feature of every galaxy; our human and machine classifications are an accurate and detailed resource for understanding how galaxies evolve. 
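The active-learning step in the Galaxy Zoo DECaLS abstract (focusing volunteer effort on the galaxies that would be most informative to label) can be sketched with a simple uncertainty-based acquisition rule. This is an illustrative stand-in, not the paper's acquisition function: the galaxy IDs and class names are hypothetical, and the published work uses a richer Bayesian criterion than plain predictive entropy.

```python
import math

def entropy(probs):
    """Shannon entropy of a predicted class-probability vector."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_most_informative(posteriors, k):
    """Pick the k galaxies whose ensemble-mean posterior is most uncertain.

    posteriors: {galaxy_id: mean class-probability vector from the ensemble}.
    """
    ranked = sorted(posteriors, key=lambda g: entropy(posteriors[g]), reverse=True)
    return ranked[:k]

# Hypothetical posteriors over (smooth, featured, artifact):
posteriors = {
    "g1": [0.98, 0.01, 0.01],  # confident -> low labelling priority
    "g2": [0.40, 0.35, 0.25],  # uncertain -> send to volunteers next
    "g3": [0.70, 0.20, 0.10],
}
print(select_most_informative(posteriors, 1))  # ['g2']
```

A BALD-style acquisition would also measure disagreement between the individual networks in the ensemble, not just the entropy of their mean prediction.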
  2. Abstract The Gravity Spy project aims to uncover the origins of glitches, transient bursts of noise that hamper analysis of gravitational-wave data. By using both the work of citizen-science volunteers and machine learning algorithms, the Gravity Spy project enables reliable classification of glitches. Citizen science and machine learning are intrinsically coupled within the Gravity Spy framework, with machine learning classifications providing a rapid first-pass classification of the dataset and enabling tiered volunteer training, and volunteer-based classifications verifying the machine classifications, bolstering the machine learning training set and identifying new morphological classes of glitches. These classifications are now routinely used in studies characterizing the performance of the LIGO gravitational-wave detectors. Providing the volunteers with a training framework that teaches them to classify a wide range of glitches, as well as additional tools to aid their investigations of interesting glitches, empowers them to make discoveries of new classes of glitches. This demonstrates that, when given suitable support, volunteers can go beyond simple classification tasks to identify new features in data at a level comparable to domain experts. The Gravity Spy project is now providing volunteers with more complicated data that includes auxiliary monitors of the detector to identify the root cause of glitches. 
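The tiered coupling the Gravity Spy abstract describes (machine learning gives a first-pass label, and glitches are routed to volunteers of different experience levels depending on classifier confidence) can be sketched as a small routing function. The thresholds, tier names, and glitch class below are all illustrative assumptions, not values from the project.

```python
def route_glitch(ml_label, ml_confidence, tiers=(0.9, 0.5)):
    """Route a machine-classified glitch through a tiered volunteer workflow.

    High-confidence classifications go to newer volunteers for verification;
    mid-confidence ones go to experienced volunteers; very uncertain ones may
    represent a new morphological class and need expert attention.
    Thresholds are hypothetical.
    """
    high, low = tiers
    if ml_confidence >= high:
        return ("beginner-verification", ml_label)
    if ml_confidence >= low:
        return ("advanced-review", ml_label)
    return ("expert-or-new-class", None)

print(route_glitch("Blip", 0.97))    # ('beginner-verification', 'Blip')
print(route_glitch("Unknown", 0.30)) # ('expert-or-new-class', None)
```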
  3. ABSTRACT Camera trap studies have become a popular medium to assess many ecological phenomena including population dynamics, patterns of biodiversity, and monitoring of endangered species. In conjunction with the benefit to scientists, camera traps present an unprecedented opportunity to involve the public in scientific research via image classifications. However, this engagement strategy comes with a myriad of complications. Volunteers vary in their familiarity with wildlife; thus, the accuracy of user‐derived classifications may be biased by the commonness or popularity of species and user experience. From an extensive multi‐site camera trap study across Michigan, U.S.A., we compiled and classified images through a public science platform called Michigan ZoomIN. We aggregated responses from 15 independent users per image using multiple consensus methods to assess accuracy by comparing to species identification completed by wildlife experts. We also evaluated how different factors including consensus algorithms, study area, wildlife species, user support, and camera type influenced the accuracy of user‐derived classifications. Overall accuracy of user‐derived classification was 97%, although several canid (e.g., Canis lupus, Vulpes vulpes) and mustelid (e.g., Neovison vison) species were repeatedly difficult for users to identify and had lower accuracy. When validating user‐derived classification, we found that study area, consensus method, and user support best explained accuracy. To overcome hesitancy associated with data collected by untrained participants, we demonstrated their value by showing that the accuracy from volunteers was comparable to experts when classifying North American mammals. Our hierarchical workflow that integrated multiple consensus methods led to more image classifications without extensive training and even when the expertise of the volunteer was unknown. Ultimately, adopting such an approach can harness broader participation, expedite future camera trap data synthesis, and improve allocation of resources by scholars to enhance performance of public participants and increase accuracy of user‐derived data. © 2021 The Wildlife Society. 
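One common consensus method for aggregating the 15 independent responses per image, as described in the camera-trap abstract, is a plurality vote with an agreement threshold: images whose top answer is not agreed on strongly enough get flagged for expert review. This is a generic sketch, not the paper's algorithm; the 0.75 cutoff and species names are illustrative.

```python
from collections import Counter

def aggregate(votes, threshold=0.75):
    """Plurality consensus with an agreement threshold.

    Returns (label, fraction) when the top answer reaches the threshold,
    otherwise (None, fraction) to flag the image for expert review.
    The 0.75 cutoff is a hypothetical choice, not the paper's value.
    """
    counts = Counter(votes)
    label, n = counts.most_common(1)[0]
    frac = n / len(votes)
    return (label, frac) if frac >= threshold else (None, frac)

votes = ["coyote"] * 12 + ["gray wolf"] * 3   # 15 users per image
print(aggregate(votes))  # consensus: coyote at 0.8 agreement
```

Easily confused species pairs (such as the canids mentioned in the abstract) tend to fall below the threshold, which is exactly where expert validation adds the most value.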
  4. Abstract We present 3 yr of high-contrast imaging of the PDS 70 b and c accreting protoplanets with the new extreme AO system MagAO-X as part of the MaxProtoPlanetS survey of Hα protoplanets. In 2023 and 2024, our sharp (25–27 mas FWHM), well-AO-corrected (20%–26% Strehl), deep (2–3.6 hr) images detected compact (r ∼ 30 mas; r ∼ 3 au) circumplanetary disks (CPDs) surrounding both protoplanets. Starlight scattering off the front edge of these dusty CPDs is the likely source of the bright compact continuum light detected within ∼30 mas of both planets in our simultaneously obtained continuum 668 nm filter images. After subtraction of contaminating continuum and point-spread function residuals with pyKLIP angular differential imaging and spectral differential imaging, we obtained high-contrast ASDI Hα images of both planets in 2022, 2023, and 2024. We find the Hα line flux of planet b fell by (8.1 ± 1.6) × 10−16 erg s−1 cm−2, a factor of 4.6 drop in flux from 2022 to 2023. In 2024 March, planet b continued to be faint, with just a slight 1.6× rise to an Hα line flux of (3.64 ± 0.87) × 10−16 erg s−1 cm−2. For c, we measure a significant increase of (2.74 ± 0.51) × 10−16 erg s−1 cm−2 from 2023 to 2024, which is a factor of 2.3 increase. So both protoplanets have recently experienced significant Hα variability with ∼1 yr sampling. In 2024, planet c is brighter than b, as c is brightening and b is generally fading. We also tentatively detect one new point source "CC3" inside the inner disk (∼49 mas; at PA ∼ 295°; 2024) with orbital motion roughly consistent with a ∼5.6 au orbit. 
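The quoted flux changes for planet b are internally consistent, which is worth making explicit: if the flux fell by an amount delta and that fall was a factor-r drop, the starting flux is delta·r/(r−1). A quick arithmetic check, using only numbers from the abstract:

```python
# Consistency check on the quoted H-alpha line fluxes for PDS 70 b
# (all fluxes in erg s^-1 cm^-2).
# If the 2022->2023 flux fell by delta and that was a factor-r drop:
#   F_2022 = delta * r / (r - 1)   and   F_2023 = F_2022 / r
delta_b, r_b = 8.1e-16, 4.6
f_2022 = delta_b * r_b / (r_b - 1)  # ~1.0e-15
f_2023 = f_2022 / r_b               # ~2.3e-16
f_2024 = 1.6 * f_2023               # the quoted 1.6x rise, ~3.6e-16
print(f_2022, f_2023, f_2024)
# f_2024 lands at ~3.6e-16, matching the reported (3.64 +/- 0.87) x 10^-16
```

The same algebra applied to planet c's quoted increase of 2.74 × 10−16 at a factor of 2.3 implies a 2023 flux near 2.1 × 10−16 erg s−1 cm−2.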
  5. Abstract Although neurosensory systems might have evolved independently in ctenophores, very little is known about their organization and functions. Most ctenophores are pelagic and deep‐water species and cannot be bred in the laboratory. Thus, it is not surprising that neuroanatomical data are available for only one genus within the group, Pleurobrachia. Here, using immunohistochemistry and scanning electron microscopy, we describe the organization of two distinct neural subsystems (subepithelial and mesogleal) and the structure of different receptor types in the comb jelly Beroe abyssicola, a voracious predator from the North Pacific. A complex subepithelial neural network of Beroe, with five receptor types, covers the entire body surface and extends deep into the pharynx. Three types of mesogleal neurons are comparable to those of the cydippid Pleurobrachia. The predatory lifestyle of Beroe is supported by the extensive development of ciliated and muscular structures, including the presence of giant muscles and feeding macrocilia. The obtained cell‐type atlas illustrates different examples of lineage‐specific innovations within these enigmatic marine animals and reveals the remarkable complexity of sensory and effector systems in this clade of basal Metazoa. 