-
We document aggregations of an undescribed benthic solitary tunicate of the family Pyuridae from the Arabian Sea. This new genus was found forming dense thickets on shallow rocky substrates around Masirah Island and the Dhofar area in Oman. Such aggregations of tunicates have not previously been reported from coral reefs in the Indo-West Pacific region or the Atlantic. This observation contributes to our understanding of the ecology and biogeography of ascidians, setting the stage for a comprehensive species description and in-depth analysis of this species.
-
Abstract The mutualism between clownfishes (or anemonefishes) and their giant host sea anemones is among the most immediately recognizable animal interactions on the planet and has attracted a great deal of popular and scientific attention [1-5]. However, our evolutionary understanding of this iconic symbiosis comes almost entirely from studies of clownfishes, a charismatic group of 28 described species in the genus Amphiprion [2]. Adaptation to venomous sea anemones (Anthozoa: Actiniaria) provided clownfishes with novel habitat space, ultimately triggering the adaptive radiation of the group [2]. Clownfishes diverged from their free-living ancestors 25-30 MYA, with their adaptive radiation to sea anemones dating to 13.2 MYA [2, 3]. Far from providing mere habitat space, the host sea anemones also receive substantial benefits from hosting clownfishes, making the mutualistic and co-dependent nature of the symbiosis well established [4, 5]. Yet the evolutionary consequences of mutualism with clownfishes have remained a mystery from the host perspective. Here we use bait-capture sequencing to fully resolve the evolutionary relationships among the 10 nominal species of clownfish-hosting sea anemones for the first time (Figure 1). Using time-calibrated divergence dating analyses, we calculate divergence times of less than 25 MYA for each host species, with 9 of 10 host species having divergence times within the last 13 MYA (Figure 1). The clownfish-hosting sea anemones thus diversified coincident with the clownfishes, potentially facilitating the clownfish adaptive radiation and providing the first strong evidence for co-evolutionary patterns in this iconic partnership.
-
Abstract Environmental DNA (eDNA) data make it possible to measure and monitor biodiversity at unprecedented resolution and scale. As use-cases multiply and scientific consensus grows regarding the value of eDNA analysis, public agencies have an opportunity to decide how and where eDNA data fit into their mandates. Within the United States, many federal and state agencies are individually using eDNA data in various applications and developing relevant scientific expertise. A national strategy for eDNA implementation would capitalize on recent scientific developments, providing a common set of next-generation tools for natural resource management and public health protection. Such a strategy would avoid patchwork and possibly inconsistent guidelines in different agencies, smoothing the way for efficient uptake of eDNA data in management. Because eDNA analysis is already in widespread use in both ocean and freshwater settings, we focus here on applications in these environments. However, we foresee the broad adoption of eDNA analysis to meet many resource management issues across the nation because the same tools have immediate terrestrial and aerial applications.
-
Internet of Samples (iSamples): Toward an interdisciplinary cyberinfrastructure for material samples. Abstract Sampling the natural world and built environment underpins much of science, yet systems for managing material samples and associated (meta)data are fragmented across institutional catalogs, practices for identification, and discipline-specific (meta)data standards. The Internet of Samples (iSamples) is a standards-based collaboration to uniquely, consistently, and conveniently identify material samples, record core metadata about them, and link them to other samples, data, and research products. iSamples extends existing resources and best practices in data stewardship to render a cross-domain cyberinfrastructure that enables transdisciplinary research, discovery, and reuse of material samples in 21st century natural science.
-
Coral reefs are declining worldwide primarily because of bleaching and subsequent mortality resulting from thermal stress. Currently, extensive efforts to engage in more holistic research and restoration endeavors have considerably expanded the techniques applied to examine coral samples. Despite such advances, coral bleaching and restoration studies are often conducted within a specific disciplinary focus, where specimens are collected, preserved, and archived in ways that are not always conducive to further downstream analyses by specialists in other disciplines. This approach may prevent the full utilization of unexpended specimens, leading to siloed research, duplicative efforts, unnecessary loss of additional corals to research endeavors, and overall increased costs. A recent US National Science Foundation-sponsored workshop set out to consolidate our collective knowledge across the disciplines of Omics, Physiology, and Microscopy and Imaging regarding the methods used for coral sample collection, preservation, and archiving. Here, we highlight knowledge gaps and propose some simple steps for collecting, preserving, and archiving coral-bleaching specimens that can increase the impact of individual coral bleaching and restoration studies, as well as foster additional analyses and future discoveries through collaboration. Rapid freezing of samples in liquid nitrogen or storage at −80 °C to −20 °C is optimal for most Omics and Physiology studies, with a few exceptions; however, freezing samples removes the potential for many Microscopy and Imaging-based analyses because tissue integrity is altered during freezing. For Microscopy and Imaging, samples are best stored in aldehydes. The use of sterile gloves and receptacles during collection supports the downstream analysis of host-associated bacterial and viral communities, which are particularly germane to disease and restoration efforts. Across all disciplines, the use of aseptic techniques during collection, preservation, and archiving maximizes the research potential of coral specimens and allows for the greatest number of possible downstream analyses.
-
Abstract Genetic data represent a relatively new frontier for our understanding of global biodiversity. Ideally, such data should include both organismal DNA-based genotypes and the ecological context where the organisms were sampled. Yet most tools and standards for data deposition focus exclusively either on genetic or ecological attributes. The Genomic Observatories Metadatabase (GEOME: geome-db.org) provides an intuitive solution for maintaining links between genetic data sets stored by the International Nucleotide Sequence Database Collaboration (INSDC) and their associated ecological metadata. GEOME facilitates the deposition of raw genetic data to the INSDC's Sequence Read Archive (SRA) while maintaining persistent links to standards-compliant ecological metadata held in the GEOME database. This approach facilitates findable, accessible, interoperable, and reusable data archival practices. Moreover, GEOME enables data management solutions for large collaborative groups and expedites batch retrieval of genetic data from the SRA. The article that follows describes how GEOME can enable genuinely open data workflows for researchers in the field of molecular ecology.
-
Abstract The semiconductor tracker (SCT) is one of the tracking systems for charged particles in the ATLAS detector. It consists of 4088 silicon strip sensor modules. During Run 2 (2015–2018), the Large Hadron Collider delivered an integrated luminosity of 156 fb⁻¹ to the ATLAS experiment at a centre-of-mass proton-proton collision energy of 13 TeV. The instantaneous luminosity and pile-up conditions were far in excess of those assumed in the original design of the SCT detector. Owing to improvements to the data acquisition system, the SCT operated stably throughout Run 2. It was available for 99.9% of the integrated luminosity and achieved a data-quality efficiency of 99.85%. Detailed studies have been made of the leakage current in SCT modules and the evolution of the full depletion voltage, which are used to study the impact of radiation damage on the modules.