

Title: Annotation for transparent inquiry: Transparent data and analysis for qualitative research
How can authors using many individual pieces of qualitative data throughout a publication make their research transparent? In this paper we introduce Annotation for Transparent Inquiry (ATI), an approach to enhance transparency in qualitative research. ATI allows authors to connect specific passages in their publication with an annotation. These annotations provide additional information relevant to the passage and, when possible, include a link to one or more data sources underlying a claim; data sources are housed in a repository. After describing ATI’s conceptual and technological implementation, we report on its evaluation through a series of workshops conducted by the Qualitative Data Repository (QDR) and present initial results of the evaluation. The article ends with an outlook on next steps for the project.
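The ATI workflow described above can be pictured as a small data structure: each annotation anchors to a passage in the publication and, where possible, points to a data source in a repository. The following is a minimal, hypothetical sketch; the field names and the example repository URL are illustrative, not QDR's actual schema:

```python
# Minimal sketch of an ATI-style annotation linking a passage to its
# underlying data source. All field names and values are illustrative.

def make_annotation(passage, note, data_source_url=None):
    """Bundle a quoted passage with an analytic note and, when
    available, a link to the underlying data in a repository."""
    annotation = {
        "target": {"exact": passage},   # the passage being annotated
        "body": {"note": note},         # additional context for readers
    }
    if data_source_url is not None:
        # Data sources are housed in a repository (e.g., QDR).
        annotation["body"]["source"] = data_source_url
    return annotation

ann = make_annotation(
    passage="Officials viewed the reform as inevitable.",
    note="Based on an interview conducted in March 2015; see transcript.",
    data_source_url="https://example.org/repository/transcript-17",
)
print(ann["body"]["source"])
```

The optional `data_source_url` mirrors the abstract's "when possible" qualifier: an annotation can carry context alone, or context plus a link to underlying data.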
Award ID(s): 1823950
PAR ID: 10140037
Author(s) / Creator(s): ;
Date Published:
Journal Name: IASSIST Quarterly
Volume: 43
Issue: 2
ISSN: 0739-1137
Page Range / eLocation ID: 1 to 9
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. The Transparent Research Object Vocabulary (TROV) is a key element of the Transparency Certified (TRACE) approach to ensuring research trustworthiness. In contrast with methods that entail repeating computations in part or in full to verify that the descriptions of methods included in a publication are sufficient to reproduce reported results, the TRACE approach depends on a controlled computing environment termed a Transparent Research System (TRS) to guarantee that accurate, sufficiently complete, and otherwise trustworthy records are captured when results are obtained in the first place. Records identifying (1) the digital artifacts and computations that yielded a research result, (2) the TRS that witnessed the artifacts and supervised the computations, and (3) the specific conditions enforced by the TRS that warrant trust in these records, together constitute a Transparent Research Object (TRO). Digital signatures provided by the TRS and by a trusted third-party timestamp authority (TSA) guarantee the integrity and authenticity of the TRO. The controlled vocabulary TROV provides means to declare and query the properties of a TRO, to enumerate the dimensions of trustworthiness the TRS asserts for a TRO, and to verify that each such assertion is warranted by the documented capabilities of the TRS. Our approach for describing, publishing, and working with TROs imposes no restrictions on how computational artifacts are packaged or otherwise shared, and aims to be interoperable with, rather than to replace, current and future Research Object standards, archival formats, and repository layouts. 
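The integrity and authenticity guarantees described above can be illustrated with a toy example: a record of artifacts and computations is hashed and "signed," and any later tampering is detected on verification. This is only a sketch of the idea under stated assumptions; the HMAC scheme and hard-coded key stand in for the asymmetric digital signatures and third-party timestamps a real TRS and TSA would provide.

```python
import hashlib
import hmac
import json

# Toy stand-in for a TRS signing key; a real TRS would use asymmetric
# signatures plus a trusted timestamp authority (TSA).
TRS_KEY = b"demo-key-not-a-real-secret"

def sign_tro(record):
    """Hash a TRO-like record and attach an HMAC 'signature'."""
    payload = json.dumps(record, sort_keys=True).encode()
    digest = hashlib.sha256(payload).hexdigest()
    signature = hmac.new(TRS_KEY, payload, hashlib.sha256).hexdigest()
    return {"record": record, "sha256": digest, "signature": signature}

def verify_tro(tro):
    """Recompute the signature; a mismatch means the record was altered."""
    payload = json.dumps(tro["record"], sort_keys=True).encode()
    expected = hmac.new(TRS_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tro["signature"])

tro = sign_tro({
    "artifacts": ["input.csv", "analysis.py", "results.txt"],
    "computation": "python analysis.py input.csv",
})
assert verify_tro(tro)       # untampered record verifies
tro["record"]["artifacts"].append("extra.bin")
assert not verify_tro(tro)   # tampering is detected
```

The point of the sketch is the trust model, not the cryptography: records are made trustworthy when results are first obtained, so no computation needs to be repeated to check them later.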
  2. The National Science Foundation’s Arctic Data Center is the primary data repository for NSF-funded research conducted in the Arctic. There are major challenges in discovering and interpreting resources in a repository containing data as heterogeneous and interdisciplinary as those in the Arctic Data Center. This paper reports on advances in cyberinfrastructure at the Arctic Data Center that help address these issues by leveraging semantic technologies that enhance the repository’s adherence to the FAIR data principles and improve the Findability, Accessibility, Interoperability, and Reusability of digital resources in the repository. We describe the Arctic Data Center’s improvements: semantic annotation is used to bind metadata about Arctic data sets to concepts in web-accessible ontologies. The Arctic Data Center’s implementation of a semantic annotation mechanism is accompanied by the development of an extended search interface that increases the findability of data by allowing users to search for specific, broader, and narrower meanings of measurement descriptions, as well as through their potential synonyms. Based on research carried out by the DataONE project, we evaluated the potential impact of this approach with regard to the accessibility, interoperability, and reusability of measurement data. Arctic research often benefits from having additional data, typically from multiple, heterogeneous sources, that complement and extend the bases – spatially, temporally, or thematically – for understanding Arctic phenomena. These relevant data resources must be 'found' and 'harmonized' prior to integration and analysis. The findings of a case study indicated that the semantic annotation of measurement data enhances the capabilities of researchers to accomplish these tasks.
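The extended search behavior described here, matching a query term's broader, narrower, and synonymous meanings, can be sketched with a toy ontology. The terms and relations below are invented for illustration; the Arctic Data Center's actual ontologies and search implementation differ.

```python
# Toy ontology of measurement concepts. Each concept lists broader and
# narrower concepts plus synonyms; all terms here are illustrative.
ONTOLOGY = {
    "snow depth": {
        "broader": ["snow measurement"],
        "narrower": ["snow depth at station"],
        "synonyms": ["depth of snowpack"],
    },
    "snow measurement": {
        "broader": [],
        "narrower": ["snow depth", "snow water equivalent"],
        "synonyms": [],
    },
}

def expand_query(term):
    """Expand a search term to its broader and narrower meanings and
    its synonyms, so matching datasets become findable under any of them."""
    entry = ONTOLOGY.get(term)
    if entry is None:
        return {term}  # unknown terms fall back to plain keyword search
    expanded = {term}
    for relation in ("broader", "narrower", "synonyms"):
        expanded.update(entry[relation])
    return expanded

print(sorted(expand_query("snow depth")))
```

A dataset annotated only with "depth of snowpack" would still surface for a "snow depth" query, which is the findability gain the abstract describes.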
  3. Accessibility of research data to disabled users has received scant attention in literature and practice. In this paper we briefly survey the current state of accessibility for research data and suggest some first steps that repositories should take to make their holdings more accessible. We then describe in depth how those steps were implemented at the Qualitative Data Repository (QDR), a domain repository for qualitative social-science data. The paper discusses accessibility testing and improvements on the repository and its underlying software, changes to the curation process to improve accessibility, as well as efforts to retroactively improve the accessibility of existing collections. We conclude by describing key lessons learned during this process as well as next steps. 
  4. The discipline of political science has been engaged in vibrant debate about research transparency for more than three decades. Over the last ten years, scholars who generate, collect, interpret, and analyze qualitative data have become increasingly involved in these discussions. The debate has played out across conference panels, coordinated efforts such as the Qualitative Transparency Deliberations (Büthe et al. 2021), articles in a range of journals, and symposia in outlets such as PS: Political Science and Politics, Security Studies, the newsletter of the Comparative Politics section of the American Political Science Association (APSA), and, indeed, QMMR. Until recently, much of the dialogue has been conducted in the abstract. Scholars have thoroughly considered the questions of whether political scientists who generate and employ qualitative data and methods can and should seek to make their work more transparent, what information they should share about data generation and analysis, and which (if any) data they should make accessible in pursuit of transparency. 
  5. Scientists who perform major survival surgery on laboratory animals face a dual welfare and methodological challenge: how to choose surgical anesthetics and post-operative analgesics that will best control animal suffering, knowing that both pain and the drugs that manage pain can affect research outcomes. Scientists who publish full descriptions of animal procedures allow critical and systematic reviews of data, demonstrate their adherence to animal welfare norms, and guide other scientists on how to conduct their own studies in the field. We investigated what information on animal pain management a reasonably diligent scientist might find in planning for a successful experiment. To explore how scientists in a range of fields describe their management of this ethical and methodological concern, we scored 400 scientific articles that included major animal survival surgeries as part of their experimental methods for the completeness of information on anesthesia and analgesia. The 400 articles (250 accepted for publication pre-2011, and 150 in 2014–15, along with 174 articles they reference) included thoracotomies, craniotomies, gonadectomies, organ transplants, peripheral nerve injuries, spinal laminectomies and orthopedic procedures in dogs, primates, swine, mice, rats and other rodents. We scored articles for Publication Completeness (PC), which was any mention of use of anesthetics or analgesics; Analgesia Use (AU), which was any use of post-surgical analgesics; and Analgesia Completeness (a composite score comprising intra-operative analgesia, extended post-surgical analgesia, and use of multimodal analgesia). 338 of 400 articles were PC. 98 of these 338 were AU, with some mention of analgesia, while 240 of 338 mentioned anesthesia only but not post-surgical analgesia. Journals’ caliber, as measured by their 2013 Impact Factor, had no effect on PC or AU.
We found no effect of whether a journal instructs authors to consult the ARRIVE publishing guidelines published in 2010 on PC or AC for the 150 mouse and rat articles in our 2014–15 dataset. None of the 302 articles that were silent about analgesic use included an explicit statement that analgesics were withheld, or a discussion of how pain management or untreated pain might affect results. We conclude that current scientific literature cannot be trusted to present full detail on use of animal anesthetics and analgesics. We report that publication guidelines focus more on other potential sources of bias in experimental results, under-appreciate the potential for pain and pain drugs to skew data, and thus mostly treat pain management as solely an animal welfare concern, in the jurisdiction of animal care and use committees. At the same time, animal welfare regulations do not include guidance on publishing animal data, even though publication is an integral part of the cycle of research and can affect the welfare of animals in studies building on published work, leaving it to journals and authors to voluntarily decide what details of animal use to publish. We suggest that journals, scientists and animal welfare regulators should revise current guidelines and regulations, on treatment of pain and on transparent reporting of treatment of pain, to improve this dual welfare and data-quality deficiency. (Carbone L, Austin J (2016) Pain and Laboratory Animals: Publication Practices for Better Data Reproducibility and Better Animal Welfare. PLoS ONE 11(5): e0155001. doi:10.1371/journal.pone.0155001)
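The scoring scheme this abstract describes can be restated as a small function. The boolean inputs below paraphrase the scores as the abstract defines them (any mention of anesthetics or analgesics for PC, any post-surgical analgesic for AU, and a three-part composite for Analgesia Completeness); the encoding itself is our illustration, not the authors' actual instrument.

```python
def score_article(mentions_anesthesia, mentions_analgesia,
                  postsurgical_analgesia, intraop_analgesia,
                  extended_analgesia, multimodal_analgesia):
    """Score one article as the abstract describes.

    PC: any mention of use of anesthetics or analgesics.
    AU: any use of post-surgical analgesics.
    AC: composite of intra-operative analgesia, extended post-surgical
        analgesia, and multimodal analgesia (0-3).
    """
    pc = mentions_anesthesia or mentions_analgesia
    au = postsurgical_analgesia
    ac = sum([intraop_analgesia, extended_analgesia, multimodal_analgesia])
    return {"PC": pc, "AU": au, "AC": ac}

# An article mentioning anesthesia only: Publication Complete, but with
# no analgesia use and an empty completeness composite.
print(score_article(True, False, False, False, False, False))
```

Encoded this way, the abstract's counts correspond to 338 of 400 articles with PC true, of which 98 also had AU true.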