This paper explores the assumptions that citizen science (CS) project leaders held about their volunteers’ overall proficiency in science inquiry skills, and then examines volunteers’ actual proficiency in one specific skill, scientific observation, which is fundamental to and shared by many projects. This work shares findings from interviews with 10 project leaders concerning two common assumptions leaders hold about their volunteers’ skill proficiency: first, that volunteers can perform the necessary skills at the start of a CS project and therefore may not need training; and second, that volunteer skill proficiency improves over time through involvement in the CS project. To answer questions about how accurately volunteers can perform the necessary skills, and about differences in their skill proficiency based on experience and data collection procedures, we analyzed data from seven CS projects that used two shared embedded assessment tools, each focused on skills within the context of scientific observation in natural settings: noticing relevant features for taxonomic identification and recording standard observations. This across-project, cross-sectional study found that the majority of citizen science volunteers (n = 176) had the skill proficiency necessary to collect accurate scientific observations, but that proficiency varied with volunteer experience and project data collection procedures.
-
This paper describes the collaborative process by which a group of citizen science project leaders, evaluators, and researchers worked together to develop, validate, and test embedded assessments of two different volunteer science inquiry skills. The development process for creating these embedded assessments (activities integrated into the learning experience that allow learners to demonstrate competencies) is articulated, as are the challenges encountered in assessing two science inquiry skills common in citizen science projects: noticing relevant features and recording standard observations. The authors investigate the extent to which the assessments achieved four criteria identified as ideal for shared embedded assessments of volunteers’ skills: broadly applicable, authentic, performance-based, and integrated.
-
This paper is the culmination of several facilitated exercises and meetings between external researchers and five citizen science (CS) project teams who analyzed existing data records to understand CS volunteers’ accuracy and skills. CS teams identified a wide range of skill variables that were “hiding in plain sight” in their data records and that could be explored through secondary analysis, which we define here as analysis based on data a project already possesses. Each team identified a small number of evaluation questions to explore with their existing data. Analyses focused on the accuracy of data collection, and all teams chose to add complementary records documenting volunteers’ project engagement or the data collection context to their analysis. Most analyses were conducted as planned and included a range of approaches, from correlation analyses to generalized additive models. Importantly, the results from these analyses were then used to inform the design of both existing and new CS projects, and to inform the field more broadly through a range of dissemination strategies. We conclude by sharing ways that others might pursue their own secondary analyses to help fill gaps in our current understanding of volunteer skills.