This article illustrates and differentiates the unique roles that cognitive interviews and think-aloud interviews play in developing and validating assessments. Specifically, we describe the use of (a) cognitive interviews to gather empirical evidence to support claims about the intended construct being measured and (b) think-aloud interviews to gather evidence about the problem-solving processes students use while completing tasks assessing the intended construct. We illustrate their use in the context of a classroom assessment of an early mathematics construct – numeric relational reasoning – for kindergarten through Grade 2 students. This assessment is intended to provide teachers with data to guide their instructional decisions. We conducted 64 cognitive interviews with 32 students to collect evidence about students’ understanding of the construct. We conducted 106 think-aloud interviews with 14 students to understand how the prototypical items elicited the intended construct. The task-based interview results iteratively informed assessment development and contributed important sources of validity evidence.
Iterative Cognitive Interview Design to Uncover Children's Spatial Reasoning
Cognitive interviews play an important role in articulating the intended construct of educational assessments. This paper describes the iterative development of protocols for cognitive interviews with kindergarten through second-grade children to understand how their developing spatial reasoning skills align with the intended constructs. We describe the procedures used across multiple pilot rounds to gather evidence of construct relevance and to improve the alignment of task-based interview items before conducting the cognitive interviews. We found improved alignment and reduced construct-irrelevant variance after protocol revisions.
- Award ID(s): 1721100
- PAR ID: 10534040
- Publisher / Repository: University of Massachusetts Amherst Libraries
- Date Published:
- Journal Name: Practical Assessment, Research, and Evaluation
- ISSN: 1531-7714
- Subject(s) / Keyword(s): cognitive interview; learning progression; spatial reasoning; construct-irrelevant variance
- Format(s): Medium: X
- Right(s): All rights reserved
- Sponsoring Org: National Science Foundation
More Like this
-
Abstract—This WIP research paper presents validity evidence for a survey instrument designed to assess student learning in makerspaces. We report findings from expert reviews of item content and student interpretations of survey questions. The instrument was developed using a theory-driven approach to define constructs, followed by the development of questions aligned with those constructs. We solicited written feedback from 30 experts in instrument development and/or makerspaces, who rated the alignment of items with our constructs. Based on this input, we revised our items for clarity and consistency. We then conducted 25 cognitive interviews with a diverse group of students who use makerspaces, asking them to explain their understanding of each item and the reasoning behind their responses. Our recruitment ensured diversity in terms of race, gender, ethnicity, and academic background, extending beyond engineering majors. From our initial 45 items, we removed 6, modified 36, and added 1 based on expert feedback. During cognitive interviews, we began with 40 items, deleted one, and revised 23, resulting in 39 items for the pilot survey. Key findings included the value of examples in clarifying broad terms and improved student engagement with a revised rating scale: shifting from a 7-point Likert agreement scale to a self-description format encouraged fuller use of the scale. Our study contributes to the growing body of research on makerspaces by offering insights into how students describe their learning experiences and by providing initial validation evidence for a tool to assess those experiences, ultimately strengthening the credibility of the instrument.
-
Describing how knowledge is used on interdisciplinary projects differs between cognitive scientists and academics. This study explores how practicing engineers categorize knowledge in the context of an interdisciplinary engineering project, using phenomenological interviews with those engineers. Findings suggest that engineers classify knowledge based on the functional parts of systems and subsystems. While this method of classification overlaps with academics’ use of the construct “disciplines” and cognitive scientists’ use of the construct “domains,” the dissimilar aspects could affect how knowledge is accessed and utilized in the future by students in engineering programs.
-
Analogies are used to make abstract topics meaningful and more easily comprehensible to learners. Incorporating simple analogies into STEM classrooms is a fairly common practice, but the analogies are typically generated and explained by the instructor for the learners. We hypothesize that challenging learners to create complex, extended analogies themselves can promote integration of content knowledge and development of critical thinking skills, which are essential for deep learning but are challenging to teach. In this qualitative study, college biology students (n = 30) were asked to construct a complex analogy about the flow of genetic information using a familiar item. One week later, participants constructed a second analogy about the same topic, but this time using a more challenging item. Twenty participants worked on the challenging analogy in pairs, while the other 10 worked alone. Analysis of the 50 interviews resulted in a novel scoring scheme, which measured both content knowledge (understanding of biology terms) and critical thinking (alignment of relationships between elements of the analogy). Most participants improved slightly due to practice, but they improved dramatically when working with a partner. The biggest gains were seen in critical thinking, not content knowledge. Having students construct complex, sophisticated analogies in pairs is a high-impact practice that can help students develop their critical thinking skills, which are crucial in academic and professional settings. The discussion between partners likely requires students to justify their explanations and critique their partner's explanations, which are characteristics of critical thinking.
-
This Research Category Full Paper presents initial emergent themes from our quest to understand the construct of intuition. Our work uses theories of expertise development and dual cognitive processing frameworks to provide a theoretical grounding for defining discipline-specific intuition. We hypothesize that intuition can be observed in disciplinary experts through discussions of experience and decision-making processes. Interviews were conducted with professionals in three fields that engage intuition in decision-making: engineering, nursing, and business management. A comparative analysis of emergent themes is presented to understand similarities and differences in use and definition across these disciplines. Parallel grounded theory and critical incident technique approaches were used to identify perceptions and incidents of intuition. Results suggest that intuition can be defined as a "sense of knowing" that is context specific and at least partly attributable to experience. Inclusion of multiple fields and comparisons across disciplines form the foundation for our future work focusing solely on engineering intuition.