Title: Applying cognitive diagnostic models to mechanics concept inventories
In physics education research, instructors and researchers often use research-based assessments (RBAs) to assess students’ skills and knowledge. In this paper, we support the development of a mechanics cognitive diagnostic to test and implement effective and equitable pedagogies for physics instruction. Adaptive assessments using cognitive diagnostic models provide significant advantages over the fixed-length RBAs commonly used in physics education research. As part of a broader project to develop a cognitive diagnostic assessment for introductory mechanics within an evidence-centered design framework, we identified and tested student models of four skills that cross content areas in introductory physics: apply vectors, conceptual relationships, algebra, and visualizations. We developed the student models in three steps. First, we based the models on learning objectives from instructors. Second, we coded the items on RBAs using the student models. Finally, we tested and refined this coding using a common cognitive diagnostic model, the deterministic inputs, noisy “and” gate (DINA) model. The data included 19 889 students who completed either the Force Concept Inventory, Force and Motion Conceptual Evaluation, or Energy and Momentum Conceptual Survey on the LASSO platform. The results indicated a good to adequate fit for the student models, with high accuracies for classifying students on many of the skills. The items from these three RBAs do not cover all of the skills in enough detail; however, they will form a useful initial item bank for the development of the mechanics cognitive diagnostic.
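For readers unfamiliar with the DINA model named above, the sketch below gives its standard item response function from the psychometrics literature; the notation follows common convention and is not reproduced from the paper itself. A Q-matrix entry q_{jk} records whether item j requires skill k, a student's mastery profile \alpha_i determines an ideal response \eta_{ij}, and slip and guess parameters s_j and g_j then perturb that ideal response:

\eta_{ij} = \prod_{k=1}^{K} \alpha_{ik}^{\,q_{jk}}, \qquad
P(X_{ij} = 1 \mid \boldsymbol{\alpha}_i) = (1 - s_j)^{\eta_{ij}} \, g_j^{\,1 - \eta_{ij}}

In words, a student is expected to answer an item correctly only if they have mastered every skill it requires (up to slipping); otherwise a correct response is attributed to guessing.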
Award ID(s): 2141847, 2322015, 2142317
PAR ID: 10639651
Author(s) / Creator(s): ; ; ; ; ; ; ;
Publisher / Repository: American Physical Society
Date Published:
Journal Name: Physical Review Physics Education Research
Volume: 21
Issue: 1
ISSN: 2469-9896
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Physics instructors and education researchers use research-based assessments (RBAs) to evaluate students' preparation for physics courses. This preparation can cover a wide range of constructs, including mathematics and physics content. Using separate mathematics and physics RBAs consumes course time. We are developing a new RBA for introductory mechanics as an online test using both computerized adaptive testing and cognitive diagnostic models. This design allows the adaptive RBA to assess mathematics and physics content knowledge within a single assessment. In this article, we used an evidence-centered design framework to examine how well our models of the skills students develop in physics courses fit the data from three mathematics RBAs. Our dataset came from the LASSO platform and included 3,491 responses from the Calculus Concept Assessment, Calculus Concept Inventory, and Pre-calculus Concept Assessment. Our model included five skills: apply vectors, conceptual relationships, algebra, visualizations, and calculus. The "deterministic inputs, noisy 'and' gate" (DINA) analyses demonstrated a good fit for the five skills, and the classification accuracies for the skills were satisfactory. Including items from the three mathematics RBAs in the item bank for the adaptive RBA will provide a flexible assessment of these skills across mathematics and physics content areas that can adapt to instructors' needs.
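As a purely illustrative aside, the sketch below shows how a Q-matrix codes items against the five skills named above and how the DINA ideal-response pattern follows from a student's mastery profile. The item codings and the student profile here are hypothetical, not taken from the assessments analyzed in the paper.

# Hypothetical Q-matrix over the five skills, in the order
# [apply vectors, conceptual relationships, algebra, visualizations, calculus].
import numpy as np

Q = np.array([
    [1, 1, 0, 0, 0],  # item 1: requires vectors and conceptual relationships
    [0, 1, 1, 0, 0],  # item 2: requires conceptual relationships and algebra
    [0, 0, 0, 1, 1],  # item 3: requires visualizations and calculus
])

alpha = np.array([1, 1, 1, 0, 0])  # a student who has mastered the first three skills

# DINA ideal response: 1 only when the student has every skill an item requires.
eta = np.all(alpha >= Q, axis=1).astype(int)
print(eta)  # -> [1 1 0]

Fitting the full DINA model then adds a slip and a guess parameter per item on top of this ideal-response structure, which is what the reported fit statistics and classification accuracies describe.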
  2. Education researchers often compare performance across race and gender on research-based assessments of physics knowledge to investigate the impacts of racism and sexism on physics student learning. These investigations' claims rely on research-based assessments providing reliable, unbiased measures of student knowledge across social identity groups. We used classical test theory and differential item functioning (DIF) analysis to examine whether the items on the Force Concept Inventory (FCI) provided unbiased data across social identifiers for race, gender, and their intersections. The data were accessed through the Learning About STEM Student Outcomes (LASSO) platform and included posttest responses from 4,848 students in 152 calculus-based introductory physics courses at 16 institutions. The results indicated that the majority of items (22) on the FCI were biased towards a group. These results point to the need for instrument validation to account for item bias and for the identification or development of fair research-based assessments.
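For orientation only: the abstract does not name the specific DIF statistic, and the method used in the study may differ, but one widely used screen in this kind of analysis is the Mantel-Haenszel common odds ratio, computed after matching students on total score:

\hat{\alpha}_{\mathrm{MH}} = \frac{\sum_k A_k D_k / N_k}{\sum_k B_k C_k / N_k}

where, at each matched score level k, A_k and B_k count correct and incorrect responses in the reference group, C_k and D_k count them in the focal group, and N_k is the number of students at that level. Values far from 1 flag an item as functioning differently for the two groups even after controlling for overall performance.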
  3.
    Research-based assessments (RBAs; e.g., the Force Concept Inventory) that measure student content knowledge, attitudes, or identities have played a major role in transforming physics teaching practices. RBAs offer instructors a standardized method for empirically investigating the efficacy of their instructional practices and documenting the impacts of course transformations. Unlike course exams, the common usage of standardized RBAs across institutions uniquely supports instructors to compare their student outcomes over time or against multi-institutional data sets. While the number of RBAs and RBA-using instructors has increased over the last three decades, barriers to administering RBAs keep many physics instructors from using them.1,2 To mitigate these barriers, we have created full-service online RBA platforms (i.e., the Learning About STEM Student Outcomes [LASSO],3 Colorado Learning Attitudes About Science Survey for Experimental Physics [E-CLASS],4 and Physics Lab Inventory of Critical thinking [PLIC]5 platforms) that host, administer, score, and analyze RBAs. These web-based platforms can make it easier for instructors to use RBAs, especially as many courses have been forced to transition to online instruction. We hope that this editorial can serve as a guide for instructors considering administering RBAs online. In what follows, we examine common barriers to using RBAs, how online administration can remove those barriers, and the research into online administration of RBAs. In the supplementary material,6 we also include a practical how-to for administering RBAs online and sample student email wording. 
  4. Wessner, David R (Ed.)
    Visual models are a necessary part of molecular biology education because submicroscopic compounds and processes cannot be directly observed. Accurately interpreting the biological information conveyed by the shapes and symbols in these visual models requires engaging visual literacy skills. For students to develop expertise in molecular biology visual literacy, they need structured experiences using and creating visual models, but there is little evidence to gauge how often undergraduate biology students are provided such opportunities. To investigate students’ visual literacy experiences, we surveyed 66 instructors who taught lower-division undergraduate biology courses with a focus on molecular biology concepts. We collected self-reported data about the frequency with which the instructors teach with visual models, and we analyzed course exams to determine how instructors incorporated visual models into their assessments. We found that most instructors reported teaching with models in their courses, yet only 16% of exam items in the sample contained a visual model. There was not a statistically significant relationship between instructors’ self-reported frequency of teaching with models and the extent to which their exams contained models, signaling a potential mismatch between teaching and assessment practices. Although exam items containing models have the potential to elicit higher-order cognitive skills through model-based reasoning, we found that when instructors included visual models in their exams, the majority of the items targeted only the lower-order cognitive skills of Bloom’s Taxonomy. Together, our findings highlight that despite the importance of visual models in molecular biology, students may not often have opportunities to demonstrate their understanding of these models on assessments.