Title: Mechanics Cognitive Diagnostic: Mathematics skills tested in introductory physics courses
Physics instructors and education researchers use research-based assessments (RBAs) to evaluate students' preparation for physics courses. This preparation can cover a wide range of constructs, including mathematics and physics content. Using separate mathematics and physics RBAs consumes course time. We are developing a new RBA for introductory mechanics as an online test using both computerized adaptive testing and cognitive diagnostic models. This design allows the adaptive RBA to assess mathematics and physics content knowledge within a single assessment. In this article, we used an evidence-centered design framework to inform the extent to which our models of the skills students develop in physics courses fit the data from three mathematics RBAs. Our dataset came from the LASSO platform and included 3,491 responses from the Calculus Concept Assessment, Calculus Concept Inventory, and Pre-calculus Concept Assessment. Our model included five skills: apply vectors, conceptual relationships, algebra, visualizations, and calculus. The "deterministic inputs, noisy 'and' gate" (DINA) analyses demonstrated a good fit for the five skills, and the classification accuracies for the skills were satisfactory. Including items from the three mathematics RBAs in the item bank for the adaptive RBA will provide a flexible assessment of these skills across mathematics and physics content areas that can adapt to instructors' needs.
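For context, the DINA model named in the abstract is a standard conjunctive cognitive diagnostic model; its usual item response function (generic notation, not taken from the paper) is

\eta_{ij} = \prod_{k=1}^{K} \alpha_{ik}^{\,q_{jk}}, \qquad P(X_{ij} = 1 \mid \boldsymbol{\alpha}_i) = (1 - s_j)^{\eta_{ij}} \, g_j^{\,1 - \eta_{ij}},

where \alpha_{ik} \in \{0,1\} indicates whether student i has mastered skill k, q_{jk} is the Q-matrix entry marking whether item j requires skill k, and s_j and g_j are item j's slip and guess probabilities. Under this "and" gate, a correct response is expected only when the student has mastered every skill the item requires (up to slipping and guessing), which is why per-skill classification accuracy is the relevant reliability figure for this kind of assessment.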
Award ID(s):
2141847
PAR ID:
10544476
Publisher / Repository:
American Association of Physics Teachers
Page Range / eLocation ID:
243 to 249
Location:
Boston, MA
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Research-based assessments (RBAs; e.g., the Force Concept Inventory) that measure student content knowledge, attitudes, or identities have played a major role in transforming physics teaching practices. RBAs offer instructors a standardized method for empirically investigating the efficacy of their instructional practices and documenting the impacts of course transformations. Unlike course exams, the common usage of standardized RBAs across institutions uniquely supports instructors to compare their student outcomes over time or against multi-institutional data sets. While the number of RBAs and RBA-using instructors has increased over the last three decades, barriers to administering RBAs keep many physics instructors from using them.1,2 To mitigate these barriers, we have created full-service online RBA platforms (i.e., the Learning About STEM Student Outcomes [LASSO],3 Colorado Learning Attitudes About Science Survey for Experimental Physics [E-CLASS],4 and Physics Lab Inventory of Critical thinking [PLIC]5 platforms) that host, administer, score, and analyze RBAs. These web-based platforms can make it easier for instructors to use RBAs, especially as many courses have been forced to transition to online instruction. We hope that this editorial can serve as a guide for instructors considering administering RBAs online. In what follows, we examine common barriers to using RBAs, how online administration can remove those barriers, and the research into online administration of RBAs. In the supplementary material,6 we also include a practical how-to for administering RBAs online and sample student email wording. 
  2. Increased retention of students in STEM disciplines is one vital part of increasing the number of students who succeed in STEM fields. Our efforts focus on the first-year Calculus and Physics curriculum, and we explicitly bring these two closely connected areas together. The program has two parts. The first is the identification of students who can benefit the most from a coordinated experience. This is done by identifying students' strengths in mathematics and physics prior to entry into the courses; our focus is on students who have a relative strength in physics and a relative weakness in mathematics skills. The second is to bring the two courses together into a vibrant whole. This is done by making use of fundamental physics models: our students are challenged to increase their analytic abilities through the development of physics-based models and an exploration and analysis of the resulting models.
  3. Despite rapid growth of quantum information science (QIS) workforce development initiatives, perceived lack of agreement among faculty on core content has made prior research-based curriculum and assessment development initiatives difficult to scale. To identify areas of consensus on content coverage, we report findings from a survey of N=63 instructors teaching introductory QIS courses at US institutions of higher learning. We identify a subset of content items common across a large fraction (≥ 80%) of introductory QIS courses that are potentially amenable to research-based curriculum development, with an emphasis on foundational skills in mathematics, physics, and engineering. As a further guide for curriculum development, we also examine differences in content coverage by level (undergraduate/graduate) and discipline. Finally, we briefly discuss the implications of our findings for the development of a research-based QIS assessment at the postsecondary level.
  4. This NSF-IUSE exploration and design project began in fall 2018 and features cross-disciplinary collaboration between engineering, math, and psychology faculty to develop learning activities with hands-on models and manipulatives. We are exploring how best to design these activities to support learners' development of conceptual understanding and representational competence in integral calculus and engineering statics, two foundational courses for most engineering majors. A second goal is to leverage the model-based activities to scaffold spatial skills development in the context of traditional course content. As widely reported in the literature, well-developed spatial abilities correlate with student success and persistence in many STEM majors. We provided calculus students in selected intervention sections taught by four instructors at three different community colleges with take-home model kits that they could reference for a series of asynchronous learning activities. Students in these sections completed the Purdue Spatial Visualization Test: Rotations (PSVT:R) in the first and last weeks of their course. We also administered the assessment in multiple control sections (no manipulatives) taught by the same faculty. This paper analyzes results from fall 2020 through fall 2021 to see if there is any difference between control and intervention sections for the courses as a whole and for demographic subgroups, including female-identifying students and historically underserved students of color. All courses were taught in an asynchronous online modality in the context of the COVID-19 pandemic. We find that students in intervention sections of calculus made slightly larger gains on the PSVT:R, but this result is not statistically significant as a whole or for any of the demographic subgroups considered. We also analyzed final course grades for differences between control and intervention sections and found none. Overall, we found no significant effect of the model-based activities on PSVT:R gains or course grades. We would not extend this conclusion to face-to-face implementation, however, due primarily to the compromises made to adapt the curriculum from in-person group learning to asynchronous individual work and the inconsistent engagement of the online students with the modeling activities.
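As an illustration only, here is a minimal sketch of the control-versus-intervention gain comparison described in entry 4, assuming a Welch's t-test on raw pre-to-post PSVT:R gains; the file name, column names, and choice of test are hypothetical, since the abstract does not state the exact statistical procedure used.

    import pandas as pd
    from scipy import stats

    # Hypothetical data layout: one row per student with pre/post PSVT:R scores
    # (0-30) and a 'section' label of 'control' or 'intervention'.
    df = pd.read_csv("psvtr_scores.csv")
    df["gain"] = df["post"] - df["pre"]

    control = df.loc[df["section"] == "control", "gain"].dropna()
    intervention = df.loc[df["section"] == "intervention", "gain"].dropna()

    # Welch's t-test on raw gains (does not assume equal variances)
    t_stat, p_value = stats.ttest_ind(intervention, control, equal_var=False)
    print(f"mean gain (intervention): {intervention.mean():.2f}")
    print(f"mean gain (control):      {control.mean():.2f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

The same comparison could be repeated for each demographic subgroup considered in the paper, with appropriate caution about small subgroup sizes.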