

Title: Providing Context for Identifying Effective Introductory Mechanics Courses
Research-based assessments (RBAs) measure how well a course achieves discipline-specific outcomes. Educators can use RBA outcomes to guide instructional choices and to request resources to implement and sustain instructional transformations. One challenge in using RBAs, however, is a lack of comparative data, particularly given the skew in the research literature toward calculus-based courses at highly selective institutions. In this article, we provide a large-scale dataset and several tools that educators in introductory physics courses can use to gauge how well their courses foster student conceptual understanding of Newtonian physics; both the dataset and the tools are included in the supplemental materials. Educators and administrators often target courses with high drop, withdrawal, and failure rates for transformation to student-centered instructional strategies. RBAs and the comparative tools presented herein allow educators to address critiques that course transformations made the courses "easier" by showing that the transformed course supported physics learning at least as well as similar courses at other institutions. Educators can also use the tools to track course efficacy over time.
Award ID(s):
1928596
PAR ID:
10327865
Author(s) / Creator(s):
Date Published:
Journal Name:
The Physics Teacher
Volume:
60
Issue:
3
ISSN:
0031-921X
Page Range / eLocation ID:
179 to 182
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Research-based assessments (RBAs; e.g., the Force Concept Inventory) that measure student content knowledge, attitudes, or identities have played a major role in transforming physics teaching practices. RBAs offer instructors a standardized method for empirically investigating the efficacy of their instructional practices and documenting the impacts of course transformations. Unlike course exams, the common use of standardized RBAs across institutions uniquely enables instructors to compare their student outcomes over time or against multi-institutional data sets. While the number of RBAs and RBA-using instructors has increased over the last three decades, barriers to administering RBAs keep many physics instructors from using them.1,2 To mitigate these barriers, we have created full-service online RBA platforms (i.e., the Learning About STEM Student Outcomes [LASSO],3 Colorado Learning Attitudes About Science Survey for Experimental Physics [E-CLASS],4 and Physics Lab Inventory of Critical thinking [PLIC]5 platforms) that host, administer, score, and analyze RBAs. These web-based platforms can make it easier for instructors to use RBAs, especially as many courses have been forced to transition to online instruction. We hope that this editorial can serve as a guide for instructors considering administering RBAs online. In what follows, we examine common barriers to using RBAs, how online administration can remove those barriers, and the research into online administration of RBAs. In the supplementary material,6 we also include a practical how-to for administering RBAs online and sample student email wording.
  2.
    Technology can assist instructional designers and teachers in meeting the needs of learners in traditional classrooms and virtual course environments. During the COVID-19 pandemic, many teachers and instructional designers began looking for resources they could use for hybrid and online course delivery. Many found that the cost of some technology tools was well outside their financial means for meeting student learning outcomes. However, some digital tools provide free access for educators and are beneficial to students. In this article, the authors share five tools they have used in developing and teaching online and traditional technology courses at the college level: a learning management system tool, a collaboration tool, a search engine tool, a content creation tool, and a content sharing tool, each used to engage students in their courses. As teachers look for alternatives for moving content from classroom teaching to online instruction, this article can help them consider the recommended tools for instruction. Teachers, instructors, and instructional designers may explore the free digital tools in this article and research other digital tools to support student learning in their disciplines.
  3.
    Research-based assessments (RBAs), such as the Force Concept Inventory, have played central roles in many course transformations from traditional lecture-based instruction to research-based teaching methods. To support instructors in assessing their courses, the online Learning About STEM Student Outcomes (LASSO) platform simplifies administering, scoring, and interpreting RBAs. Reducing the barriers to using RBAs will support more instructors in objectively assessing the efficacy of their courses and, subsequently, transforming their courses to improve student outcomes. The purpose of this study was to investigate the extent to which RBAs administered online and outside of class with the LASSO platform provided equivalent data to traditional paper and pencil tests administered in class. Research indicates that these two modes of administering assessments provide equivalent data for graded exams administered in class. However, little research has focused on ungraded (low-stakes) exams administered outside of class. We used an experimental design to investigate the differences between these two test modes. Results indicated that the LASSO platform provided equivalent data to paper and pencil tests.
  4.
    Physics instructors and education researchers use research-based assessments (RBAs) to evaluate students' preparation for physics courses. This preparation can cover a wide range of constructs, including mathematics and physics content. Using separate mathematics and physics RBAs consumes course time. We are developing a new RBA for introductory mechanics as an online test using both computerized adaptive testing and cognitive diagnostic models. This design allows the adaptive RBA to assess mathematics and physics content knowledge within a single assessment. In this article, we used an evidence-centered design framework to inform the extent to which our models of skills students develop in physics courses fit the data from three mathematics RBAs. Our dataset came from the LASSO platform and includes 3,491 responses from the Calculus Concept Assessment, Calculus Concept Inventory, and Pre-calculus Concept Assessment. Our model included five skills: apply vectors, conceptual relationships, algebra, visualizations, and calculus. The "deterministic inputs, noisy 'and' gate" (DINA) analyses demonstrated a good fit for the five skills. The classification accuracies for the skills were satisfactory. Including items from the three mathematics RBAs in the item bank for the adaptive RBA will provide a flexible assessment of these skills across mathematics and physics content areas that can adapt to instructors' needs.
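The DINA ("deterministic inputs, noisy 'and' gate") model mentioned in the last abstract can be illustrated in a few lines. The sketch below shows how a DINA model computes the probability of a correct response from a Q-matrix (which skills each item requires) and a student's skill-mastery profile: a student who has mastered every required skill answers correctly unless they "slip," while everyone else can only "guess." The Q-matrix rows, skill profile, and the slip/guess values here are invented for illustration and are not taken from the study's actual analysis.

```python
def dina_prob(alpha, q_row, slip, guess):
    """P(correct) under the DINA model.

    alpha: student's skill-mastery profile (1 = mastered, 0 = not)
    q_row: Q-matrix row for the item (1 = skill required)
    slip:  probability a "master" of all required skills answers wrong
    guess: probability a non-master answers correctly anyway
    """
    # Mastery indicator eta: 1 only if every required skill is mastered
    eta = all(a >= q for a, q in zip(alpha, q_row))
    return (1 - slip) if eta else guess


# Illustrative Q-matrix over the five skills in the abstract:
# [apply vectors, conceptual relationships, algebra, visualizations, calculus]
q_matrix = [
    [1, 0, 1, 0, 0],  # hypothetical item requiring vectors + algebra
    [0, 1, 0, 0, 1],  # hypothetical item requiring concepts + calculus
]

student = [1, 1, 1, 0, 0]  # masters the first three skills only

for q_row in q_matrix:
    print(dina_prob(student, q_row, slip=0.1, guess=0.2))
```

Because the student masters both skills the first item requires, the first probability is 1 - slip; the second item requires calculus, which the student lacks, so the probability drops to the guessing rate. Fitting slip and guess parameters from response data (as the study did) requires an estimation routine such as the EM algorithm; this sketch only shows the model's response function.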