-
The Learning Assistant (LA) model supports instructors in implementing research-based teaching practices in their own courses. In the LA model, undergraduate students are hired to help facilitate research-based collaborative-learning activities. Using the Learning About STEM Student Outcomes (LASSO) database, we examined student learning in 112 first-semester physics courses that used lecture-based instruction, collaborative instruction without LAs, or LA-supported instruction. We measured student learning using 5,959 students' responses on the Force and Motion Conceptual Evaluation (FMCE) or the Force Concept Inventory (FCI). Results from Hierarchical Linear Models (HLM) indicated that LA-supported courses had higher posttest scores than collaborative courses without LAs, and that LA-supported courses that used LAs in laboratory and recitation had higher posttest scores than those that used LAs in lecture.
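The hierarchical structure in this abstract — students nested within courses — is what motivates HLM over ordinary regression. The sketch below illustrates the core idea of partial pooling with a simplified two-level estimator on simulated data. All course names, effect sizes, and variance estimates here are hypothetical and illustrative; the study itself fit full HLMs, not this crude shrinkage estimator.

```python
import random
import statistics

random.seed(42)

# Hypothetical data: posttest scores for students nested within courses.
# The pedagogy-level means below are made up for illustration only.
course_true_means = {"lecture": 45.0, "collaborative": 55.0, "la_supported": 62.0}
data = {}  # course_id -> list of student posttest scores
for pedagogy, mu in course_true_means.items():
    for j in range(4):  # four simulated courses per pedagogy
        cid = f"{pedagogy}_{j}"
        n_students = random.randint(15, 40)
        data[cid] = [random.gauss(mu, 12.0) for _ in range(n_students)]

# Two-level partial pooling: shrink each raw course mean toward the grand
# mean, weighting by course size relative to the within/between variance.
all_scores = [s for scores in data.values() for s in scores]
grand_mean = statistics.fmean(all_scores)
sigma2_within = statistics.pvariance(all_scores)                  # crude estimate
tau2_between = statistics.pvariance(
    [statistics.fmean(scores) for scores in data.values()])        # crude estimate

shrunk = {}
for cid, scores in data.items():
    n = len(scores)
    weight = tau2_between / (tau2_between + sigma2_within / n)
    shrunk[cid] = grand_mean + weight * (statistics.fmean(scores) - grand_mean)
```

Small courses get pulled more strongly toward the grand mean than large ones, which is the key property that makes multilevel models appropriate for data like these, where course sizes vary widely.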
-
Research-based assessments (RBAs), such as the Force Concept Inventory, have played central roles in many course transformations from traditional lecture-based instruction to research-based teaching methods. In order to support instructors in assessing their courses, the online Learning About STEM Student Outcomes (LASSO) platform simplifies administering, scoring, and interpreting RBAs. Reducing the barriers to using RBAs will support more instructors in objectively assessing the efficacy of their courses and, subsequently, transforming their courses to improve student outcomes. The purpose of this study was to investigate the extent to which RBAs administered online and outside of class with the LASSO platform provided equivalent data to traditional paper-and-pencil tests administered in class. Research indicates that these two modes of administering assessments provide equivalent data for graded exams that are administered in class. However, little research has focused on ungraded (low-stakes) exams that are administered outside of class. We used an experimental design to investigate the differences between these two test modes. Results indicated that the LASSO platform provided equivalent data to paper-and-pencil tests.
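Comparing two administration modes like this typically comes down to testing whether the score distributions differ and how large any difference is in standardized units. The sketch below shows one common form such a comparison can take — Welch's t statistic plus Cohen's d — on simulated samples. The sample sizes, means, and spreads are invented for illustration and are not the study's data or its actual analysis.

```python
import math
import random
import statistics

random.seed(7)

# Hypothetical percent-correct samples for the two administration modes.
paper = [random.gauss(58.0, 15.0) for _ in range(120)]
online = [random.gauss(57.0, 15.0) for _ in range(120)]

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom (unequal variances)."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled = math.sqrt(((na - 1) * statistics.variance(a) +
                        (nb - 1) * statistics.variance(b)) / (na + nb - 2))
    return (statistics.fmean(a) - statistics.fmean(b)) / pooled

t_stat, df = welch_t(paper, online)
effect = cohens_d(paper, online)
```

An equivalence claim such as the one in the abstract generally rests on a small effect size (and, in practice, a formal equivalence test) rather than merely a non-significant t statistic.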
-
This study investigates differences in student participation rates between in-class and online administrations of research-based assessments. A sample of 1,310 students from 25 sections of 3 different introductory physics courses over two semesters was instructed to complete the CLASS attitudinal survey and the concept inventory relevant to their course, either the FCI or the CSEM. Each student was randomly assigned to take one of the surveys in class and the other survey online at home using the Learning About STEM Student Outcomes (LASSO) platform. Results indicate large variations in participation rates across both test conditions (online and in class). A hierarchical generalized linear model (HGLM) of the student data utilizing logistic regression indicates that student grades in the course and faculty assessment administration practices were both significant predictors of student participation. When the recommended online assessment administration practices were implemented, participation rates were similar across test conditions. Implications for student and course assessment methodologies are discussed.
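The predictor structure described here — a binary participation outcome modeled from course grade and administration practices — can be illustrated with a single-level logistic regression fit by gradient descent. This is a simplification: the study used a hierarchical model (students nested in sections), and the simulated data and coefficients below are invented for illustration, not the study's estimates.

```python
import math
import random

random.seed(3)

# Hypothetical data: participation (0/1) as a function of a z-scored course
# grade and a 0/1 flag for whether recommended administration practices were
# followed. Simulation coefficients are illustrative only.
def simulate(n=2000, b0=-0.5, b_grade=0.8, b_practice=1.2):
    rows = []
    for _ in range(n):
        grade = random.gauss(0.0, 1.0)
        practice = 1.0 if random.random() < 0.5 else 0.0
        logit = b0 + b_grade * grade + b_practice * practice
        p = 1.0 / (1.0 + math.exp(-logit))
        y = 1.0 if random.random() < p else 0.0
        rows.append((grade, practice, y))
    return rows

def fit_logistic(rows, lr=0.1, epochs=300):
    """Batch gradient ascent on the logistic log-likelihood."""
    w = [0.0, 0.0, 0.0]  # intercept, grade, practice
    n = len(rows)
    for _ in range(epochs):
        grad = [0.0, 0.0, 0.0]
        for grade, practice, y in rows:
            x = (1.0, grade, practice)
            p = 1.0 / (1.0 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))
            for k in range(3):
                grad[k] += (y - p) * x[k]
        for k in range(3):
            w[k] += lr * grad[k] / n
    return w

rows = simulate()
b0_hat, b_grade_hat, b_practice_hat = fit_logistic(rows)
```

With both predictors built into the simulation, the fitted coefficients for grade and administration practice come out positive, mirroring the abstract's finding that both were significant predictors of participation. A full HGLM would add section-level random intercepts on top of this fixed-effects core.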