This study investigates differences in student participation rates between in-class and online administrations of research-based assessments. A sample of 1,310 students from 25 sections of three different introductory physics courses over two semesters was instructed to complete the CLASS attitudinal survey and the concept inventory relevant to their course, either the FCI or the CSEM. Each student was randomly assigned to take one of the surveys in class and the other survey online at home using the Learning About STEM Student Outcomes (LASSO) platform. Results indicate large variations in participation rates across both test conditions (online and in class). A hierarchical generalized linear model (HGLM) of the student data using logistic regression indicates that student grades in the course and faculty assessment administration practices were both significant predictors of student participation. When the recommended online assessment administration practices were implemented, participation rates were similar across test conditions. Implications for student and course assessment methodologies will be discussed.
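The analysis described above, a hierarchical logistic regression of participation with students nested in course sections, could be approximated with a mixed-effects binomial model. Below is a minimal sketch in Python using statsmodels; the file name and column names (participated, course_grade, recommended_practices, in_class, section) are hypothetical placeholders, not taken from the paper, and the paper's actual model specification may differ.

```python
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical per-student rows: participation (0/1), course grade,
# whether recommended administration practices were used (0/1),
# test condition (in class = 1, online = 0), and course section ID.
df = pd.read_csv("participation.csv")

# Logistic regression with a random intercept for each course section,
# mirroring a two-level structure of students nested in sections.
model = BinomialBayesMixedGLM.from_formula(
    "participated ~ course_grade + recommended_practices + in_class",
    vc_formulas={"section": "0 + C(section)"},
    data=df,
)
result = model.fit_vb()  # variational Bayes estimate
print(result.summary())
```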
In-class vs. Online Administration of Concept Inventories and Attitudinal Assessments
This study investigates differences in student responses to in-class and online administrations of the Force Concept Inventory (FCI), Conceptual Survey of Electricity and Magnetism (CSEM), and the Colorado Learning Attitudes about Science Survey (CLASS). Close to 700 physics students from 12 sections of three different courses were instructed to complete the concept inventory relevant to their course, either the FCI or CSEM, and the CLASS. Each student was randomly assigned to take one of the surveys in class and the other survey online using the LA Supported Student Outcomes (LASSO) system hosted by the Learning Assistant Alliance (LAA). We examine how testing environments and instructor practices affect participation rates and identify best practices for future use.
- Award ID(s): 1525338
- NSF-PAR ID: 10099990
- Date Published:
- Journal Name: Proc. 2016 Physics Education Research Conference
- Page Range / eLocation ID: 176 to 179
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
This study investigated whether and how Learning Assistant (LA) support is linked to student outcomes in physics courses nationwide. Paired student concept inventory scores were collected over three semesters from 3,753 students, representing 69 courses and 40 instructors from 17 LA Alliance member institutions. Each participating student completed an online concept inventory at the beginning (pre) and end (post) of each term. The physics concept inventories tested included the Force Concept Inventory (FCI), Conceptual Survey of Electricity and Magnetism (CSEM), Force and Motion Conceptual Evaluation (FMCE), and the Brief Electricity and Magnetism Assessment (BEMA). Across instruments, Cohen's d effect sizes were 1.4 times higher, on average, for courses supported by LAs compared to courses without LA support. Preliminary findings indicate that LA support may be most effective when used in laboratory settings (effect sizes 1.9 times higher than without LA support) compared with lecture (1.4 times higher), recitations (1.5 times higher), or unknown uses (1.3 times higher). Additional research will inform LA-implementation best practices across disciplines.
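The Cohen's d effect sizes referenced above compare pre-to-post gains using a pooled standard deviation. A minimal sketch follows; the score arrays are made-up illustrative values, not data from the study, and the study's exact effect-size formula may differ.

```python
import numpy as np

def cohens_d(pre, post):
    """Pre-to-post effect size using the pooled standard deviation."""
    pre, post = np.asarray(pre, dtype=float), np.asarray(post, dtype=float)
    pooled_sd = np.sqrt((pre.std(ddof=1) ** 2 + post.std(ddof=1) ** 2) / 2)
    return (post.mean() - pre.mean()) / pooled_sd

# Illustrative percentage scores only (not the study's data).
pre_la, post_la = [35, 42, 50, 38, 44], [60, 68, 75, 62, 70]
pre_no, post_no = [36, 40, 48, 39, 45], [50, 52, 60, 51, 58]

d_la, d_no = cohens_d(pre_la, post_la), cohens_d(pre_no, post_no)
print(f"LA-supported d = {d_la:.2f}, no-LA d = {d_no:.2f}, ratio = {d_la / d_no:.2f}")
```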
-
Research-based assessments (RBAs; e.g., the Force Concept Inventory) that measure student content knowledge, attitudes, or identities have played a major role in transforming physics teaching practices. RBAs offer instructors a standardized method for empirically investigating the efficacy of their instructional practices and documenting the impacts of course transformations. Unlike course exams, the common usage of standardized RBAs across institutions uniquely supports instructors to compare their student outcomes over time or against multi-institutional data sets. While the number of RBAs and RBA-using instructors has increased over the last three decades, barriers to administering RBAs keep many physics instructors from using them.1,2 To mitigate these barriers, we have created full-service online RBA platforms (i.e., the Learning About STEM Student Outcomes [LASSO],3 Colorado Learning Attitudes About Science Survey for Experimental Physics [E-CLASS],4 and Physics Lab Inventory of Critical Thinking [PLIC]5 platforms) that host, administer, score, and analyze RBAs. These web-based platforms can make it easier for instructors to use RBAs, especially as many courses have been forced to transition to online instruction. We hope that this editorial can serve as a guide for instructors considering administering RBAs online. In what follows, we examine common barriers to using RBAs, how online administration can remove those barriers, and the research into online administration of RBAs. In the supplementary material,6 we also include a practical how-to for administering RBAs online and sample student email wording.
-
The Learning Assistant (LA) model supports instructors in implementing research-based teaching practices in their own courses. In the LA model, undergraduate students are hired to help facilitate research-based collaborative-learning activities. Using the Learning About STEM Student Outcomes (LASSO) database, we examined student learning from 112 first-semester physics courses that used either lecture-based instruction, collaborative instruction without LAs, or LA-supported instruction. We measured student learning using 5,959 students' responses on the Force and Motion Conceptual Evaluation (FMCE) or Force Concept Inventory (FCI). Results from Hierarchical Linear Models (HLM) indicated that LA-supported courses had higher posttest scores than collaborative courses without LAs, and that LA-supported courses that used LAs in laboratory and recitation had higher posttest scores than those that used LAs in lecture.
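A two-level model of the kind described above (students nested in courses, posttest scores conditioned on pretest and instruction type) could be sketched with a linear mixed model. The sketch below assumes hypothetical column and file names (posttest, pretest, instruction_type, course_id); the paper's actual HLM specification may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level rows: pretest/posttest scores, the course's
# instruction type (lecture, collaborative, LA-supported), and a course ID.
df = pd.read_csv("fmce_fci_scores.csv")

# A random intercept for each course captures the nesting of students in courses.
model = smf.mixedlm(
    "posttest ~ pretest + C(instruction_type)",
    data=df,
    groups=df["course_id"],
)
result = model.fit()
print(result.summary())
```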