This study investigates differences in student responses to in-class and online administrations of the Force Concept Inventory (FCI), Conceptual Survey of Electricity and Magnetism (CSEM), and the Colorado Learning Attitudes about Science Survey (CLASS). Close to 700 physics students from 12 sections of three different courses were instructed to complete the concept inventory relevant to their course, either the FCI or CSEM, and the CLASS. Each student was randomly assigned to take one of the surveys in class and the other survey online using the LA Supported Student Outcomes (LASSO) system hosted by the Learning Assistant Alliance (LAA). We examine how testing environments and instructor practices affect participation rates and identify best practices for future use.
Participation rates of in-class vs. online administration of low-stakes research-based assessments
This study investigates differences in student participation rates between in-class and online administrations of research-based assessments. A sample of 1,310 students from 25 sections of 3 different introductory physics courses over two semesters was instructed to complete the CLASS attitudinal survey and the concept inventory relevant to their course, either the FCI or the CSEM. Each student was randomly assigned to take one of the surveys in class and the other survey online at home using the Learning About STEM Student Outcomes (LASSO) platform. Results indicate large variations in participation rates across both test conditions (online and in class). A hierarchical generalized linear model (HGLM) of the student data utilizing logistic regression indicates that student grades in the course and faculty assessment administration practices were both significant predictors of student participation. When the recommended online assessment administration practices were implemented, participation rates were similar across test conditions. Implications for student and course assessment methodologies are discussed.
- Award ID(s):
- 1525338
- PAR ID:
- 10099984
- Date Published:
- Journal Name:
- Proc. 2017 Physics Education Research Conference
- Page Range / eLocation ID:
- 196 to 199
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
- Research-based assessments (RBAs; e.g., the Force Concept Inventory) that measure student content knowledge, attitudes, or identities have played a major role in transforming physics teaching practices. RBAs offer instructors a standardized method for empirically investigating the efficacy of their instructional practices and documenting the impacts of course transformations. Unlike course exams, the common usage of standardized RBAs across institutions uniquely supports instructors to compare their student outcomes over time or against multi-institutional data sets. While the number of RBAs and RBA-using instructors has increased over the last three decades, barriers to administering RBAs keep many physics instructors from using them. To mitigate these barriers, we have created full-service online RBA platforms (i.e., the Learning About STEM Student Outcomes [LASSO], Colorado Learning Attitudes about Science Survey for Experimental Physics [E-CLASS], and Physics Lab Inventory of Critical Thinking [PLIC] platforms) that host, administer, score, and analyze RBAs. These web-based platforms can make it easier for instructors to use RBAs, especially as many courses have been forced to transition to online instruction. We hope that this editorial can serve as a guide for instructors considering administering RBAs online. In what follows, we examine common barriers to using RBAs, how online administration can remove those barriers, and the research into online administration of RBAs. In the supplementary material, we also include a practical how-to for administering RBAs online and sample student email wording.
- This study investigates stopout patterns in MOOCs to understand course- and assessment-level factors that influence student stopout behavior. We expanded previous work on stopout by assessing the exponential decay of assessment-level stopout rates across courses. Results confirm a disproportionate stopout rate on the first graded assessment. We then evaluated which course- and assessment-level features were associated with stopout on the first assessment. Findings suggest that a higher number of questions and estimated time commitment in the early assessments, and more assessments in a course, may be associated with a higher proportion of early stopout behavior.
- This research paper examines the influence of interpersonal interactions on the course-level persistence intentions of online undergraduate engineering students. Online learning is increasing in enrollment and importance in engineering education. Online courses also continue to confront comparatively higher dropout levels than face-to-face courses. This study correspondingly explores students' perceptions of their online course experiences to better understand the factors that contribute to their choices to remain in or drop out of their online undergraduate engineering courses. Data presented in this study were collected during fall 2019 and spring 2020 from three ABET-accredited online undergraduate engineering courses at a large southwestern public university: electrical engineering, engineering management, and software engineering. Participants were asked to respond to surveys at 12 time points during their 7.5-week online course. Each survey measured students' perceptions of course LMS dialog, perceptions of instructor practices, and peer support for completing the course. Participants also reported their intentions to persist in the course during each survey administration. A multi-level modeling analysis revealed that LMS dialog, perceptions of instructor practices, and peer support are related to course persistence intentions. Time was also a significant predictor and indicated that persistence intentions decrease toward the end of the course. Additionally, interactions between demographic variables and the other predictors were significant. As perceptions of course LMS dialog, instructor practices, and peer support increased, the persistence intentions of veterans increased less than those of non-veterans. The persistence intentions of females increased more than those of males as their perceptions of instructor practices increased. Finally, increasing perceptions of peer support led to a relatively larger increase in the persistence intentions of non-transfer students than transfer students, and a relatively smaller increase for students working full-time than for other students.
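The exponential decay of per-assessment stopout rates described in the MOOC study above can be illustrated with a simple curve fit. The rates below are invented for illustration, not the study's data, and the decay-plus-floor functional form is one plausible parameterization, not necessarily the authors':

```python
# Hypothetical illustration: fit an exponential decay (with a floor) to
# per-assessment stopout rates. All numbers are invented.
import numpy as np
from scipy.optimize import curve_fit

assessment = np.arange(1, 9)                        # assessment index in the course
stopout_rate = np.array([0.30, 0.12, 0.07, 0.05,    # fraction of remaining students
                         0.04, 0.035, 0.03, 0.028]) # who stop at each assessment

def decay(n, a, k, c):
    """Stopout rate modeled as a * exp(-k * n) + c."""
    return a * np.exp(-k * n) + c

(a, k, c), _ = curve_fit(decay, assessment, stopout_rate, p0=(0.5, 1.0, 0.02))
print(f"decay rate k = {k:.2f}; first-assessment excess a = {a:.2f}")
```

The large fitted value of `a` relative to the floor `c` captures the study's headline pattern: a disproportionate share of stopout occurs at the first graded assessment.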

