Title: Participation rates of in-class vs. online administration of low-stakes research-based assessments
This study investigates differences in student participation rates between in-class and online administrations of research-based assessments. A sample of 1,310 students from 25 sections of 3 different introductory physics courses over two semesters was instructed to complete the CLASS attitudinal survey and the concept inventory relevant to their course, either the FCI or the CSEM. Each student was randomly assigned to take one of the surveys in class and the other survey online at home using the Learning About STEM Student Outcomes (LASSO) platform. Results indicate large variations in participation rates across both test conditions (online and in class). A hierarchical generalized linear model (HGLM) of the student data utilizing logistic regression indicates that student grades in the course and faculty assessment administration practices were both significant predictors of student participation. When the recommended online assessment administration practices were implemented, participation rates were similar across test conditions. Implications for student and course assessment methodologies are discussed.
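The logistic-regression form of such a participation model can be sketched as follows. This is a minimal illustration only: the predictor names and coefficient values are hypothetical, not the fitted values from the study.

```python
import math

def participation_probability(grade, recommended_practices,
                              b0=-1.0, b_grade=0.8, b_practices=1.2):
    """Logistic model of participation: P = 1 / (1 + exp(-eta)).

    grade: standardized course grade (hypothetical predictor scale)
    recommended_practices: 1 if the instructor followed the recommended
        administration practices, else 0
    b0, b_grade, b_practices: illustrative coefficients only
    """
    eta = b0 + b_grade * grade + b_practices * recommended_practices
    return 1.0 / (1.0 + math.exp(-eta))

# Under these illustrative coefficients, higher grades and recommended
# administration practices both raise the predicted probability of
# completing the assessment.
p_low = participation_probability(grade=-1.0, recommended_practices=0)
p_high = participation_probability(grade=1.0, recommended_practices=1)
print(round(p_low, 3), round(p_high, 3))  # prints 0.142 0.731
```

A full HGLM additionally nests students within course sections via section-level random intercepts, which this stdlib sketch omits.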
Award ID(s):
1525338
PAR ID:
10099984
Journal Name:
Proc. 2017 Physics Education Research Conference
Page Range / eLocation ID:
196 to 199
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. This study investigates differences in student responses to in-class and online administrations of the Force Concept Inventory (FCI), Conceptual Survey of Electricity and Magnetism (CSEM), and the Colorado Learning Attitudes about Science Survey (CLASS). Close to 700 physics students from 12 sections of three different courses were instructed to complete the concept inventory relevant to their course, either the FCI or CSEM, and the CLASS. Each student was randomly assigned to take one of the surveys in class and the other survey online using the LA Supported Student Outcomes (LASSO) system hosted by the Learning Assistant Alliance (LAA). We examine how testing environments and instructor practices affect participation rates and identify best practices for future use. 
  2.
    Research-based assessments (RBAs; e.g., the Force Concept Inventory) that measure student content knowledge, attitudes, or identities have played a major role in transforming physics teaching practices. RBAs offer instructors a standardized method for empirically investigating the efficacy of their instructional practices and documenting the impacts of course transformations. Unlike course exams, the common usage of standardized RBAs across institutions uniquely supports instructors to compare their student outcomes over time or against multi-institutional data sets. While the number of RBAs and RBA-using instructors has increased over the last three decades, barriers to administering RBAs keep many physics instructors from using them.1,2 To mitigate these barriers, we have created full-service online RBA platforms (i.e., the Learning About STEM Student Outcomes [LASSO],3 Colorado Learning Attitudes About Science Survey for Experimental Physics [E-CLASS],4 and Physics Lab Inventory of Critical thinking [PLIC]5 platforms) that host, administer, score, and analyze RBAs. These web-based platforms can make it easier for instructors to use RBAs, especially as many courses have been forced to transition to online instruction. We hope that this editorial can serve as a guide for instructors considering administering RBAs online. In what follows, we examine common barriers to using RBAs, how online administration can remove those barriers, and the research into online administration of RBAs. In the supplementary material,6 we also include a practical how-to for administering RBAs online and sample student email wording. 
    more » « less
  4.
    This research paper examines the influence of interpersonal interactions on the course-level persistence intentions of online undergraduate engineering students. Online learning is increasing in enrollment and importance in engineering education. Online courses also continue to confront comparatively higher dropout levels than face-to-face courses. This study correspondingly explores students' perceptions of their online course experiences to better understand the factors that contribute to students' choices to remain in or drop out of their online undergraduate engineering courses. Data presented in this study were collected during fall 2019 and spring 2020 from three ABET-accredited online undergraduate engineering courses at a large southwestern public university: electrical engineering, engineering management, and software engineering. Participants were asked to respond to surveys at 12 time points during their 7.5-week online course. Each survey measured students' perceptions of course LMS dialog, perceptions of instructor practices, and peer support for completing the course. Participants also reported their intentions to persist in the course during each survey administration. A multi-level modeling analysis revealed that LMS dialog, perceptions of instructor practices, and peer support are related to course persistence intentions. Time was also a significant predictor, indicating that course persistence intentions decreased toward the end of the course. Additionally, interactions between demographic variables and the other predictors (perceptions of course LMS dialog, perceptions of instructor practices, and perceptions of peer support) were significant. As perceptions of course LMS dialog, instructor practices, and peer support increased, persistence intentions increased relatively less for veterans than for non-veterans. Persistence intentions increased relatively more for female students than for male students as their perceptions of instructor practices increased. Finally, increasing perceptions of peer support led to a relatively larger increase in the persistence intentions of non-transfer students than transfer students, and a relatively smaller increase for students working full-time than for other students.
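The demographic interactions described here amount to interaction terms in the model's linear predictor: the slope of a predictor is allowed to differ by group. A minimal sketch, with entirely hypothetical coefficients chosen to mirror the reported transfer/non-transfer pattern:

```python
def persistence_intention(peer_support, is_transfer,
                          b0=3.0, b_support=0.5, b_transfer=-0.2,
                          b_interaction=-0.3):
    """Linear predictor with an interaction term: the slope on
    peer_support differs between transfer (is_transfer=1) and
    non-transfer (is_transfer=0) students.

    All coefficient values are illustrative only, not fitted estimates.
    """
    return (b0 + b_support * peer_support + b_transfer * is_transfer
            + b_interaction * peer_support * is_transfer)

# Non-transfer students: slope on peer support is b_support = 0.5.
# Transfer students: slope is b_support + b_interaction = 0.2, i.e. a
# relatively smaller gain, mirroring the pattern reported above.
slope_non_transfer = persistence_intention(1, 0) - persistence_intention(0, 0)
slope_transfer = persistence_intention(1, 1) - persistence_intention(0, 1)
```

In a full multi-level model these fixed effects would sit alongside random intercepts for repeated measures within students, which this sketch omits.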
  5. Motivation: This is a complete paper. There was a sudden shift from traditional learning to online learning in Spring 2020 with the outbreak of COVID-19. Although online learning is not a new topic of discussion, universities, faculty, and students were not prepared for this sudden change. According to a recent article in The Chronicle of Higher Education, "even under the best of circumstances, virtual learning requires a different, carefully crafted approach to engagement." The Design Thinking course under study is a required freshman-level course offered at a Midwestern university. The course is offered in a flipped format: all the content to be learned is given to students beforehand, and the in-class session is used for active discussion and hands-on learning related to that content in small groups. The final learning objective of the course is a group project in which student groups are expected to produce functional prototypes that solve a real-world problem by following the Design Thinking process. Eighteen sections of the Design Thinking course were offered in Spring 2020. With the outbreak of COVID-19, some instructors decided to offer synchronous online classes (the instructor was present online during class time and provided orientation and guidance as in a normal class), while others offered asynchronous online classes (orientation from the instructor was delivered asynchronously; the instructor was online during the officially scheduled class time, but interactions were more like office hours). In a synchronous online class, students were required to be present synchronously at the team level during class time. In an asynchronous online class, students could work synchronously at the team level at any time prior to the deadline; they could work during class time but were not required to.
Through this complete paper, we are trying to understand student learning, social presence, and learner satisfaction with respect to different modes of instruction in a freshman-level Design Thinking course. Background: According to the literature, synchronous online learning has advantages such as interaction, a classroom environment, and better course quality, whereas asynchronous online learning has advantages such as self-controlled and self-directed learning. The disadvantages of synchronous online learning include the learning process, technology issues, and distraction. Social isolation, lack of interaction, and technology issues are a few disadvantages of asynchronous online learning. Problem Being Addressed: There is limited literature investigating different modes of online instruction in a Design Thinking course. Through this paper, we are trying to understand and share the effectiveness of synchronous and asynchronous modes of instruction in an online flipped Design Thinking course. The results could also help in this time of pandemic by shedding light on the more effective way to teach highly active, group-based classrooms for better student learning, social presence, and learner satisfaction. Method/Assessment: An end-of-semester survey was administered in Spring 2020 to understand student experiences in synchronous and asynchronous Design Thinking course sections. The survey was sent to the 720 students enrolled in the course in Spring 2020, and 324 students responded. Learning was measured using the survey instrument developed by Walker (2003), and social presence and learner satisfaction were measured using the survey modified by Richardson and Swan (2003). A Likert scale was used for survey responses. Anticipated Results: Data will be analyzed and the paper completed by the draft paper submission deadline. As the course under study is a flipped, active course with a significant group-work component, the anticipated result is that one mode of instruction will show higher student learning, social presence, and learner satisfaction than the other.
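Survey instruments like those cited here are commonly scored by averaging Likert-scale items into a composite, with negatively worded items reverse-coded first. A minimal sketch, assuming a 5-point scale and a made-up respondent; the item count and responses are hypothetical:

```python
def likert_composite(responses, reverse_coded=(), scale_max=5):
    """Mean composite score for a set of Likert items (1..scale_max).

    Items whose 0-based indices appear in reverse_coded are flipped
    (r -> scale_max + 1 - r) before averaging, the usual treatment for
    negatively worded items.
    """
    adjusted = [
        (scale_max + 1 - r) if i in reverse_coded else r
        for i, r in enumerate(responses)
    ]
    return sum(adjusted) / len(adjusted)

# One hypothetical respondent's answers to a 5-item scale, where the
# third item (index 2) is negatively worded:
score = likert_composite([4, 5, 2, 4, 3], reverse_coded={2})
print(score)  # prints 4.0
```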