Battestilli, Lina; Rebelsky, Samuel A.; Shoop, Libby (Eds.)
We compare the exam security of three proctoring regimens for synchronous, computer-based, Bring-Your-Own-Device (BYOD) exams in a computer science class: online un-proctored, online proctored via Zoom, and in-person proctored. We performed two randomized crossover experiments to compare these proctoring regimens. The first study measured the score advantage students receive when taking un-proctored online exams rather than Zoom-proctored online exams. The second study measured the score advantage of students taking Zoom-proctored online exams rather than in-person proctored exams. In both studies, students took six 50-minute exams on their own devices; each exam included two coding questions and 8–10 non-coding questions. We find that students score 2.3% higher on non-coding questions when taking exams in the un-proctored format compared to Zoom proctoring. No statistically significant advantage was found for the coding questions. Most non-coding questions were randomized so that students received different versions; for the few questions where all students received exactly the same version, the score advantage rose to 5.2%. From the second study, we find no statistically significant difference between students' performance on Zoom-proctored and in-person proctored exams. Based on these results, we recommend educators incorporate some form of proctoring along with question randomization to mitigate cheating concerns in BYOD exams.
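The abstract does not specify the statistical model used to estimate these score advantages; a crossover design would typically be analyzed with a mixed-effects model accounting for exam and ordering effects. As a minimal illustrative sketch only, the snippet below shows how a within-student comparison of scores under two proctoring conditions could be run as a paired test. All data, sample sizes, and effect sizes here are synthetic assumptions, not the study's actual data or method.

```python
# Hypothetical sketch of a within-student (paired) comparison of exam scores
# under two proctoring conditions. Synthetic data only; the actual study's
# analysis may differ (e.g., a mixed-effects crossover model).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_students = 200
# Synthetic per-student percentage scores on non-coding questions.
zoom_proctored = rng.normal(loc=75.0, scale=10.0, size=n_students)
# Assume a hypothetical ~2.3-point advantage in the un-proctored condition.
un_proctored = zoom_proctored + rng.normal(loc=2.3, scale=5.0, size=n_students)

# Positive differences indicate an advantage when un-proctored.
diff = un_proctored - zoom_proctored

t_stat, p_value = stats.ttest_rel(un_proctored, zoom_proctored)
print(f"mean advantage = {diff.mean():.2f} points, t = {t_stat:.2f}, p = {p_value:.4f}")
```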

