In the United States, the onset of COVID-19 triggered a nationwide lockdown, which forced many universities to move their primary assessments from invigilated in-person exams to unproctored online exams. This abrupt change occurred midway through the Spring 2020 semester, providing an unprecedented opportunity to investigate whether online exams can provide meaningful assessments of learning relative to in-person exams on a per-student basis. Here, we present data from nearly 2,000 students across 18 courses at a large Midwestern university. Using a meta-analytic approach in which we treated each course as a separate study, we showed that online exams produced scores that closely resembled those from in-person exams at the individual level, despite the online exams being unproctored—as demonstrated by a robust correlation between online and in-person exam scores. Moreover, our data showed that cheating was either not widespread or ineffective at boosting scores, and the strong assessment value of online exams held regardless of the type of questions asked on the exam, the course level, academic discipline, or class size. We conclude that online exams, even when unproctored, are a viable assessment tool.
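The meta-analytic approach described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the per-course data structure and the pooling via Fisher's z transform are assumptions about how one might implement "treat each course as a separate study and correlate paired scores."

```python
# Illustrative sketch (assumed implementation): correlate each student's
# in-person and online exam scores within each course, then pool the
# per-course correlations via a sample-size-weighted Fisher z average.
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def pooled_r(courses):
    """courses: list of (in_person_scores, online_scores) pairs,
    one pair per course, scores paired by student."""
    zs, ws = [], []
    for in_person, online in courses:
        r = pearson_r(in_person, online)
        w = len(in_person) - 3          # standard weight for Fisher z
        zs.append(math.atanh(r) * w)    # Fisher z transform of r
        ws.append(w)
    return math.tanh(sum(zs) / sum(ws))  # back-transform the weighted mean
```

In practice one would also report a confidence interval and heterogeneity statistics per course, but the pooled correlation above captures the core of the per-student comparison.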
Cheating and Chegg: a Retrospective
In the spring of 2020, universities across America, and the world, abruptly transitioned to online learning. The transition required faculty to find novel ways to administer assessments and, in some cases, led students to find novel ways of cheating in their classes. The purpose of this paper is to provide a retrospective on cheating during online exams in the spring of 2020. It specifically examines honor code violations in a sophomore-level engineering course that enrolled more than 200 students. In this course, four pre-COVID assessments were given in class and six mid-COVID assessments were given online. This paper examines the increasing rate of cheating on these assessments and the profiles of the students who engaged in cheating. It compares students who violated the honor code by uploading exam questions with those who viewed solutions to uploaded questions. This paper also looks at the abuse of Chegg during exams and the responsiveness of Chegg's honor code team, and discusses the effectiveness of Chegg's user account data in pursuing academic integrity cases. Information is also provided on the response times of Chegg tutors in answering exam questions and the actual efficacy of cheating in this fashion.
- Award ID(s):
- 2013286
- PAR ID:
- 10290782
- Date Published:
- Journal Name:
- 2021 ASEE Annual Conference
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
ABSTRACT The article documents students' experiences with the shift online at the onset of the COVID-19 pandemic and provides informed recommendations to STEM instructors regarding academic integrity and student stress. Over 500 students were surveyed on these topics via a survey that included an open-ended question. Students experienced more stress and perceived a greater workload in online courses and therefore preferred in-person courses overall. Personal awareness of cheating during online exams is positively correlated with the proportion of cheating a student perceives. Fear of getting caught is the best cheating deterrent, while getting a better grade makes cheating most enticing. Randomization of questions and answer choices is perceived as a highly effective tool to reduce cheating and is reported as the least stress-inducing method. Inability to backtrack and time limits cause students the most stress. Students report that multiple-choice questions are the least effective question type for discouraging cheating and that oral exam questions cause the most stress. Use of a camera and lockdown browser, or being video- and audio-recorded, caused the majority of student stress. Yet nearly 60% agree that the combination of camera and lockdown browser is an effective deterrent. Recommendations: (i) Be transparent regarding academic dishonesty detection methods and penalties. (ii) Use online invigilating tools. (iii) Synchronize exams and (iv) randomize exam questions. (v) Allow backtracking. (vi) Avoid converting in-person exams to online exams; instead, explore new ways of designing exams for the online environment.
-
To defend against collaborative cheating in code-writing questions, instructors of courses with online, asynchronous exams can use the strategy of question variants. These variants are manually written questions, selected at random at exam time, that assess the same learning goal. To create them, instructors currently have to rely on intuition to balance two competing goals: variants must be different enough to defend against collaborative cheating, yet similar enough that students are assessed fairly. In this paper, we propose a data-driven investigation into these variants and apply it to a dataset of three midterm exams from a large introductory programming course. Our results show that (1) observable inequalities of student performance exist between variants and (2) these differences are not limited to score. Our results also show that the information gathered from our data-driven investigation can be used to provide recommendations for improving the design of future variants.
-
Proctoring educational assessments (e.g., quizzes and exams) has a cost, be it in faculty (and/or course staff) time or in money to pay for proctoring services. Previous estimates of the utility of proctoring (generally made by estimating the score advantage of taking an exam without proctoring) vary widely and have mostly used across-subjects experimental designs, sometimes with low statistical power. We investigated the score advantage of unproctored exams versus proctored exams using a within-subjects design for N = 510 students in an on-campus introductory programming course with 5 proctored exams and 4 unproctored exams. We found that students scored 3.32 percentage points higher on questions on unproctored exams than on proctored exams (p < 0.001). More interestingly, we discovered that this score advantage on unproctored exams grew steadily as the semester progressed, from around 0 percentage points at the start of the semester to around 7 percentage points by the end. As the most obvious explanation for this advantage is cheating, we refer to this behavior as the student population "learning to cheat". The data suggest both that more individuals are cheating and that the average benefit of cheating is increasing over the course of the semester. Furthermore, we observed that studying for unproctored exams decreased over the course of the semester while studying for proctored exams stayed constant. Lastly, we estimated the score advantage by question type and found that our long-form programming questions had the highest score advantage on unproctored exams, though there are multiple possible explanations for this finding.
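The within-subjects design described above can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the study's code: the data layout (per-student lists of proctored and unproctored percentages) is an assumption. Each student serves as their own control, so the per-student difference in mean scores directly estimates the unproctored advantage.

```python
# Hypothetical sketch of a within-subjects score-advantage estimate:
# for each student, take the difference between their mean unproctored
# and mean proctored exam percentages, then average across students.

def mean(xs):
    return sum(xs) / len(xs)

def score_advantage(students):
    """students: list of (proctored_scores, unproctored_scores) tuples,
    one per student, scores in percentage points.
    Returns the mean per-student unproctored advantage."""
    diffs = [mean(unproc) - mean(proc) for proc, unproc in students]
    return mean(diffs)
```

A paired t-test on the `diffs` list (e.g., via `scipy.stats.ttest_rel` on the raw means) would then give the significance level reported in the abstract; the within-subjects pairing is what gives the design its statistical power relative to across-subjects comparisons.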
-
Preparing for high-stakes exams in introductory physics courses is generally a self-regulated activity. Compared to other exam-reviewing strategies, doing practice exams has been shown to help students recognize gaps in their knowledge, encourage active practicing, and produce long-term retention. However, many students, particularly those struggling with the course material, are not guided by research-based study strategies and do not use practice exams effectively. Using data collected from a fully online course in Spring 2021, this study examines two interventions aimed at improving students' self-regulated studying behaviors and enhancing their metacognition during exam preparation. We found that a modified format of online practice exams, with one attempt per question and delayed feedback, increases the accuracy of feedback about student readiness for exams but does not change the accuracy of their predicted exam scores or their studying behaviors. Additionally, an added mock exam one week before the actual exam affects students' intentions for studying but does not impact actual study behaviors or facilitate metacognition. These results suggest that interventions designed to improve exam preparation likely need to include explicit instruction on study strategies and student beliefs about learning.