- Award ID(s):
- 2021190
- NSF-PAR ID:
- 10488172
- Publisher / Repository:
- Physical Review Physics Education Research
- Date Published:
- Journal Name:
- Physical Review Physics Education Research
- Volume:
- 19
- Issue:
- 1
- ISSN:
- 2469-9896
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Henderson, Charles (Ed.) Preparing for high-stakes exams in introductory physics courses is generally a self-regulated activity. Compared to other exam-reviewing strategies, doing practice exams has been shown to help students recognize gaps in their knowledge, encourage active practicing, and produce long-term retention. However, many students, particularly students who are struggling with the course material, are not guided by research-based study strategies and do not use practice exams effectively. Using data collected from a fully online course in Spring 2021, this study examines two interventions aimed at improving students' self-regulated studying behaviors and enhancing student metacognition during exam preparation. We found that a modified format of online practice exams, with one attempt per question and delayed feedback, increases the accuracy of feedback about student readiness for exams but does not change the accuracy of their predicted exam scores or their studying behaviors. Additionally, an added mock exam one week before the actual exam impacts students' intentions for studying but does not impact actual study behaviors or facilitate metacognition. These results suggest that interventions designed to improve exam preparation likely need to include explicit instruction on study strategies and student beliefs about learning.
-
Exam preparation in introductory science courses is self-regulated. Practice testing has been shown to produce better learning than other strategies. However, many students do not use practice tests effectively when studying. This mixed-methods study examines two experiments aimed at improving student predictions about learning and studying. We found that scores on a mock exam impact students' intentions for studying but not their study habits. We also found that many underperforming students initially increase their use of ineffective study strategies rather than adopt a strategy change. Students who distribute studying throughout the semester and engage with course concepts more deeply demonstrate improvement and increased satisfaction. These results suggest that exam preparation interventions may need to include study strategy and metacognitive instruction.
-
We explore how course policies affect students' studying and learning when a second-chance exam is offered. High-stakes, one-off exams remain a de facto standard for assessing student knowledge in STEM, despite compelling evidence that other assessment paradigms such as mastery learning can improve student learning. Unfortunately, mastery learning can be costly to implement. We explore the use of optional second-chance testing to sustainably reap the benefits of mastery-based learning at scale. Prior work has shown that course policies affect students' studying and learning but has not compared these effects within the same course context. We conducted a quasi-experimental study in a single course to compare the effect of two grading policies for second-chance exams and the effect of increasing the size of the range of dates for students taking asynchronous exams. The first grading policy, called 90-cap, allowed students to optionally take a second-chance exam that would fully replace their score on a first-chance exam, except the second-chance exam would be capped at 90% credit. The second grading policy, called 90-10, combined students' first- and second-chance exam scores as a weighted average (90% max score + 10% min score). The 90-10 policy significantly increased the likelihood that marginally competent students would take the second-chance exam. Further, our data suggest that students learned more under the 90-10 policy, providing improved student learning outcomes at no cost to the instructor. Most students took exams on the last day an exam was available, regardless of how many days the exam was available.
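The two grading policies described above reduce to simple score arithmetic. The following is a minimal sketch of each rule as the abstract states it; the function names and the assumption that scores are on a 0–100 scale are illustrative, not taken from the study's materials:

```python
def ninety_cap(first_score: float, second_score: float) -> float:
    """90-cap policy (as described in the abstract): the optional
    second-chance score fully replaces the first-chance score,
    but is capped at 90% credit."""
    return min(second_score, 90.0)

def ninety_ten(first_score: float, second_score: float) -> float:
    """90-10 policy: weighted average of the two attempts,
    90% weight on the higher score and 10% on the lower."""
    return 0.9 * max(first_score, second_score) + 0.1 * min(first_score, second_score)
```

For example, a student who scores 70 on the first attempt and 95 on the second receives 90.0 under 90-cap but 92.5 under 90-10, which illustrates why the 90-10 weighting can make the retake more attractive to marginally competent students.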
-
The metacognitive strategies of planning, monitoring, and evaluating can be promoted through systematic reflection to drive self-directed, lifelong learning. This article reports on a three-year study on systematic written reflection within an undergraduate Fluid Mechanics course to promote planning, monitoring, and evaluation. Students were prompted weekly to reflect on their in-class problem-solving, classroom and exam preparation, performance, behaviors, and learning in a flipped classroom at a large southeastern U.S. university. In addition, they received intentional instruction on how to plan, monitor, and evaluate their problem-solving during class. To enable a comparative assessment, a flipped classroom without these interventions was also implemented as a non-experimental cohort. The cohorts were compared using a final exam, a concept inventory, and the Metacognitive Activities Inventory (MCAI). The MCAI indicated a significantly higher positive change (pre- to post-course) in self-regulatory behavior for the experimental cohort (p = 0.037). The weekly reflections were studied using an inductive content analysis to assess students' self-regulatory behaviors. They were also used to investigate statistical associations between reflection content and course outcomes. This revealed that academic self-discipline via planning, monitoring one's work, or being careful and diligent may be as aligned with course performance in STEM as is practice with the problem-solving itself. The effects for the final exam in the experimental cohort were positive overall as well as statistically or practically significant for various demographic strata. These results provided evidence for the potential enhancement of course performance with metacognition support. A positive shift in students' perspectives regarding the value of the reflection questions was observed throughout the study.
Therefore, as an implementation guide for other educators, the reflection questions and any changes made in posing them to students are discussed chronologically. Overall, the study points to the desirability of providing metacognition support in a STEM course.
-
In the United States, the onset of COVID-19 triggered a nationwide lockdown, which forced many universities to move their primary assessments from invigilated in-person exams to unproctored online exams. This abrupt change occurred midway through the Spring 2020 semester, providing an unprecedented opportunity to investigate whether online exams can provide meaningful assessments of learning relative to in-person exams on a per-student basis. Here, we present data from nearly 2,000 students across 18 courses at a large Midwestern University. Using a meta-analytic approach in which we treated each course as a separate study, we showed that online exams produced scores that highly resembled those from in-person exams at an individual level despite the online exams being unproctored—as demonstrated by a robust correlation between online and in-person exam scores. Moreover, our data showed that cheating was either not widespread or ineffective at boosting scores, and the strong assessment value of online exams was observed regardless of the type of questions asked on the exam, the course level, academic discipline, or class size. We conclude that online exams, even when unproctored, are a viable assessment tool.