Title: How Common are Common Wrong Answers? Crowdsourcing Remediation at Scale.
Solving mathematical problems is cognitively complex, involving strategy formulation, solution development, and the application of learned concepts. However, gaps in students' knowledge or weakly grasped concepts can lead to errors. Teachers play a crucial role in predicting and addressing these difficulties, which directly influence learning outcomes. However, preemptively identifying misconceptions leading to errors can be challenging. This study leverages historical data to assist teachers in recognizing common errors and addressing gaps in knowledge through feedback. We present a longitudinal analysis of incorrect answers from the 2015-2020 academic years on two curricula, Illustrative Math and EngageNY, for grades 6, 7, and 8. We find consistent errors across 5 years despite varying student and teacher populations. Based on these Common Wrong Answers (CWAs), we designed a crowdsourcing platform for teachers to provide Common Wrong Answer Feedback (CWAF). This paper reports on an in vivo randomized study testing the effectiveness of CWAFs in two scenarios: next-problem-correctness within-skill and next-problem-correctness within-assignment, regardless of the skill. We find that receiving CWAF leads to a significant increase in correctness for consecutive problems within-skill. However, the effect was not significant for all consecutive problems within-assignment, irrespective of the associated skill. This paper investigates the potential of scalable approaches in identifying Common Wrong Answers (CWAs) and how the use of crowdsourced CWAFs can enhance student learning through remediation.
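To make the approach concrete, the following is a minimal, hypothetical sketch (in Python with pandas and SciPy) of the two steps the abstract describes: surfacing Common Wrong Answers from a platform's response logs, and comparing next-problem correctness between students who did and did not receive CWAF. The column names (problem_id, answer, correct, got_cwaf, next_correct) and the frequency thresholds are illustrative assumptions, not the schema or criteria actually used in the study.

import pandas as pd
from scipy.stats import chi2_contingency

def find_common_wrong_answers(logs: pd.DataFrame,
                              min_share: float = 0.10,
                              min_count: int = 20) -> pd.DataFrame:
    """Per problem, return incorrect answers given by at least min_share
    of responders and at least min_count times (thresholds are assumptions)."""
    wrong = logs[~logs["correct"]]
    # Count how often each distinct wrong answer appears per problem.
    counts = (wrong.groupby(["problem_id", "answer"])
                   .size().rename("n_wrong").reset_index())
    # Normalize by the total number of attempts on each problem.
    totals = logs.groupby("problem_id").size().rename("n_total")
    counts = counts.join(totals, on="problem_id")
    counts["share"] = counts["n_wrong"] / counts["n_total"]
    mask = (counts["share"] >= min_share) & (counts["n_wrong"] >= min_count)
    return counts[mask].sort_values(["problem_id", "share"],
                                    ascending=[True, False])

def cwaf_effect_p_value(students: pd.DataFrame) -> float:
    """Chi-square test of next-problem correctness against whether CWAF was
    shown; expects one row per student with boolean got_cwaf and next_correct."""
    table = pd.crosstab(students["got_cwaf"], students["next_correct"])
    chi2, p_value, _, _ = chi2_contingency(table)
    return p_value

The frequency threshold here stands in for whatever operational definition of "common" the authors adopted, and the chi-square test is a simple stand-in for the study's actual statistical models; both would need to be replaced with the paper's own criteria to reproduce its results.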
Award ID(s):
1950683 2118725
NSF-PAR ID:
10417271
Author(s) / Creator(s):
Date Published:
Journal Name:
Learning@Scale 2023
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The process of synthesizing solutions for mathematical problems is cognitively complex. Students formulate and implement strategies to solve mathematical problems, develop solutions, and make connections between their learned concepts as they apply their reasoning skills to solve such problems. Gaps in student knowledge or shallowly-learned concepts may cause students to guess at answers or otherwise apply the wrong approach, resulting in errors in their solutions. Despite the complexity of the synthesis process in mathematics learning, teachers' knowledge and ability to anticipate areas of potential difficulty are essential and correlated with student learning outcomes. Preemptively identifying the common misconceptions in students that result in subsequent incorrect attempts can be arduous and unreliable, even for experienced teachers. This paper aims to help teachers identify the incorrect attempts that commonly occur when students are working on math problems so that they can address the underlying gaps in knowledge and common misconceptions through feedback. We report on a longitudinal analysis of historical data from a computer-based learning platform, exploring the incorrect answers from prior school years ('15-'20) to establish the commonality of wrong answers on two Open Educational Resources (OER) curricula, Illustrative Math (IM) and EngageNY (ENY), for grades 6, 7, and 8. We observe that incorrect answers are pervasive across 5 academic years despite changes in the underlying student and teacher populations. Building on our findings regarding these Common Wrong Answers (CWAs), we report on the goals and task analysis that we leveraged in designing and developing a crowdsourcing platform for teachers to write Common Wrong Answer Feedback (CWAF) aimed at remediating the underlying causes of the CWAs. Finally, we report on an in vivo study analyzing the effectiveness of CWAFs using two approaches: first, using next-problem correctness as a dependent measure after receiving CWAF in an intent-to-treat analysis; second, using next-attempt correctness as a dependent measure after receiving CWAF in a treated analysis. With the rise in popularity and usage of computer-based learning platforms, this paper explores the potential benefits of scalability in identifying CWAs and the subsequent use of crowdsourced CWAFs in enhancing the student learning experience through remediation.
  2. Prior work analyzing tutoring sessions provided evidence that highly effective tutors, through their interaction with students and their experience, can perceptively recognize incorrect processes or “bugs” when students incorrectly answer problems. Researchers have studied these tutoring interactions, examining instructional approaches to address incorrect processes, and observed that the format of the feedback can influence learning outcomes. In this work, we recognize the incorrect answers caused by these buggy processes as Common Wrong Answers (CWAs). We examine the ability of teachers and instructional designers to identify CWAs proactively. As teachers and instructional designers deeply understand the common approaches and mistakes students make when solving mathematical problems, we examine the feasibility of proactively identifying CWAs and generating Common Wrong Answer Feedback (CWAFs) as a formative feedback intervention for addressing student learning needs. As such, we analyze CWAFs in three sets of analyses. We first report on the accuracy of the CWAs predicted by the teachers and instructional designers on the problems across two activities. We then measure the effectiveness of the CWAFs using an intent-to-treat analysis. Finally, we explore the existence of personalization effects of the CWAFs for the students working on the two mathematics activities.
  3. The 2021 return to face-to-face teaching and proctored exams revealed significant gaps in student learning during remote instruction. The challenge of supporting underperforming students is not expected to abate in the next 5-10 years as COVID-19-related learning losses compound structural inequalities in K-12 education. More recently, anecdotal evidence across courses shows declines in classroom attendance and student engagement. Lack of engagement indicates emotional barriers rather than intellectual deficiencies, and its growth coincides with the ongoing mental health epidemic. Regardless of the underlying reasons, professors are now faced with the unappealing choice of awarding failing grades to an uncomfortably large fraction of classes or awarding passing grades to students who do not seem prepared for the workforce or adult life in general. Faculty training, if it exists, addresses neither the scale of this situation nor the emotional/identity aspects of the problem. There is an urgent need for pedagogical remediation tools that can be applied without additional TA or staff resources, without training in psychiatry, and with only five or eight weeks remaining in the semester. This work presents two work-in-progress interventions for engineering faculty who face the challenges described above. In the first intervention, students can improve their exam score by submitting videos of reworked exams. The requirement of voiceover forces students to understand the thought process behind problems, even if they have copied the answers from a friend. Incorporating peer review into the assignment reduces the workload for instructor grading. This intervention has been successfully implemented in sophomore- and senior-level courses with positive feedback from both faculty and students. In the second intervention, students who fail the midterm are offered an automatic passing exam grade (typically 51%) in exchange for submitting a knowledge inventory and remediation plan. Students create a glossary of terms and concepts from the class and rank them by their level of understanding. Recent iterations of the remediation plan also include reflections on emotions and support networks. In February 2023, the project team will scale the interventions to freshman-level Introductory Programming, which has 400 students and the highest fail/withdrawal rate in the college. The large sample size will enable more robust statistics to correlate exam scores, intervention rubric items, and surveys on assignment effectiveness. Piloting interventions in a variety of environments and classes will establish best pedagogical practices that minimize instructors' workload and decision fatigue. The ultimate goal of this project is to benefit students and faculty through well-defined and systematic interventions across the curriculum.