Title: Supporting Instructors' Adoption of Peer Instruction
Peer Instruction (PI) is a lecture-based active learning approach in which students answer a challenging multiple-choice question individually, submit their answer, discuss it with peers, and then submit an answer again. Despite ample evidence of its effectiveness, PI has not been widely adopted by undergraduate computing instructors, owing to low awareness of PI, the effort needed to create PI questions, the lecture time that PI activities consume, and potential adverse reactions from students. We hypothesized that we could allay some of these concerns by hosting a three-day summer workshop on Peer Instruction for instructors and by building and sharing a free tool and question bank that support PI in an open-source ebook platform. We invited eighteen instructors to attend an in-person three-day workshop on PI in the summer of 2022. We collected their feedback through pre- and post-workshop surveys and semi-structured interviews. We report on the effect of the workshop on instructor attitudes toward and knowledge of PI, the barriers that prevented instructors from adopting the free tool, and feedback from instructors who used it. Most workshop attendees reported that they planned to use the tool in the fall semester, but fewer than half actually did. Responses from both users and non-users yield insights into the support instructors need to adopt new tools. This research informs future professional development workshops, tool development, and how to better support instructors interested in adopting Peer Instruction.
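To make the PI cycle described above concrete, the sketch below models a single PI question with an individual vote, a peer discussion, and a revote. It is a minimal illustration only; the class and method names (PIQuestion, record_vote, percent_correct) are assumptions for this sketch, not the actual tool's API.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class PIQuestion:
    """One Peer Instruction question: individual vote, peer discussion, then revote."""
    prompt: str
    choices: dict[str, str]                  # e.g. {"A": "O(n)", "B": "O(n log n)", ...}
    correct: str                             # key of the correct choice, e.g. "B"
    votes: dict[int, dict[str, str]] = field(
        default_factory=lambda: {1: {}, 2: {}})   # round -> {student -> chosen key}

    def record_vote(self, student: str, choice: str, round_: int) -> None:
        self.votes[round_][student] = choice

    def percent_correct(self, round_: int) -> float:
        ballots = self.votes[round_]
        if not ballots:
            return 0.0
        return 100 * sum(c == self.correct for c in ballots.values()) / len(ballots)

    def histogram(self, round_: int) -> Counter:
        """Answer distribution an instructor might display before deciding to discuss."""
        return Counter(self.votes[round_].values())
```

A common PI guideline is to run the peer discussion when the first-round correctness lands somewhere in the middle range; comparing percent_correct for rounds 1 and 2 then shows the gain from discussion.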
Award ID(s): 2043207
NSF-PAR ID: 10510820
Author(s) / Creator(s): ; ;
Publisher / Repository: ACM
Date Published:
Journal Name: Proceedings of the 55th ACM Technical Symposium on Computer Science Education V. 2
ISBN: 9798400704246
Page Range / eLocation ID: 1662 to 1663
Format(s): Medium: X
Location: Portland, OR, USA
Sponsoring Org: National Science Foundation
More Like This
1. Peer assessment, as a form of collaborative learning, can engage students in active learning and improve their learning gains. However, current teaching platforms and programming environments provide little support for integrating peer assessment into in-class programming exercises. We identified challenges in conducting such exercises and adopting peer assessment through formative interviews with instructors of introductory programming courses. To address these challenges, we introduce PuzzleMe, a tool that helps Computer Science instructors conduct engaging in-class programming exercises. PuzzleMe leverages peer assessment to support a collaboration model in which students provide timely feedback on their peers' work. We propose two assessment techniques tailored to in-class programming exercises: live peer testing and live peer code review. Live peer testing can improve students' code robustness by allowing them to create and share lightweight tests with peers. Live peer code review can improve code understanding by intelligently grouping students to maximize meaningful code reviews. A two-week deployment study revealed that PuzzleMe encourages students to write useful test cases, identify code problems, correct misunderstandings, and learn a diverse set of problem-solving approaches from peers.
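The abstract above does not spell out how PuzzleMe groups students, so the following is only a sketch of one plausible heuristic: pair students whose submissions pass different subsets of the shared tests, so each reviewer is likely to see an approach unlike their own. The function name and data shapes are hypothetical, not details from the paper.

```python
from itertools import combinations

def pair_for_review(passed: dict[str, frozenset[str]]) -> list[tuple[str, str]]:
    """Greedily pair students whose passed-test sets differ the most.

    passed maps a student id to the set of test names their submission passes.
    """
    def dissimilarity(a: str, b: str) -> int:
        # Symmetric difference: tests that exactly one of the two students passes.
        return len(passed[a] ^ passed[b])

    unpaired = set(passed)
    pairs: list[tuple[str, str]] = []
    while len(unpaired) >= 2:
        a, b = max(combinations(unpaired, 2), key=lambda p: dissimilarity(*p))
        pairs.append((a, b))
        unpaired -= {a, b}
    return pairs
```

With an odd number of students, one student remains unpaired and could simply join an existing pair.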
2. Peer feedback is a central activity in project-based design education. The prevalence of devices carried by students and the emergence of novel peer feedback systems make it possible to collect and share feedback between students immediately during class. However, pen and paper is thought to be more familiar, less distracting for students, and easier for instructors to implement and manage. To evaluate the efficacy of in-class digital feedback systems, we conducted a within-subjects study with 73 students during two weeks of a game design course. After short student presentations, while instructors provided verbal feedback, peers provided feedback either on paper or through a device. The study found that both methods yielded comments of similar quality and quantity, but the digital approach provided additional ways for students to participate and required less effort from the instructors. While both methods produced similar behaviors, students held inaccurate perceptions about their behavior with each method. We discuss design implications for technologies to support in-class feedback exchange.
3. In online or large in-person course sections, instructors often adopt an online homework tool to alleviate the burden of grading. While these systems can quickly tell students whether a multiple-choice or numeric answer is correct, they are unable to provide feedback on students' free body diagrams. Because sketching a free body diagram correctly is a foundational skill for solving engineering problems, the loss of feedback in this area is a detriment to students. To address the need for rapid feedback on students' free body diagram sketching, the research team developed an online, sketch-recognition system called Mechanix. The system allows students to sketch free body diagrams, including for trusses, and receive instant feedback on their sketches. The sketching feedback is ungraded. Once students have a correct sketch, they can enter the numeric answers for the problem and submit those for a grade. The platform thereby offers the grading convenience of other online homework systems while also helping students develop their free body diagram sketching skills. To assess the efficacy of this experimental system, standard concept inventories were administered pre- and post-semester for both experimental and control groups. The unfamiliarity or difficulty of some advanced problems in the Statics Concept Inventory, however, appeared to discourage students, and many would stop putting in effort after a few especially challenging problems. This effect was especially pronounced among the Construction majors compared with the Mechanical Engineering majors in the test group. To address this tendency and therefore collect more complete pre- and post-semester concept inventory data, the research group reordered the Statics Concept Inventory questions from more familiar to more challenging, based on the past performance of the initial students taking the survey. This paper describes the process and results of the effort to reorder this instrument in order to increase Construction student participation and, therefore, the researchers' ability to measure the impact of the Mechanix system.
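The reordering described in item 3 amounts to ranking questions by how often an earlier cohort answered them correctly and presenting the most familiar first. A minimal sketch of that idea follows; the function name and the shape of the performance data are assumptions for illustration, not details from the paper.

```python
def reorder_by_familiarity(correct: dict[str, int], attempts: dict[str, int]) -> list[str]:
    """Order question ids from most to least familiar, using an earlier cohort's
    per-question correct rate as a proxy for familiarity."""
    def correct_rate(qid: str) -> float:
        n = attempts.get(qid, 0)
        return correct.get(qid, 0) / n if n else 0.0

    return sorted(correct, key=correct_rate, reverse=True)
```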