


Title: Examining the role of assignment design and peer review on student responses and revisions to an organic chemistry writing-to-learn assignment

Research on student learning in organic chemistry indicates that students tend to focus on surface level features of molecules with less consideration of implicit properties when engaging in mechanistic reasoning. Writing-to-learn (WTL) is one approach for supporting students’ mechanistic reasoning. A variation of WTL incorporates peer review and revision to provide opportunities for students to interact with and learn from their peers, as well as revisit and reflect on their own knowledge and reasoning. However, research indicates that the rhetorical features included in WTL assignments may influence the language students use in their responses. This study utilizes machine learning to characterize the mechanistic features present in second-semester undergraduate organic chemistry students’ responses to two versions of a WTL assignment with different rhetorical features. Furthermore, we examine the role of peer review on the mechanistic reasoning captured in students’ revised drafts. Our analysis indicates that students include both surface level and implicit features of mechanistic reasoning in their drafts and in the feedback to their peers, with slight differences depending on the rhetorical features present in the assignment. However, students’ revisions appeared to be primarily connected to the peer review process via the presence of surface features in the drafts students read (as opposed to the feedback received). These findings indicate that further scaffolding focused on how to utilize information gained from the peer review process (i.e., both feedback received and drafts read) and emphasizing implicit properties could help support the utility of WTL for developing students’ mechanistic reasoning in organic chemistry.

 
Award ID(s):
2121123
NSF-PAR ID:
10540549
Author(s) / Creator(s):
; ;
Editor(s):
Lewis, Scott
Publisher / Repository:
Royal Society of Chemistry
Date Published:
Journal Name:
Chemistry Education Research and Practice
Volume:
25
Issue:
3
ISSN:
1109-4028
Page Range / eLocation ID:
721 to 741
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Peer review is useful for providing students with formative feedback, yet it is used less frequently in STEM classrooms and for supporting writing-to-learn (WTL). While research indicates the benefits of incorporating peer review into classrooms, less research is focused on students’ perceptions thereof. Such research is important as it speaks to the mechanisms whereby peer review can support learning. This study examines students’ self-reported approaches to and perceptions of peer review and revision associated with WTL assignments implemented in an organic chemistry course. Students responded to a survey covering how they approached peer review and revision and the benefits they perceived from participating in each. Findings indicate that the assignment materials guided students’ approaches during both peer review and revision. Furthermore, students described various ways both receiving feedback from their peers and reading their peers’ drafts were beneficial, but primarily connected their revisions to receiving feedback.

     
  2. East, Martin; Slomp, David (Eds.)
    Studies examining peer review demonstrate that students can learn from giving feedback to and receiving feedback from their peers, especially when they utilize information gained from the review process to revise. However, much of the research on peer review is situated within the literature regarding how students learn to write. With an increasing use of writing-to-learn in STEM classrooms, it is important to study how students engage in peer review for these types of writing assignments. This study sought to better understand how peer review and revision can support student learning for writing-to-learn specifically, using the lenses of cognitive perspectives of writing and engagement with written corrective feedback. Using a case study approach, we provide a detailed analysis of six students’ written artifacts in response to a writing-to-learn assignment that incorporated peer review and revision implemented in an organic chemistry course. Students demonstrated a range in the types of revisions they made and the extent to which the peer review process informed their revisions. Additionally, students exhibited surface, midlevel, and active engagement with the peer review and revision process. Considering the different engagement levels can inform how we frame peer review to students when using it as an instructional practice. 
  3. Peer assessment, as a form of collaborative learning, can engage students in active learning and improve their learning gains. However, current teaching platforms and programming environments provide little support to integrate peer assessment for in-class programming exercises. We identified challenges in conducting such exercises and adopting peer assessment through formative interviews with instructors of introductory programming courses. To address these challenges, we introduce PuzzleMe, a tool to help Computer Science instructors to conduct engaging in-class programming exercises. PuzzleMe leverages peer assessment to support a collaboration model where students provide timely feedback on their peers' work. We propose two assessment techniques tailored to in-class programming exercises: live peer testing and live peer code review. Live peer testing can improve students' code robustness by allowing them to create and share lightweight tests with peers. Live peer code review can improve code understanding by intelligently grouping students to maximize meaningful code reviews. A two-week deployment study revealed that PuzzleMe encourages students to write useful test cases, identify code problems, correct misunderstandings, and learn a diverse set of problem-solving approaches from peers. 
  4. In this paper, we present a science writing assignment in which students focus on targeting specific audiences when writing about a socioscientific issue as well as participate in a peer review process. This assignment helps students consider inclusive science communication in their writing, focusing on engaging unique audiences about the intersections of science and social justice. Students are introduced to evidence-based tools for formulating communication for unique audiences as well as for assessment of writing quality. This assignment is novel in that it helps students think about inclusion issues in STEM, science writing, and peer review, all of which are key disciplinary skills that are not always included in STEM courses. While this assignment was piloted in chemistry and environmental engineering courses, this assignment could easily be modified for other disciplines.

     
  5. The ability to analyze arguments is critical for higher-level reasoning, yet previous research suggests that standard university education provides only modest improvements in students’ analytical-reasoning abilities. What pedagogical approaches are most effective for cultivating these skills? We investigated the effectiveness of a 12-week undergraduate seminar in which students practiced a software-based technique for visualizing the logical structures implicit in argumentative texts. Seminar students met weekly to analyze excerpts from contemporary analytic philosophy papers, completed argument visualization problem sets, and received individualized feedback on a weekly basis. We found that seminar students improved substantially more on LSAT Logical Reasoning test forms than did control students (d = 0.71, 95% CI: [0.37, 1.04], p < 0.001), suggesting that learning how to visualize arguments in the seminar led to large generalized improvements in students’ analytical-reasoning skills. Moreover, blind scoring of final essays from seminar students and control students, drawn from a parallel lecture course, revealed large differences in favor of seminar students (d = 0.87, 95% CI: [0.26, 1.48], p = 0.005). Seminar students understood the arguments better, and their essays were more accurate and effectively structured. Taken together, these findings deepen our understanding of how visualizations support logical reasoning and provide a model for improving analytical-reasoning pedagogy.

     
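The effect sizes quoted in the last abstract above are Cohen's d values with 95% confidence intervals. As a minimal, self-contained sketch of how such statistics are computed (not a reproduction of that study's analysis: the scores below are invented placeholders, and the confidence interval uses one common large-sample approximation for the standard error of d):

```python
# Illustrative only: Cohen's d for two independent groups with a pooled
# standard deviation, plus an approximate large-sample 95% confidence interval.
# The group scores are invented placeholders, not data from the cited study.
import math
from statistics import mean, variance

def cohens_d(group1, group2):
    """Standardized mean difference using the pooled sample standard deviation."""
    n1, n2 = len(group1), len(group2)
    pooled_var = ((n1 - 1) * variance(group1) +
                  (n2 - 1) * variance(group2)) / (n1 + n2 - 2)
    return (mean(group1) - mean(group2)) / math.sqrt(pooled_var)

def d_confidence_interval(d, n1, n2, z=1.96):
    """Approximate 95% CI via a common large-sample variance estimate for d."""
    se = math.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
    return d - z * se, d + z * se

seminar = [14, 17, 13, 18, 15, 16, 12, 17]   # placeholder post-test scores
control = [13, 15, 14, 12, 16, 13, 15, 11]   # placeholder post-test scores

d = cohens_d(seminar, control)
low, high = d_confidence_interval(d, len(seminar), len(control))
print(f"d = {d:.2f}, 95% CI: [{low:.2f}, {high:.2f}]")
```

The printed line mirrors the format used in the abstract (effect size followed by its confidence interval); only the reported statistics above come from the study itself.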