ABSTRACT
CONTEXT: This paper examines an engineering dynamics course at Purdue University that was specifically designed to create an active, blended, and collaborative environment. In addition to in-person classes and support, students have access to blended content such as solution videos, mechanics visualizations, a course discussion forum, and interactive simulations.
PURPOSE: Many studies have shown that students’ engagement in an online discussion forum enhances their learning performance (Davies & Graff, 2005; Hrastinski, 2008). However, our previous research showed that students’ engagement in the online forum of our dynamics course differed significantly across demographic groups: women and white or Asian American students were more likely to be involved in online discussions than men and international or Hispanic students (Duan et al., 2018). In this paper, we take that analysis further by examining whether the observed differences in online engagement mediate or moderate student performance.
APPROACH: To answer our research question, we first investigate the mediation effect with two models: one with race/international status as the mediating variable and gender identity as a control, and one with gender identity as the mediating variable and race/international status as a control. Second, we investigate the moderation effect of demographic factors with a regression model that includes interaction terms relating each demographic group's discussion forum engagement to overall performance. These interaction terms test whether a moderating relationship exists in which demographic factors impact online engagement, which in turn impacts course performance.
CONCLUSIONS: We find that gender identity is the only demographic factor that significantly moderates the effect of a student's engagement on their performance. Our previous work showed that students of various racial and ethnic identities do engage differently in the discussion forum; however, the present analysis was unable to detect any significant demographic differences in student engagement. Our paper contributes to understanding the mechanisms through which students' engagement can translate into academic performance by focusing on their demographic background. The moderating role of students' demographic background calls for a more targeted design of instructional tools in blended and collaborative environments to better support students from various backgrounds.
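To make the moderation test described in the approach more concrete, the sketch below fits an ordinary least squares model with an engagement-by-gender interaction term, which is one standard way to implement such an analysis. The toy data and variable names (forum_posts, gender, intl_status, course_grade) are illustrative placeholders, not the study's actual dataset or model specification.

```python
# Hypothetical sketch of a moderation analysis: an OLS regression with an
# engagement x gender interaction term. The toy data and column names are
# placeholders, not the study's dataset.
import pandas as pd
import statsmodels.formula.api as smf

# Toy records: forum engagement, demographics, and final course grade.
df = pd.DataFrame({
    "forum_posts":  [2, 15, 7, 20, 1, 12, 9, 18, 3, 14],
    "gender":       ["W", "W", "M", "M", "W", "M", "W", "M", "M", "W"],
    "intl_status":  ["D", "I", "D", "D", "I", "D", "I", "D", "D", "I"],
    "course_grade": [71, 92, 78, 88, 65, 84, 86, 90, 70, 93],
})

# Moderation: does gender identity change the slope of engagement on performance?
moderation = smf.ols(
    "course_grade ~ forum_posts * C(gender) + C(intl_status)", data=df
).fit()
print(moderation.summary())

# A significant forum_posts:C(gender)[T.W] coefficient would indicate that the
# engagement-performance relationship differs across gender identities.
```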
PEDI-Piazza Explorer Dashboard for Intervention
Analytics on how students navigate online learning tools over the duration of an assignment are scarce. Knowing how students use online tools before a course ends could positively impact students' learning outcomes. We introduce PEDI (Piazza Explorer Dashboard for Intervention), a tool that analyzes forum activity on Piazza, a question-and-answer forum, and presents visualizations of it to instructors. We outline the design principles and data-informed recommendations used to design PEDI. Our prior research revealed two critical periods in students' forum engagement over the duration of an assignment: engagement in the first half of the assignment window positively correlates with class-average performance, whereas extremely high engagement toward the deadline predicts lower class-average performance. PEDI uses these findings to detect and flag troubling engagement levels and informs instructors through clear visualizations to promote data-informed interventions. By providing such insights to instructors, PEDI may improve class performance and pave the way for a new generation of online tools.
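As a rough illustration of how findings like these could drive flagging inside a dashboard, the sketch below classifies an assignment's forum activity by when it occurs in the assignment window. The thresholds, cutoffs, and function names are assumptions chosen for illustration, not PEDI's actual rules.

```python
# Illustrative flagging rule suggested by the findings above: compare activity
# in the first half of the assignment window against a spike near the deadline.
# Thresholds and names are assumptions, not PEDI's actual implementation.
from datetime import datetime, timedelta
from typing import List

def engagement_flags(post_times: List[datetime], start: datetime,
                     deadline: datetime, early_min: float = 0.3,
                     late_max: float = 0.5) -> List[str]:
    """Return warning flags based on when forum posts fall in the window."""
    if not post_times:
        return ["no_engagement"]
    span = (deadline - start).total_seconds()
    elapsed = [(t - start).total_seconds() / span for t in post_times]
    early_share = sum(e < 0.5 for e in elapsed) / len(elapsed)
    late_share = sum(e > 0.9 for e in elapsed) / len(elapsed)
    flags = []
    if early_share < early_min:   # little activity in the first half
        flags.append("low_early_engagement")
    if late_share > late_max:     # most activity crammed near the deadline
        flags.append("deadline_spike")
    return flags

# Example: an assignment where nearly all posts arrive in the final hours.
start, due = datetime(2021, 9, 1), datetime(2021, 9, 8)
posts = [due - timedelta(hours=h) for h in (1, 2, 3, 5, 150)]
print(engagement_flags(posts, start, due))  # ['low_early_engagement', 'deadline_spike']
```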
- Award ID(s):
- 1821475
- PAR ID:
- 10392591
- Editor(s):
- Harms, Kyle; Cunha, Jácome; Oney, Steve; Kelleher, Caitlin
- Date Published:
- Journal Name:
- 2021 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC)
- Page Range / eLocation ID:
- 1-4
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Purpose: We gathered examples from our extended collaboration helping educators move online while avoiding synchronous meetings. "gPortfolios" are public (to the class) pages where students write responses to carefully constructed engagement routines. Students then discuss their work with instructors and peers in threaded comments. gPortfolios usually include engagement reflections, formative self-assessments, and automated quizzes. These assessments support and document learning while avoiding instructor "burnout" from grading. gPortfolios can be implemented using Google Docs and Forms or any learning management system.
Methodology: We report practical insights gained from design-based implementation research. This research explored the late Randi Engle's principles for productive disciplinary engagement and expansive framing. Engle used current theories of learning to foster student discussions that were both authentic to the academic discipline at hand and productive for learning. This research also used new approaches to assessment to support Engle's principles. The result is a comprehensive approach to online instruction and assessment that is effective and efficient for both students and teachers.
Findings: Our approach "frames" (i.e., contextualizes) online engagement using each learner's own experiences, perspectives, and goals. Writing this account revealed how the framing differed across courses. Secondary biology students framed each assignment independently. Secondary English and history students framed assignments as elements of a personalized capstone presentation; the history students further used a self-selected "historical theme." Graduate students in an educational assessment course framed each assignment using a real or imagined curricular aim and context.
Originality: Engle's ideas have yet to be widely taken up in online education.
-
Online forums are an integral part of modern-day courses, but motivating students to participate in educationally beneficial discussions can be challenging. Our proposed solution is to initialize (or "seed") a new course forum with comments from past instances of the same course that are intended to trigger discussion that is beneficial to learning. In this work, we develop methods for selecting high-quality seeds and evaluate their impact over one instance of a 186-student biology course. We designed a scale for measuring the "seeding suitability" score of a given thread (an opening comment and its ensuing discussion). We then constructed a supervised machine learning (ML) model for predicting the seeding suitability score of a given thread. This model was evaluated in two ways: first, by comparing its performance to the expert opinion of the course instructors on test/holdout data; and second, by embedding it in a live course, where it was actively used to facilitate seeding by the course instructors. For each reading assignment in the course, we presented a ranked list of seeding recommendations to the course instructors, who could review the list and filter out seeds with inconsistent or malformed content. We then ran a randomized controlled study in which one group of students was shown seeds recommended by the ML model, and another group was shown seeds recommended by an alternative model that ranked seeds purely by the length of discussion they generated in previous course instances. We found that students who received posts from either seeding model generated more discussion than a control group in the course that did not receive seeded posts. Furthermore, students who received seeds selected by the ML-based model showed higher levels of engagement, as well as greater learning gains, than those who received seeds ranked by length of discussion.
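As an illustration of the kind of pipeline such a system could use, the sketch below trains a simple text regressor on past threads labeled with seeding-suitability ratings and ranks candidate seeds for a new assignment. The paper does not specify its model or features, so the TF-IDF + ridge choice, the example threads, and the ratings are all placeholders.

```python
# Illustrative sketch of the seeding workflow described above: score past
# threads for "seeding suitability" and rank candidate seeds for a new
# assignment. Model choice, example threads, and ratings are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Opening comments from past course instances with hypothetical suitability ratings.
past_threads = [
    "How does ATP synthase couple proton flow to ATP production?",
    "Can someone explain why enzymes lower activation energy?",
    "Which page is this week's reading on?",
]
suitability = [4.5, 4.0, 1.0]

model = make_pipeline(TfidfVectorizer(stop_words="english"), Ridge())
model.fit(past_threads, suitability)

# Rank candidate seeds; instructors review the list and filter out malformed ones.
candidates = [
    "Why do mutations in DNA polymerase affect replication fidelity?",
    "Is the quiz multiple choice?",
]
ranked = sorted(zip(model.predict(candidates), candidates), reverse=True)
for score, text in ranked:
    print(f"{score:.2f}  {text}")
```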
-
Purpose: As online course enrollments increase, it is important to understand how common course features influence students' behaviors and performance. Asynchronous online courses often include a discussion forum to promote community through interaction between students and instructors. Students interact both socially and cognitively; instructors' engagement often demonstrates social or teaching presence. Students' engagement in the discussions introduces both intrinsic and extraneous cognitive load. The purpose of this study is to validate an instrument for measuring cognitive load in asynchronous online discussions.
Design/methodology/approach: This study presents the validation of the NASA-TLX instrument for measuring cognitive load in asynchronous online discussions in an introductory physics course.
Findings: The instrument demonstrated reliability for a model with four subscales across all five discrete tasks. This study is foundational for future work that aims at testing the efficacy of interventions and reducing extraneous cognitive load in asynchronous online discussions.
Research limitations/implications: Nonresponse error due to the unincentivized, voluntary nature of the survey introduces a sample-related limitation.
Practical implications: This study provides a strong foundation for future research focused on testing the effects of interventions aimed at reducing extraneous cognitive load in asynchronous online discussions.
Originality/value: This is a novel application of the NASA-TLX instrument for measuring cognitive load in asynchronous online discussions.
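As one concrete reliability check in the spirit of this validation, the sketch below computes Cronbach's alpha for a single subscale from hypothetical NASA-TLX-style ratings. The study's actual analysis of four subscales across five tasks may rely on different procedures (e.g., factor-analytic methods), so this is only an illustrative computation.

```python
# Minimal reliability check on hypothetical ratings: Cronbach's alpha for one
# subscale. Not the study's actual validation procedure, only an illustration.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of ratings for one subscale."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical 0-100 ratings from five respondents on three items of a subscale.
ratings = np.array([
    [70, 65, 75],
    [40, 45, 50],
    [80, 85, 78],
    [55, 60, 58],
    [30, 35, 40],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```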
-
The recent public releases of AI tools such as ChatGPT have forced computer science educators to reconsider how they teach. These tools have demonstrated considerable ability to generate code and answer conceptual questions, rendering them highly useful for completing CS coursework. While overreliance on AI tools could hinder students' learning, we believe they have the potential to be a helpful resource for students and instructors alike. We propose a novel system for instructor-mediated GPT interaction in a class discussion board. By automatically generating draft responses to student forum posts, GPT can help teaching assistants (TAs) respond to student questions in a more timely manner, giving students an avenue to receive fast, quality feedback on their solutions without turning to ChatGPT directly. Additionally, since instructors are involved in the process, they can ensure that the information students receive is accurate, and they can provide students with incremental hints that encourage them to engage critically with the material rather than copying an AI-generated snippet of code. We use Piazza, a popular educational forum where TAs help students via text exchanges, as the venue for GPT-assisted TA responses to student questions. Student questions are sent to GPT-4 alongside the assignment instructions and a customizable prompt, both of which are stored in editable, instructor-only Piazza posts. We demonstrate an initial implementation of this system and provide examples of student questions that highlight its benefits.
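A minimal sketch of the drafting step such a system might use is shown below: the student question, the assignment instructions, and an instructor-editable prompt are sent to GPT-4, and the result is returned as a draft for a TA to review before posting. Piazza retrieval and posting are omitted, and the function name and prompt structure are assumptions rather than the authors' implementation.

```python
# Hedged sketch of the draft-generation step: combine an instructor-editable
# prompt, assignment instructions, and a student question into a GPT-4 call.
# Piazza integration is omitted; names and prompt layout are assumptions.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def draft_ta_response(question: str, assignment_instructions: str,
                      instructor_prompt: str) -> str:
    """Return a GPT-generated draft answer for a TA to edit before posting."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": instructor_prompt},
            {"role": "user",
             "content": f"Assignment instructions:\n{assignment_instructions}\n\n"
                        f"Student question:\n{question}"},
        ],
    )
    return response.choices[0].message.content

# Example usage with placeholder course content.
print(draft_ta_response(
    question="My recursive solution hits a stack overflow on large inputs. Why?",
    assignment_instructions="Implement merge sort without using built-in sort functions.",
    instructor_prompt="You are a TA. Give incremental hints, not complete code.",
))
```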