
This content will become publicly available on December 21, 2023

Title: You Asked, Now What? Modeling Students' Help-Seeking and Coding Actions from Request to Resolution
Demand for education in Computer Science has increased markedly in recent years. With this increased demand has come a greater need for student support, especially in courses with large programming projects. Instructors commonly provide online forums or office hours to address this high volume of help requests. Identifying what types of questions students ask in these interactions, and what triggers their help requests, can in turn help instructors better manage limited help-providing resources. In this study, we explore students' help-seeking actions through the two channels mentioned above and investigate their coding actions before help requests to better understand what motivates students to seek help on programming projects. We collected students' help request data and commit logs from two Fall offerings of a CS2 course. In our analysis, we hypothesized that different types of questions would be associated with different behavioral patterns. We therefore first categorized students' help requests by content (e.g., Implementation, General Debugging, or Addressing Teaching Staff (TS) Test Failures) and found that General Debugging is the most frequently asked question type. We then analyzed how the popularity of each request type changed over time. Our results suggest that Implementation requests are more common early in the project cycle, shifting toward General Debugging and Addressing TS Failures later on. We also calculated students' commit frequency in the hour before their help requests; the results show that commit frequency is significantly lower before Implementation requests and significantly higher before TS Failure requests. Moreover, we checked whether students changed their source code or test code before each help request.
The results show that Implementation requests are associated with a higher chance of source code changes, while coverage questions are associated with more test code changes. We also use a Markov Chain model to characterize students' action sequences before, during, and after their requests. Finally, we explored students' progress after office hours interactions and found that over half of the students improved the correctness of their code within 20 minutes of the end of an office hours interaction addressing TS failures.
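The Markov Chain analysis mentioned above amounts to estimating a first-order transition matrix from observed action sequences. The sketch below uses hypothetical action labels (`edit_source`, `edit_test`, `commit`, `request_help`); the study's actual coding scheme and transition probabilities are not reproduced here.

```python
from collections import defaultdict

def transition_matrix(sequences):
    """Estimate first-order Markov transition probabilities from action sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    probs = {}
    for a, row in counts.items():
        total = sum(row.values())
        probs[a] = {b: c / total for b, c in row.items()}
    return probs

# Hypothetical action sequences for illustration only.
seqs = [
    ["edit_source", "commit", "request_help", "edit_source"],
    ["edit_test", "commit", "request_help", "commit"],
]
P = transition_matrix(seqs)
```

Each row of `P` is the conditional distribution over the next action given the current one, which is what lets the model compare behavior before, during, and after a help request.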
Nigel Bosch; Antonija Mitrovic; Agathe Merceron
Journal Name:
Journal of Educational Data Mining
Sponsoring Org:
National Science Foundation
More Like this
  1. Mitrovic, A. ; Bosch, N. (Ed.)
    In computer science education, timely help-seeking during large programming projects is essential for student success. Help-seeking in typical courses happens in office hours and through online forums. In this research, we analyze students' coding activities and help requests to understand the interaction between these activities. We collected students' help requests during coding assignments on two different platforms in a CS2 course and categorized those requests into eight categories (including Implementation, Addressing Test Failures, General Debugging, etc.). We then analyzed the proportion of each type of request and how the proportions changed over time. We also collected students' coding status (including which part of the code changed and the frequency of commits) before they sought help, to investigate whether students share similar code-change behavior leading to certain types of help requests.
  2. Akram, Bita ; Shi, Yang ; Brusilovsky, Peter ; I-han Hsiao, Sharon ; Leinonen, Juho (Ed.)
    Promptly addressing students' help requests on their programming assignments has become more and more challenging in computer science education. Since the pandemic, most instructors use online office hours to answer questions. Prior studies have shown increased student participation with online office hours. This popularity has led to significantly longer wait times in the office hours queue, and different strategies for selecting the next student to help can affect wait time. For example, prioritizing students who have not yet been seen on the day of the deadline extends the wait time for students who frequently rejoin the queue. To better understand this problem, we explored students' behavior while they wait in the queue. We investigated how long students are willing to wait by modeling the distribution of cancellation times, finding that most students cancel their help request after waiting 49 minutes. We then looked at students' coding actions during the waiting period and found that only 21% of students made commits while waiting. Surprisingly, students who waited for hours did not commit their work for automated feedback. Our findings suggest that time in the queue should be considered, in addition to other factors such as last interaction, when selecting the next student to help during office hours, in order to minimize canceled interactions.
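The cancellation-time modeling above can be approximated with a simple empirical survival function: the fraction of requests still in the queue after t minutes, and the first minute at which that fraction drops to one half. The wait times below are fabricated for illustration; the 49-minute figure comes from the study's actual queue logs, not from this sketch.

```python
def survival(wait_minutes, t):
    """Fraction of (eventually canceled) requests still in the queue after t minutes."""
    return sum(1 for w in wait_minutes if w > t) / len(wait_minutes)

def median_wait(wait_minutes):
    """Smallest whole minute t at which no more than half of the requests remain."""
    t = 0
    while survival(wait_minutes, t) > 0.5:
        t += 1
    return t

# Hypothetical cancellation times (minutes), for illustration only.
waits = [5, 12, 30, 45, 50, 55, 70, 120]
```

A fitted parametric distribution (e.g., a Weibull) would serve the same purpose; the empirical version keeps the idea visible.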
  3. Hsiao, I-Han ; Sahebi, Shaghayegh ; Bouchet, Francois ; Vie, Jill-Jenn (Ed.)
    As Computer Science has increased in popularity, so too have class sizes and demands on faculty to provide support. It is therefore more important than ever for us to identify new ways to triage student questions, identify common problems, target students who need the most help, and better manage instructors' time. By analyzing interaction data from office hours, we can identify common patterns and help to guide future help-seeking. My Digital Hand (MDH) is an online ticketing system that allows students to post help requests and instructors to prioritize support and track common issues. In this research, we collected and analyzed a corpus of student questions from across six semesters of a CS2 course focused on object-oriented programming [17]. As part of this work, we grouped the interactions into five categories, analyzed the distribution of help requests, balanced the categories with the Synthetic Minority Oversampling Technique (SMOTE), and trained an automatic classifier based on LightGBM to classify student requests. We found that over 69% of the questions were unclear or barely specified. We demonstrated the stability of the model across semesters through leave-one-out cross-validation, and the target model achieves an accuracy of 91.8%. Finally, we find that online office hours can provide more help for more students.
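SMOTE, used above to balance the request categories, synthesizes new minority-class points by interpolating between a minority sample and one of its nearest minority-class neighbours. The following is a minimal sketch of that idea on made-up 2-D points, not the imbalanced-learn implementation the authors likely used.

```python
import random

def smote(minority, n_new, k=2, seed=0):
    """Minimal SMOTE: synthesize n_new points, each interpolated between a
    random minority sample and one of its k nearest minority neighbours."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest neighbours of x by squared Euclidean distance (x excluded).
        neighbours = sorted(
            (p for p in minority if p is not x),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p)),
        )[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # random position along the segment x -> nb
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, nb)))
    return synthetic

# Hypothetical minority-class feature vectors, for illustration only.
minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
synthetic = smote(minority, 5)
```

Because every synthetic point lies on a segment between two real minority points, the oversampled class stays inside its original feature region rather than being duplicated verbatim.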
  4. Student perceptions of the complete online transition of two CS courses in response to the COVID-19 pandemic. Due to the COVID-19 pandemic, universities across the globe switched from traditional Face-to-Face (F2F) course delivery to fully online delivery. Our university declared during Spring break that students would not return to campus and that all courses must be delivered fully online starting two weeks later. This was challenging for both students and instructors. In this evidence-based practice paper, we present results of end-of-semester student surveys from two Spring 2020 CS courses: a programming-intensive CS2 course, and a senior theory course in Formal Languages and Automata (FLA). Students indicated the course components they perceived as most beneficial to their learning, before and after the online transition, and their preferences for each regarding online vs. F2F delivery. By comparing student reactions across courses, we gain insight into which components are easily adapted to online delivery and which require further innovation. COVID was unfortunate, but it gave a rare opportunity to compare students' reflections on F2F instruction with online instructional materials for half a semester vs. entirely online delivery of the same course during the second half. The circumstances are unique, but we were able to acquire insights for future instruction. Some course components were perceived to be more useful either before or after the transition, and preferences were not the same in the two courses, possibly due to differences between the courses. Students in both courses found prerecorded asynchronous lectures significantly less useful than in-person lectures. For CS2, online office hours were significantly less useful than in-person office hours, but we found no significant difference for FLA. CS2 students felt less supported by their instructor after the online transition, but FLA students indicated no significant difference.
FLA students found unproctored online exams offered through Canvas more stressful than in-person proctored exams, but CS2 students indicated the opposite. CS2 students indicated that visual materials from an eTextbook were more useful to them after going online than before, while FLA students indicated no significant difference. Overall, FLA students significantly preferred the traditional F2F version of the course, while no significant difference was detected for CS2 students. We did not find significant effects of gender on the preference for one mode over the other. A serendipitous outcome was learning that some changes forced by circumstance should be considered for long-term adoption. Offering online lab sessions and online exams in which the questions are primarily multiple choice are possible candidates. However, we found that students need to feel the presence of their instructor to feel properly supported. To determine which course components need further improvement before transitioning to a fully online mode, we computed a logistic regression model. The dependent variable is the student's preference for F2F or fully online; the independent variables are the course components before and after the online transition. For both courses, in-person lectures were a significant factor negatively affecting students' preference for the fully online mode. Similarly, for CS2, in-person labs and in-person office hours were significant factors pushing students' preferences toward F2F mode.
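The logistic regression above can be illustrated in miniature: with preference for fully online coded as 1, a negative learned weight on a course component means that valuing the component pushes students toward F2F. This is a plain gradient-descent sketch on fabricated data, not the authors' fitted model.

```python
import math

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Batch gradient descent for logistic regression.
    Returns weights (one per feature) and an intercept."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1 / (1 + math.exp(-z))  # predicted P(prefers fully online)
            err = p - yi
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# Fabricated example: students who rated in-person lectures beneficial (x=1)
# preferred F2F (y=0), so the learned weight should come out negative.
X = [(1.0,), (1.0,), (0.0,), (0.0,)]
y = [0, 0, 1, 1]
w, b = fit_logistic(X, y)
```

In practice a statistics package would also report significance for each coefficient, which is what the paper relies on when calling a component a "significant factor".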
  5. In Computer Science (CS) education, instructors use office hours for one-on-one help-seeking. Prior work has shown that traditional in-person office hours may be underutilized. In response, many instructors are adding or transitioning to virtual office hours. Our research compares in-person and online office hours to investigate differences in performance, interaction time, and the characteristics of the students who utilize each. We analyze a rich dataset covering two semesters of a CS2 course, which used in-person office hours in Fall 2019 and virtual office hours in Fall 2020. Our data covers students' use of office hours, the nature of their questions, and the time spent receiving help, as well as demographic and attitude data. Our results show no relationship between students' attendance in office hours and class performance. However, we found that female students attended office hours more frequently, as did students with a fixed mindset in computing and those with weaker skills in transferring theory to practice. We also found that students with low confidence in, or low enjoyment of, CS were more active in virtual office hours. Finally, we observed a significant correlation between students attending virtual office hours and increased interest in CS study, while students attending in-person office hours tended to show an increase in their growth mindset.