Title: Automated Analysis of Student Verbalizations in Online Learning Environments
We present results in automating the analysis of student verbalizations in online learning environments, using an existing online tool designed to teach students to reason analytically about code as an example. The new extension captures "think-aloud" data as students work through code reasoning activities. The data is recorded and transcribed automatically and used as input to a natural language processing / machine learning system designed to identify specific student attitudes (e.g., uncertain), behaviors (e.g., guessing), and difficulties (e.g., concept misunderstandings). We present the design and implementation of the tool, an analysis of its transcription accuracy, and an evaluation of its utility in identifying characteristics of student learning.
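The abstract describes the pipeline at a high level only; as a purely illustrative sketch, the final labeling step over transcribed utterances could be as simple as cue-word matching. The cue lists, label names, and function below are our own assumptions, not the authors' actual NLP/ML model.

```python
# Hedged sketch: map a transcribed think-aloud utterance to coarse labels.
# Cue words and labels are illustrative assumptions only.
UNCERTAIN_CUES = {"maybe", "i think", "not sure", "probably"}
GUESSING_CUES = {"guess", "random", "just try"}

def label_utterance(text: str) -> list[str]:
    """Return coarse attitude/behavior labels for one utterance."""
    t = text.lower()
    labels = []
    if any(cue in t for cue in UNCERTAIN_CUES):
        labels.append("uncertain")
    if any(cue in t for cue in GUESSING_CUES):
        labels.append("guessing")
    return labels or ["neutral"]

print(label_utterance("I'm not sure, I'll just try zero"))  # ['uncertain', 'guessing']
```

A real system would replace the cue matching with a trained classifier, but the input/output shape (utterance in, label set out) would be similar.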
Award ID(s):
1914667
PAR ID:
10294495
Date Published:
Journal Name:
SIGCSE '21: Proceedings of the 51st ACM Technical Symposium on Computer Science Education
Page Range / eLocation ID:
1272 to 1272
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Understanding the thought processes of students as they progress from initial (incorrect) answers toward correct answers is a challenge for instructors, both in this pandemic and beyond. This paper presents a general network visualization learning analytics system that helps instructors view the sequence of answers input by students in a way that makes student learning progressions apparent. The system allows instructors to study individual and group learning at various levels of granularity. The paper illustrates how the visualization system is employed to analyze student responses collected through an intervention. The intervention is BeginToReason, an online tool that helps students learn and use symbolic reasoning, that is, reasoning about code behavior through abstract values instead of concrete inputs. The specific focus is the analysis of tool-collected student responses as they perform reasoning activities on code involving conditional statements. Student learning is analyzed using the visualization system and a post-test. Visual analytics highlights include instances where students producing one set of incorrect answers initially perform better than a different set, and instances where student thought processes do not cluster well. Post-test data analysis provides a measure of students' ability to apply what they have learned and of their holistic understanding.
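The core aggregation behind such an answer-progression network can be sketched as counting transitions between consecutive answers across students. The answer values and sequences below are hypothetical; the actual system's data model is not described here.

```python
from collections import Counter

def transition_counts(sequences):
    """Count directed edges between consecutive answers across students."""
    edges = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            edges[(a, b)] += 1
    return edges

# Hypothetical answer sequences from three students.
students = [
    ["wrong_A", "wrong_B", "correct"],
    ["wrong_A", "correct"],
    ["wrong_B", "correct"],
]
edges = transition_counts(students)
print(edges[("wrong_B", "correct")])  # 2
```

Edge weights like these can then be rendered as a network, with heavier edges showing common learning paths.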
  2. To promote understanding of and interest in working with data among diverse student populations, we developed and studied a high school mathematics curriculum module that examines income inequality in the United States. Designed as a multi-week set of applied data investigations, the module supports student analyses of income inequality using U.S. Census Bureau microdata and the Common Online Data Analysis Platform (CODAP), an online data analysis tool. Pre- and post-module data show that use of this module was associated with statistically significant growth in students' understanding of fundamental data concepts and individual interest in statistics and data analysis, with small to moderate effect sizes. Student survey responses and interview data from students and teachers suggest that the topic of income inequality, features within CODAP, the use of person-level data, and opportunities to engage in multivariable thinking helped to support critical data literacy and its foundations among participating students. We describe our definitions of data literacy and critical data literacy and discuss curriculum strategies to develop them.
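The module's investigations are conducted interactively in CODAP rather than in code, but as an illustration of the kind of computation involved, a standard inequality summary over person-level incomes is the Gini coefficient. This sketch is our own, not part of the curriculum.

```python
def gini(incomes):
    """Gini coefficient from person-level incomes (0 = perfect equality)."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    # Standard formula using the rank-weighted sum of sorted incomes.
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

print(gini([30_000, 30_000, 30_000, 30_000]))  # 0.0 (perfect equality)
```

With real microdata, the same computation would run over thousands of person-level records rather than four illustrative values.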
  3. To develop code that meets its specification and is verifiably correct, such as in a software engineering course, students must be able to understand formal contracts and annotate their code with assertions such as loop invariants. To assist in developing suitable instructor and automated tool interventions, this research aims to go beyond simple pre- and post-conditions and gain insight into student learning of loop invariants involving objects. As students developed suitable loop invariants for given code with the aid of an online system backed by a verification engine, each student attempt, whether correct or incorrect, was collected, analyzed automatically, and catalogued using an iterative process to capture common difficulties. Students were also asked to explain the thought process behind each submitted answer. The collected explanations were analyzed manually and found to be useful both for assessing students' level of understanding and for extracting actionable information for instructors and automated tutoring systems. Qualitative conclusions include observations on the impact of the medium.
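To make the idea of a loop invariant concrete: the exercises described above ask students to state a property that holds on every loop iteration. A minimal runtime-checked example, using a reversal task of our own invention rather than one from the study, might look like:

```python
def reverse_list(xs):
    """Reverse a list, checking a loop invariant at each iteration."""
    out = []
    i = 0
    while i < len(xs):
        # Invariant: out holds the first i elements of xs in reverse order.
        assert out == xs[:i][::-1]
        out.insert(0, xs[i])
        i += 1
    assert out == xs[::-1]  # post-condition
    return out

print(reverse_list([1, 2, 3]))  # [3, 2, 1]
```

In the study, invariants are checked statically by a verification engine rather than with runtime assertions, but the property students must articulate is the same.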
  4. Object-based development using design-by-contract (DbC) is broadly taught and practiced. Students must be able to read and write symbolic DbC assertions that are sufficiently precise and be able to use these assertions to trace program code. This paper summarizes the results of using an automated tool to pinpoint fine-grained difficulties students face in learning to symbolically trace code involving objects. The pilots were conducted in an undergraduate software engineering course. Quantitative results show that data collected by the tool can help to identify and classify learning obstacles. Qualitative findings help validate the student misunderstandings underlying these difficulties. Analysis of exam questions helps to assess how well student learning in reading and writing simple assertions about code behavior persists. Together, these results provide directions for intervention.
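As an illustration of the kind of contract-annotated object code students trace in such exercises, here is a small design-by-contract sketch with runtime-checked pre- and post-conditions. The bounded-queue example is our own, not taken from the paper.

```python
class BoundedQueue:
    """A queue with capacity, annotated with DbC-style assertions."""

    def __init__(self, capacity):
        assert capacity > 0                      # precondition
        self.capacity = capacity
        self.items = []

    def enqueue(self, x):
        assert len(self.items) < self.capacity   # precondition: not full
        old_len = len(self.items)
        self.items.append(x)
        assert len(self.items) == old_len + 1    # postcondition

q = BoundedQueue(2)
q.enqueue("a")
q.enqueue("b")
print(q.items)  # ['a', 'b']
```

Tracing such code symbolically means reasoning about the queue's abstract state (its length and contents) at each assertion, rather than running it on concrete inputs.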
  5. As a validated assessment, the Microbiology for Health Sciences Concept Inventory (MHSCI) is a valuable tool to evaluate student progress in health sciences microbiology courses. In this brief analysis, we survey MHSCI faculty users and report student MHSCI scores to determine the impact of the COVID-19 pandemic and subsequent quarantine in spring 2020 on student learning gains. Although a majority of students reported moving to a fully online lecture and lab microbiology course in the spring 2020 semester, there was no statistically significant impact on student outcomes as measured by the MHSCI, and by some measures student learning gains increased in the semester students moved to online learning. Further research is necessary to determine the continuing impact of online lecture/lab courses on MHSCI outcomes. Our analysis of data from spring 2020 shows that the MHSCI remains a statistically reliable measure of student misconceptions and that overall difficulty scores for each item were unchanged by the pandemic.
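The abstract does not specify which gain statistic was used; as an illustration only, a common way to summarize pre/post concept-inventory scores is the normalized gain, g = (post − pre) / (100 − pre). The percentages below are made up.

```python
def normalized_gain(pre_pct, post_pct):
    """Normalized learning gain for pre/post scores given as percentages."""
    assert 0 <= pre_pct < 100               # gain undefined at a perfect pre-score
    return (post_pct - pre_pct) / (100 - pre_pct)

# Hypothetical class averages: 40% pre-test, 70% post-test.
print(round(normalized_gain(40.0, 70.0), 2))  # 0.5
```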