Title: Analyzing debugging processes during collaborative, computational modeling in science
This paper develops a systematic approach to identifying and analyzing high school students’ debugging strategies when they work together to construct computational models of scientific processes in a block-based programming environment. We combine Markov models derived from students’ activity logs with epistemic network analysis of their collaborative discourse to interpret and analyze their model-building and debugging processes. We present a contrasting case study that illustrates the differences in debugging strategies between two groups of students and their impact on model-building effectiveness.
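The Markov models mentioned above are derived from students’ activity logs. As a minimal sketch of the general technique (not the paper’s actual coding scheme — the action labels and log format here are assumptions), first-order transition probabilities can be estimated by counting consecutive pairs of logged actions:

```python
from collections import defaultdict

def transition_matrix(actions):
    """Estimate first-order Markov transition probabilities from a
    sequence of logged actions: P(next | current) = count / row total."""
    counts = defaultdict(lambda: defaultdict(int))
    for prev, curr in zip(actions, actions[1:]):
        counts[prev][curr] += 1
    probs = {}
    for prev, nxt in counts.items():
        total = sum(nxt.values())
        probs[prev] = {action: c / total for action, c in nxt.items()}
    return probs

# Hypothetical activity log; real logs would use the study's own action codes.
log = ["edit", "run", "debug", "edit", "run", "run", "debug", "edit"]
P = transition_matrix(log)
```

A group that frequently transitions from "run" to "debug" exhibits a different strategy profile than one that transitions from "run" straight back to "edit"; comparing such matrices across groups is one way to contrast debugging behavior.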
Snyder, C.; Biswas, G.; Emara, M.; Grover, S.; & Conlin, L.
(Computer-supported collaborative learning)
The introduction of computational modeling into science curricula has been shown to benefit students’ learning; however, the synergistic learning processes that contribute to these benefits are not fully understood. We study students’ synergistic learning of physics and computational thinking (CT) through their actions and collaborative discourse as they develop computational models in a visual block-structured environment. We adopt a case study approach to analyze students’ synergistic learning processes related to stopping conditions, initialization, and debugging episodes. Our findings show a pattern of evolving sophistication in synergistic reasoning for model-building activities.
Bowers, Jonathan; Eidin, Emanuel; Stephens, Lynn; Brennan, Linsey
(Journal of Science Education and Technology)
Abstract Interpreting and creating computational systems models is an important goal of science education. One aspect of computational systems modeling that is supported by modeling, systems thinking, and computational thinking literature is “testing, evaluating, and debugging models.” Through testing and debugging, students can identify aspects of their models that either do not match external data or conflict with their conceptual understandings of a phenomenon. This disconnect encourages students to make model revisions, which in turn deepens their conceptual understanding of a phenomenon. Given that many students find testing and debugging challenging, we set out to investigate the various testing and debugging behaviors and behavioral patterns that students use when building and revising computational system models in a supportive learning environment. We designed and implemented a 6-week unit where students constructed and revised a computational systems model of evaporative cooling using SageModeler software. Our results suggest that despite being in a common classroom, the three groups of students in this study all utilized different testing and debugging behavioral patterns. Group 1 focused on using external peer feedback to identify flaws in their model, group 2 used verbal and written discourse to critique their model’s structure and suggest structural changes, and group 3 relied on systemic analysis of model output to drive model revisions. These results suggest that multiple aspects of the learning environment are necessary to enable students to take these different approaches to testing and debugging.
Snyder, C.; Hutchins, N.; Biswas, G.; & Grover, S.
(International Conference on Artificial Intelligence in Education (AIED) 2019)
The benefits of computational model building in STEM domains are well documented, yet the synergistic learning processes that lead to effective learning gains are not fully understood. In this paper, we analyze the discussions between students working collaboratively to build computational models to solve physics problems. From this collaborative discourse, we identify strategies that impact their model-building and learning processes.
Mostowfi, Sara; Kim, Jung Hyup; Mohanty, Siddarth; Wang, Fang; Oprean, Danielle; Wang, Yi; Seo, Kangwon
(Proceedings of the Human Factors and Ergonomics Society Annual Meeting)
The debugging process plays a crucial role in helping students pinpoint their specific learning weaknesses, allowing them to modify their strategies for enhanced academic performance. Notably, changes in pupil dilation serve as an indicator of arousal associated with confronting learning challenges. This physiological response acts as a “physiological footprint” that reflects cognitive engagement, facilitating internally focused cognitive processes such as insight generation and mind-wandering. In this study, we proposed that pupil dilation could be a valuable predictor of students’ metacognitive awareness throughout the debugging process, specifically within an augmented reality (AR) learning environment. The findings revealed significant differences in pupil dilation among students categorized by their varying levels of debugging, which represents a specific dimension of the Metacognitive Awareness Inventory.
Hutchins, N.M., Snyder, C., Emara, M., Grover, S., and Biswas, G. Analyzing debugging processes during collaborative, computational modeling in science. Retrieved from https://par.nsf.gov/biblio/10298755. Computer-supported collaborative learning.
Hutchins, N.M., Snyder, C., Emara, M., Grover, S., and Biswas, G.
"Analyzing debugging processes during collaborative, computational modeling in science". Computer-supported collaborative learning. Country unknown/Code not available. https://par.nsf.gov/biblio/10298755.
@article{osti_10298755,
place = {Country unknown/Code not available},
title = {Analyzing debugging processes during collaborative, computational modeling in science},
url = {https://par.nsf.gov/biblio/10298755},
abstractNote = {This paper develops a systematic approach to identifying and analyzing high school students’ debugging strategies when they work together to construct computational models of scientific processes in a block-based programming environment. We combine Markov models derived from students’ activity logs with epistemic network analysis of their collaborative discourse to interpret and analyze their model-building and debugging processes. We present a contrasting case study that illustrates the differences in debugging strategies between two groups of students and their impact on model-building effectiveness.},
journal = {Computer-supported collaborative learning},
author = {Hutchins, N.M. and Snyder, C. and Emara, M. and Grover, S. and Biswas, G.},
editor = {Hmelo-Silver, C. E.}
}