When learning to code, a student must learn both how to create a program and how to debug it. Novices often start with print statements to help trace code execution and isolate logical errors. Eventually, they adopt more advanced debugger practices such as setting breakpoints, "stepping" through code execution, and "watching" variables as their values are updated. Unfortunately for students working with Arduino devices, there are no debugging tools built into the Arduino IDE. Instead, a student would have to move on to a professional IDE like Atmel Studio and/or acquire a hardware debugger. However, these options have a steep learning curve and are not intended for a student who has just started to learn how to write code. I am developing an Arduino software library, called Pin Status, to assist novice programmers with debugging common logic errors and to provide features specific to the e-textile microcontroller, the Adafruit Circuit Playground Classic.
A Software Debugger for E-textiles and Arduino Microcontrollers
Today’s STEM classrooms have expanded the domain of computer science education from a basic two-toned terminal screen to now include helpful Integrated Development Environments (IDEs) (BlueJ, Eclipse), block-based programming (MIT Scratch, Greenfoot), and even physical computing with embedded systems (Arduino, LEGO Mindstorms). But no matter which environment a student starts programming in, all students will eventually need help finding and fixing bugs in their code. While the helpful IDEs have debugging tools built in (breakpoints for pausing your program, ways to view/modify variable values, and "stepping" through code execution), in many of the other programming environments students are limited to using print statements to try to "see" what is happening inside their program.
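To make the print-statement approach concrete, the sketch below shows the kind of Serial.print() tracing a novice typically resorts to. It is a minimal illustrative example, not code from the paper; the pin, threshold, and variable names are assumptions.

```cpp
// Minimal example of print-statement debugging on an Arduino.
// The sensor pin and threshold are illustrative values.
const int sensorPin = A0;
const int threshold = 512;

void setup() {
  Serial.begin(9600);   // open the serial connection for debug output
}

void loop() {
  int reading = analogRead(sensorPin);

  // "Print debugging": expose the value so the student can watch it
  // scroll by in the Serial Monitor and spot unexpected values.
  Serial.print("reading = ");
  Serial.println(reading);

  if (reading > threshold) {
    Serial.println("branch taken: reading above threshold");
  }
  delay(500);           // slow the loop so the output stays readable
}
```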
Most students who learn to write code for Arduino microcontrollers will start within the Arduino IDE, but the official Arduino IDE does not currently provide any debugging tools. Instead, a student would have to move on to a professional IDE such as Atmel Studio or acquire a hardware debugger in order to add breakpoints or view their program’s variables. But each of these options has a steep learning curve, carries additional costs, and can require complex configuration. Based on research on student debugging practices [3, 7] and our own classroom observations, we have developed an Arduino software library, called Arduino Debugger, which provides some of these debugging tools (e.g., breakpoints) while staying …
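The truncated abstract does not show the Arduino Debugger API itself, so the following is only a sketch of the general technique such a library can use on hardware without a debug probe: a software breakpoint that blocks on the serial connection until the student sends a character from the Serial Monitor. The breakpoint() helper, its messages, and the pin choice are hypothetical, not the library's actual interface.

```cpp
// Sketch of a serial-based software breakpoint for Arduino.
// NOTE: breakpoint() is a hypothetical helper used for illustration;
// it is not the actual API of the Arduino Debugger library.
void breakpoint(const char* label) {
  Serial.print("BREAKPOINT: ");
  Serial.println(label);
  Serial.println("Send any character to continue...");
  while (Serial.available() == 0) {
    // Busy-wait: execution pauses here until the student
    // types something in the Serial Monitor.
  }
  while (Serial.available() > 0) {
    Serial.read();   // drain the input so the next breakpoint waits again
  }
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  int value = analogRead(A0);      // A0 is an illustrative pin choice
  breakpoint("after reading A0");  // pause so the student can inspect output
  Serial.println(value);
}
```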
- Award ID(s): 1742081
- Publication Date:
- NSF-PAR ID: 10163651
- Journal Name: FabLearn 2020 - 9th Annual Conference on Maker Education (FabLearn ’20)
- Sponsoring Org: National Science Foundation
More Like this
- Comprehending programs is key to learning programming. Previous studies highlight novices’ naive approaches to comprehending the structural, functional, and behavioral aspects of programs. And yet, with the majority of them examining on-screen programming environments, we barely know about program comprehension within physical computing, a common K-12 programming context. In this study, we qualitatively analyzed think-aloud interview videos of 22 high school students individually comprehending a given text-based Arduino program while interacting with its corresponding functional physical artifact to answer two questions: 1) How do novices comprehend the given text-based Arduino program? And, 2) What role does the physical artifact play in program comprehension? We found that novices mostly approached the program bottom-up, initially comprehending structural and later functional aspects, along different granularities. The artifact provided two distinct modes of engagement, active and interactive, that supported the program’s structural and functional comprehension. However, behavioral comprehension, i.e. understanding the program execution leading to the observed outcome, was inaccessible to many. Our findings extend the program comprehension literature in two ways: (a) they provide one of the very few accounts of high school students’ code comprehension in a physical computing context, and (b) they highlight the mediating role of physical artifacts …
- In September 2019, the fourth and final workshop on the Future of Mechatronics and Robotics Education (FoMRE) was held at Lawrence Technological University in Southfield, MI. This workshop was organized by faculty at several universities with financial support from industry partners and the National Science Foundation. The purpose of the workshops was to create a cohesive effort among mechatronics and robotics courses, minors, and degree programs. Mechatronics and Robotics Engineering (MRE) is an integration of mechanics, controls, electronics, and software, which provides a unique opportunity for engineering students to function on multidisciplinary teams. Due to its multidisciplinary nature, it attracts diverse and innovative students and graduates better-prepared professional engineers. In this fast-growing field, there is a great need to standardize educational material and make MRE education more widely available and easier to adopt. This can only be accomplished if the community comes together to speak with one clear voice about not only the benefits, but also the best ways to teach it. These efforts would also aid in establishing more of these degree programs and integrating minors or majors into existing computer science, mechanical engineering, or electrical engineering departments. The final workshop was attended by approximately 50 practitioners …
- Novice programmers often struggle with code understanding and debugging. Live Programming environments visualize the runtime values of a program each time it is modified to provide immediate feedback, which helps with tracing the program execution. This paper presents the use of a Live Programming tool in a CS1 course to better understand the impact of Live Programming on novices’ learning metrics and their perceptions of the tool. We conducted a within-subjects study at a large public university in a CS1 course in Python (N=237) where students completed tasks in a lab setting, in some cases with a Live Programming environment and in some cases without. Through post-lab surveys and open-ended feedback, we measured how well students understood the material and how students perceived the programming environment. To understand the impact of Live Programming, we compared the collected data for students who used Live Programming with the data for students who did not. We found that while learning outcomes were the same regardless of whether Live Programming was used or not, students who used the Live Programming tool completed some code tracing tasks faster. Furthermore, students liked the Live Programming environment more, and rated it as more helpful for their learning.
- Obeid, Iyad; Selesnick, Ivan (Eds.) Electroencephalography (EEG) is a popular clinical monitoring tool used for diagnosing brain-related disorders such as epilepsy [1]. As monitoring EEGs in a critical-care setting is an expensive and tedious task, there is great interest in developing real-time EEG monitoring tools to improve patient care quality and efficiency [2]. However, clinicians require automatic seizure detection tools that provide decisions with at least 75% sensitivity and less than 1 false alarm (FA) per 24 hours [3]. Some commercial tools recently claim to reach such performance levels, including the Olympic Brainz Monitor [4] and Persyst 14 [5]. In this abstract, we describe our efforts to transform a high-performance offline seizure detection system [3] into a low-latency real-time or online seizure detection system. An overview of the system is shown in Figure 1. The main difference between an online and an offline system is that an online system should always be causal and have minimum latency, which is often defined by domain experts. The offline system, shown in Figure 2, uses two phases of deep learning models with postprocessing [3]. The channel-based long short-term memory (LSTM) model (Phase 1 or P1) processes linear frequency cepstral coefficients (LFCC) [6] features from each EEG …