This study aims to develop an interactive learning solution for engineering education by combining augmented reality (AR), Near-Field Electromagnetic Ranging (NFER), and motion capture technologies. We built an instructional system that integrates AR devices and real-time positioning sensors to improve learners' interactive experience in an immersive learning environment, while the motion, eye-tracking, and location-tracking data collected from the devices worn by learners enable instructors to understand their learning patterns. To test the usability of the system, two AR-based lectures were developed with different difficulty levels (Lecture 1 - Easy vs. Lecture 2 - Hard), and System Usability Scale (SUS) responses were collected from thirty participants. We did not observe a significant usability difference between Lecture 1 and Lecture 2. Through the experiment, we demonstrated the robustness of this AR learning system and its unique promise for integrating AR teaching with other technologies.
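The abstract does not report how the SUS responses were scored; as a point of reference, the following is a minimal sketch of the standard scoring procedure for the ten-item, five-point SUS questionnaire. The function name and the example responses are illustrative, not data from the study.

```python
# Minimal sketch: scoring a System Usability Scale (SUS) questionnaire
# (ten items, 1-5 Likert responses). Item numbering follows the standard
# SUS layout; the example responses below are hypothetical.

def sus_score(responses):
    """responses: list of ten integers in [1, 5], item 1 through item 10."""
    if len(responses) != 10 or any(r < 1 or r > 5 for r in responses):
        raise ValueError("SUS expects ten responses in the range 1-5")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded: contribution = response - 1.
        # Even-numbered items are negatively worded: contribution = 5 - response.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to the 0-100 SUS range

# Example: one hypothetical participant's responses for a lecture
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```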
Sensor-based Methodological Observations for Studying Online Learning
Online learning has gained popularity in recent years. However, with online learning, teacher observation and intervention are lost, creating a need for technologically observable characteristics that can compensate for this limitation. The present study used a wide array of sensing mechanisms, including eye tracking, galvanic skin response (GSR) recording, facial expression analysis, and summary note-taking, to monitor participants while they watched and recalled an online video lecture. We explored the link between these human-elicited responses and learning outcomes as measured by quiz questions. Results revealed GSR to be the best indicator of the challenge level of the lecture material. Yet eye tracking and GSR remain difficult to capture when monitoring online learning, as the requirement to remain still alters natural behavior and leads to more stoic, unexpressive faces. Continued work on methods that ensure naturalistic capture is critical for broadening the use of sensor technology in online learning, as are ways to fuse these data with other input, such as structured and unstructured data from peer-to-peer or student-teacher interactions.
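The abstract does not detail how the GSR signal was analyzed; the sketch below shows one common way to reduce a raw skin conductance trace to a challenge-related feature, namely a count of skin conductance responses. The sampling rate, amplitude threshold, and window length are illustrative assumptions, not parameters from the study.

```python
# Minimal sketch of summarizing a GSR trace into a skin conductance
# response (SCR) count, assuming a uniformly sampled conductance signal
# in microsiemens. Thresholds here are illustrative defaults.
import numpy as np

def scr_count(gsr, fs=4.0, min_amplitude=0.05, window_s=1.0):
    """Count rises of at least min_amplitude (uS) within window_s seconds."""
    gsr = np.asarray(gsr, dtype=float)
    window = max(1, int(window_s * fs))
    count, i = 0, 0
    while i + window < len(gsr):
        rise = gsr[i + window] - gsr[i]
        if rise >= min_amplitude:
            count += 1
            i += window  # skip past this response to avoid double counting
        else:
            i += 1
    return count

# Per-segment SCR counts could then be compared against the difficulty
# of the corresponding lecture segments.
```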
- Award ID(s): 1559889
- PAR ID: 10042953
- Date Published:
- Journal Name: Proceedings of the 2017 ACM Workshop on Intelligent Interfaces for Ubiquitous and Smart Learning
- Page Range / eLocation ID: 25 to 30
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Experienced teachers pay close attention to their students, adjusting their teaching when students seem lost. This dynamic interaction is missing in online education. We hypothesized that attentive students follow videos similarly with their eyes. Thus, attention to instructional videos could be assessed remotely by tracking eye movements. Here we show that intersubject correlation of eye movements during video presentation is substantially higher for attentive students and that synchronized eye movements are predictive of individual test scores on the material presented in the video. These findings replicate for videos in a variety of production styles, for incidental and intentional learning, and for recall and comprehension questions alike. We reproduce the result using standard web cameras to capture eye movements in a classroom setting and with over 1,000 participants at home, without the need to transmit user data. Our results suggest that online education could be made adaptive to a student's level of attention in real time. (A minimal sketch of this intersubject-correlation measure appears after this list.)
- Coordinating viewpoints with another person during a collaborative task can provide informative cues on human behavior. Despite the massive shift of collaborative spaces into virtual environments, versatile setups that enable eye-tracking in an online collaborative environment (distributed eye-tracking) remain unexplored. In this study, we present DisETrac, a versatile setup for eye-tracking in online collaborations. Further, we demonstrate and evaluate the utility of DisETrac through a user study. Finally, we discuss the implications of our results for future improvements. Our results indicate a promising avenue for developing versatile setups for distributed eye-tracking.
- Eye tracking provides direct, temporally and spatially sensitive measures of eye gaze. It can capture visual attention patterns from infancy through adulthood. However, commonly used screen-based eye tracking (SET) paradigms are limited in their depiction of how individuals process information as they interact with the environment in "real life". Mobile eye tracking (MET) records participant-perspective gaze in the context of active behavior. Recent technological developments in MET hardware enable researchers to capture egocentric vision as early as infancy and across the lifespan. However, challenges remain in MET data collection, processing, and analysis. The present paper aims to provide an introduction and practical guide for researchers starting out in the field, to facilitate the use of MET in psychological research with a wide range of age groups. First, we provide a general introduction to MET. Next, we briefly review MET studies in adults and children that provide new insights into attention and its roles in cognitive and socioemotional functioning. We then discuss technical issues relating to MET data collection and provide guidelines for data quality inspection, gaze annotations, data visualization, and statistical analyses. Lastly, we conclude by discussing future directions for MET implementation. Open-source programs for MET data quality inspection, data visualization, and analysis are shared publicly.
- Eye tracking is becoming increasingly available in head-mounted virtual reality displays, with various headsets with integrated eye trackers already commercially available. The applications of eye tracking in virtual reality are highly diversified and span multiple disciplines. As a result, the number of peer-reviewed publications that study eye tracking applications has surged in recent years. We performed a broad review, comprehensively searching academic literature databases, to assess the extent of published research dealing with applications of eye tracking in virtual reality and to highlight challenges, limitations, and areas for future research.
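For the first related record above (synchronized eye movements during instructional videos), the following sketch illustrates, under simplifying assumptions, how an intersubject correlation of gaze could be computed: each viewer's gaze trace is correlated with the group's median gaze trajectory on a common time base. The array shapes and the choice of a median reference are illustrative, not the published method.

```python
# Minimal sketch of an intersubject-correlation (ISC) measure for gaze,
# assuming all viewers' gaze has been resampled onto a common time base.
import numpy as np

def gaze_isc(subject_gaze, group_gaze):
    """Correlate one viewer's gaze trace with the group median trace.

    subject_gaze: (T, 2) array of x/y gaze positions over time.
    group_gaze:   (N, T, 2) array of the other viewers' gaze positions.
    Returns the mean of the horizontal and vertical Pearson correlations.
    """
    reference = np.median(group_gaze, axis=0)  # (T, 2) group gaze trajectory
    corrs = []
    for dim in range(2):  # correlate x and y gaze separately
        r = np.corrcoef(subject_gaze[:, dim], reference[:, dim])[0, 1]
        corrs.append(r)
    return float(np.mean(corrs))

# Higher values indicate gaze that follows the video similarly to the group,
# the attentional signature that record reports as predictive of test scores.
```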