Understanding how individuals focus and perform visual searches during collaborative tasks can help improve user engagement. Eye-tracking measures provide informative cues for such understanding. This article presents A-DisETrac, an advanced analytic dashboard for distributed eye tracking. It uses off-the-shelf eye trackers to monitor multiple users in parallel, compute both traditional and advanced gaze measures in real time, and display them on an interactive dashboard. The system was evaluated in two pilot studies for user experience and utility, and compared with existing work. Moreover, the system was used to study how advanced gaze measures such as the ambient-focal coefficient K and the real-time index of pupillary activity relate to collaborative behavior. The time a group took to complete a puzzle was related to its quantified ambient visual scanning behavior: groups that took longer exhibited more scanning. User experience questionnaire results suggest that the dashboard provides a comparatively good user experience.
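The ambient-focal coefficient K referenced above has a standard formulation (after Krejtz et al.): each fixation's duration and the amplitude of the saccade that follows it are z-scored over the analysis window, and K is the mean difference of the two standardized values, with K > 0 indicating focal viewing and K < 0 ambient scanning. A minimal sketch of that computation, assuming fixation durations and saccade amplitudes have already been extracted from the gaze stream:

```python
import numpy as np

def coefficient_k(fix_durations, sacc_amplitudes):
    """Ambient-focal coefficient K (after Krejtz et al.).

    fix_durations:   duration of each fixation i (ms)
    sacc_amplitudes: amplitude of the saccade following fixation i (deg)
    Returns a positive value for focal viewing, negative for ambient scanning.
    """
    d = np.asarray(fix_durations, dtype=float)
    a = np.asarray(sacc_amplitudes, dtype=float)
    zd = (d - d.mean()) / d.std()  # standardized fixation durations
    za = (a - a.mean()) / a.std()  # standardized saccade amplitudes
    return float(np.mean(zd - za))

# Long fixations followed by short saccades yield K > 0 (focal viewing).
print(coefficient_k([400, 420, 380, 450], [1.0, 0.8, 1.2, 0.9]))
```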
DisETrac: Distributed Eye-Tracking for Online Collaboration
Coordinating viewpoints with another person during a collaborative task can provide informative cues on human behavior. Despite the massive shift of collaborative spaces into virtual environments, versatile setups that enable eye-tracking in an online collaborative environment (distributed eye-tracking) remain unexplored. In this study, we present DisETrac, a versatile setup for eye-tracking in online collaborations. Further, we demonstrate and evaluate the utility of DisETrac through a user study. Finally, we discuss the implications of our results for future improvements. Our results indicate a promising avenue for developing versatile setups for distributed eye-tracking.
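The abstract leaves DisETrac's internals unspecified; as a purely hypothetical illustration of the distributed part, the sketch below defines a per-user gaze sample message that remote trackers could stream to a central aggregator. The `GazeSample` fields and serialization are assumptions for illustration, not the paper's actual protocol:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class GazeSample:
    user_id: str      # which remote collaborator produced the sample
    timestamp: float  # capture time in seconds (sender's clock)
    x: float          # normalized horizontal gaze position, 0..1
    y: float          # normalized vertical gaze position, 0..1
    pupil: float      # pupil diameter (mm), if the tracker reports it

def encode(sample: GazeSample) -> bytes:
    """Serialize one sample as newline-delimited JSON for streaming."""
    return (json.dumps(asdict(sample)) + "\n").encode()

# Each user's client would emit a stream of such messages in real time.
print(encode(GazeSample("user-1", time.time(), 0.42, 0.57, 3.1)))
```

In a real deployment the aggregator would also need to synchronize clocks across senders before any cross-user measure can be computed.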
- Award ID(s): 2045523
- PAR ID: 10402995
- Date Published:
- Journal Name: Proceedings of the 2023 Conference on Human Information Interaction and Retrieval
- Page Range / eLocation ID: 427 to 431
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Recent advances in eye tracking have given birth to a new genre of gaze-based context sensing applications, ranging from cognitive load estimation to emotion recognition. To achieve state-of-the-art recognition accuracy, a large-scale, labeled eye movement dataset is needed to train deep learning-based classifiers. However, due to the heterogeneity in human visual behavior, as well as the labor-intensive and privacy-compromising data collection process, datasets for gaze-based activity recognition are scarce and hard to collect. To alleviate the sparse gaze data problem, we present EyeSyn, a novel suite of psychology-inspired generative models that leverages only publicly available images and videos to synthesize a realistic and arbitrarily large eye movement dataset. Taking gaze-based museum activity recognition as a case study, our evaluation demonstrates that EyeSyn can not only replicate the distinct patterns in the actual gaze signals that are captured by an eye tracking device, but also simulate the signal diversity that results from different measurement setups and subject heterogeneity. Moreover, in the few-shot learning scenario, EyeSyn can be readily incorporated with either transfer learning or meta-learning to achieve 90% accuracy, without the need for a large-scale dataset for training.
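As a loose illustration of the underlying idea, gaze synthesis can be driven by image content, e.g. sampling fixation targets in proportion to a saliency map. The toy sketch below is a stand-in for that concept only; EyeSyn's actual psychology-inspired models are more elaborate:

```python
import numpy as np

def synth_scanpath(saliency, n_fixations=10, rng=None):
    """Sample a synthetic scanpath from a 2-D saliency map.

    Fixation targets are drawn with probability proportional to saliency;
    fixation durations are drawn from a gamma distribution (~240 ms mean).
    """
    rng = rng or np.random.default_rng()
    h, w = saliency.shape
    p = saliency.ravel() / saliency.sum()            # per-pixel fixation probability
    idx = rng.choice(h * w, size=n_fixations, p=p)   # sample fixation locations
    ys, xs = np.unravel_index(idx, (h, w))
    durations = rng.gamma(shape=4.0, scale=60.0, size=n_fixations)
    return list(zip(xs, ys, durations))              # (x, y, duration) triples

# A single hot spot in the saliency map attracts most synthetic fixations.
sal = np.ones((64, 64))
sal[30:34, 30:34] = 50.0
print(synth_scanpath(sal, n_fixations=5))
```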
-
We introduce WebGazer, an online eye tracker that uses common webcams already present in laptops and mobile devices to infer the eye-gaze locations of web visitors on a page in real time. The eye tracking model self-calibrates by watching web visitors interact with the web page and trains a mapping between features of the eye and positions on the screen. This approach aims to provide a natural experience to everyday users that is not restricted to laboratories and highly controlled user studies. WebGazer has two key components: a pupil detector that can be combined with any eye detection library, and a gaze estimator using regression analysis informed by user interactions. We perform a large remote online study and a small in-person study to evaluate WebGazer. The findings show that WebGazer can learn from user interactions and that its accuracy is sufficient for approximating the user's gaze. As part of this paper, we release the first eye tracking library that can be easily integrated into any website for real-time gaze interactions, usability studies, or web research.
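WebGazer's estimator maps eye features to screen coordinates via regression that self-calibrates on interactions: each click supplies one (features, screen position) training pair. Below is a minimal sketch of that idea using closed-form ridge regression; the feature extraction and the library's exact regression variant are simplified assumptions here, not WebGazer's published implementation:

```python
import numpy as np

class GazeRegressor:
    """Ridge regression from eye-patch features to screen coordinates,
    in the spirit of WebGazer's self-calibration (details simplified)."""

    def __init__(self, alpha=1e-3):
        self.alpha = alpha  # regularization strength
        self.W = None

    def fit(self, features, clicks):
        """features: (n, d) eye features captured at interaction time;
        clicks: (n, 2) screen coordinates of the matching interactions."""
        X, Y = np.asarray(features), np.asarray(clicks)
        d = X.shape[1]
        # Closed-form ridge solution: W = (X^T X + alpha I)^-1 X^T Y
        self.W = np.linalg.solve(X.T @ X + self.alpha * np.eye(d), X.T @ Y)
        return self

    def predict(self, features):
        return np.asarray(features) @ self.W  # (n, 2) predicted gaze points

# Synthetic demo: recover a linear feature-to-screen mapping from 50 "clicks".
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))
Y = X @ rng.normal(size=(8, 2))
print(GazeRegressor().fit(X, Y).predict(X[:1]))
```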
-
Online learning has gained increased popularity in recent years. However, with online learning, teacher observation and intervention are lost, creating a need for technologically observable characteristics that can compensate for this limitation. The present study used a wide array of sensing mechanisms including eye tracking, galvanic skin response (GSR) recording, facial expression analysis, and summary note-taking to monitor participants while they watched and recalled an online video lecture. We explored the link between these human-elicited responses and learning outcomes as measured by quiz questions. Results revealed GSR to be the best indicator of the challenge level of the lecture material. Yet, eye tracking and GSR remain difficult to capture when monitoring online learning as the requirement to remain still impacts natural behavior and leads to more stoic and unexpressive faces. Continued work on methods ensuring naturalistic capture is critical for broadening the use of sensor technology in online learning, as are ways to fuse these data with other input, such as structured and unstructured data from peer-to-peer or student-teacher interactions.
-
Experienced teachers pay close attention to their students, adjusting their teaching when students seem lost. This dynamic interaction is missing in online education. We hypothesized that attentive students follow videos similarly with their eyes. Thus, attention to instructional videos could be assessed remotely by tracking eye movements. Here we show that intersubject correlation of eye movements during video presentation is substantially higher for attentive students and that synchronized eye movements are predictive of individual test scores on the material presented in the video. These findings replicate for videos in a variety of production styles, for incidental and intentional learning and for recall and comprehension questions alike. We reproduce the result using standard web cameras to capture eye movements in a classroom setting and with over 1,000 participants at home without the need to transmit user data. Our results suggest that online education could be made adaptive to a student's level of attention in real time.
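One simple way to make the intersubject-correlation measure concrete: correlate each pair of subjects' horizontal and vertical gaze traces over the video, then average across pairs. The sketch below is a simplified stand-in; the paper's exact measure may differ (e.g., in how gaze components are weighted):

```python
import numpy as np
from itertools import combinations

def gaze_isc(gaze_by_subject):
    """Mean intersubject correlation of gaze positions.

    gaze_by_subject: list of (T, 2) arrays of (x, y) gaze samples,
    time-locked to the same video. Higher values indicate more
    synchronized (attentive) viewing.
    """
    scores = []
    for a, b in combinations(gaze_by_subject, 2):
        rx = np.corrcoef(a[:, 0], b[:, 0])[0, 1]  # horizontal traces
        ry = np.corrcoef(a[:, 1], b[:, 1])[0, 1]  # vertical traces
        scores.append((rx + ry) / 2)
    return float(np.mean(scores))

# Two viewers tracking the same moving target correlate highly.
t = np.linspace(0, 10, 300)
s1 = np.column_stack([np.sin(t), np.cos(t)])
s2 = s1 + np.random.default_rng(1).normal(scale=0.1, size=s1.shape)
print(gaze_isc([s1, s2]))
```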