Title: Eye Movement Biometrics Using a New Dataset Collected in Virtual Reality
This paper introduces a novel eye movement dataset collected in virtual reality (VR) that contains both 2D and 3D eye movement data from over 400 subjects. We establish that this dataset is suitable for biometric studies by evaluating it with both statistical and machine learning–based approaches. For comparison, we also include results from an existing, similarly constructed dataset.  more » « less
Award ID(s):
1714623
PAR ID:
10285745
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
ACM Symposium on Eye Tracking Research and Applications
Page Range / eLocation ID:
1 to 3
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like This
  1. Program comprehension is a vital skill in software development. This work investigates program comprehension by examining the eye movements of novice programmers as they gain programming experience over the duration of a Java course. Their eye movement behavior is compared with that of expert programmers. Eye movement studies of natural text show that word frequency and length influence eye movement duration and act as indicators of reading skill. The study uses an existing longitudinal eye tracking dataset with 20 novice and experienced readers of source code. The work investigates the acquisition of the effects of token frequency and token length in source code reading as an indication of program reading skill. The results show evidence of the frequency and length effects in reading source code and the acquisition of these effects by novices. These results are then leveraged in a machine learning model demonstrating how eye movements can be used to estimate programming proficiency and distinguish novices from experts with 72% accuracy.
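As a rough sketch of the classification idea in the abstract above (the nearest-centroid rule, feature choices, and toy numbers are all illustrative assumptions, not the study's actual 72%-accuracy model):

```python
# Toy novice/expert classifier over aggregate eye-movement features.
# Feature values and the nearest-centroid rule are illustrative assumptions.

def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def classify(x, c_novice, c_expert):
    """Assign x to the nearer centroid (squared Euclidean distance)."""
    d = lambda c: sum((xi - ci) ** 2 for xi, ci in zip(x, c))
    return "novice" if d(c_novice) < d(c_expert) else "expert"

# Toy features per reader: [mean fixation duration (ms), frequency-effect slope]
novices = [[265, -7], [258, -9], [270, -6]]
experts = [[221, -15], [215, -16], [228, -14]]
c_n, c_e = centroid(novices), centroid(experts)

print(classify([260, -8], c_n, c_e))  # prints "novice"
```

A real pipeline would extract such features from fixation-level records and cross-validate, but the core idea of separating reader groups in feature space is the same.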
  2. Recent advances in eye tracking have given birth to a new genre of gaze-based context sensing applications, ranging from cognitive load estimation to emotion recognition. To achieve state-of-the-art recognition accuracy, a large-scale, labeled eye movement dataset is needed to train deep learning-based classifiers. However, due to the heterogeneity in human visual behavior, as well as the labor-intensive and privacy-compromising data collection process, datasets for gaze-based activity recognition are scarce and hard to collect. To alleviate the sparse gaze data problem, we present EyeSyn, a novel suite of psychology-inspired generative models that leverages only publicly available images and videos to synthesize a realistic and arbitrarily large eye movement dataset. Taking gaze-based museum activity recognition as a case study, our evaluation demonstrates that EyeSyn can not only replicate the distinct patterns in the actual gaze signals that are captured by an eye tracking device, but also simulate the signal diversity that results from different measurement setups and subject heterogeneity. Moreover, in the few-shot learning scenario, EyeSyn can be readily incorporated with either transfer learning or meta-learning to achieve 90% accuracy, without the need for a large-scale dataset for training.
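To make the synthesis idea concrete, here is a deliberately simplified stand-in for a gaze generator: a scanpath as alternating saccade targets and fixation durations. This is a toy assumption for illustration only, not EyeSyn's psychology-inspired models, and every parameter below is invented:

```python
# Toy scanpath generator: random fixation targets on a unit image with
# plausible fixation durations. All parameters are illustrative assumptions.
import random

def synthesize_scanpath(n_fixations=5, seed=0):
    """Return (x, y, duration_ms) tuples for n_fixations fixations."""
    rng = random.Random(seed)
    path = []
    for _ in range(n_fixations):
        x, y = rng.random(), rng.random()   # saccade target in [0, 1) x [0, 1)
        dur = rng.gauss(250.0, 50.0)        # fixation duration draw (ms)
        path.append((x, y, max(80.0, dur))) # floor durations at 80 ms
    return path

for fixation in synthesize_scanpath():
    print(fixation)
```

Real generative models would condition the saccade targets on image content (e.g., saliency) and the durations on task, which is what gives synthetic data its usefulness for training classifiers.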
  3. Summary Previous research has linked rapid eye movement sleep to emotional processing, particularly stress. Lab studies indicate that rapid eye movement sleep deprivation and fragmentation heighten emotional reactivity and stress response. This relationship extends to natural settings, where poor‐quality sleep among college students correlates with increased academic stress and lower academic performance. However, there is a lack of research into how specific sleep stages, like rapid eye movement, affect real‐life stress development. This study investigated whether habitual rapid eye movement sleep in college students can predict the future development of real‐life stress symptoms associated with final exams. Fifty‐two participants (mean age = 19 years, 62% females) monitored their sleep for a week during the academic semester using a mobile electroencephalogram device, and then completed self‐evaluations measuring test anxiety and other relevant factors. They completed the same evaluations again just prior to final exams. We found that rapid eye movement sleep was the most dominant factor predicting changes in participants' test anxiety. However, contrary to our predictions, habitual rapid eye movement sleep was associated with an increase rather than a decrease in anxiety. We discuss these results in terms of the rapid eye movement recalibration hypothesis, which suggests rapid eye movement sleep modulates activity in stress‐encoding areas in the brain, leading to both decreased sensitivity and increased selectivity of stress responses.
  4. Abstract This manuscript presents GazeBase, a large-scale longitudinal dataset containing 12,334 monocular eye-movement recordings captured from 322 college-aged participants. Participants completed a battery of seven tasks in two contiguous sessions during each round of recording, including (1) a fixation task, (2) a horizontal saccade task, (3) a random oblique saccade task, (4) a reading task, (5/6) free viewing of cinematic video, and (7) a gaze-driven gaming task. Nine rounds of recording were conducted over a 37-month period, with participants in each subsequent round recruited exclusively from prior rounds. All data was collected using an EyeLink 1000 eye tracker at a 1,000 Hz sampling rate, with a calibration and validation protocol performed before each task to ensure data quality. Due to its large number of participants and longitudinal nature, GazeBase is well suited for exploring research hypotheses in eye movement biometrics, along with other applications applying machine learning to eye movement signal analysis. Classification labels produced by the instrument's real-time parser are provided for a subset of GazeBase, along with pupil area.
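Recordings like those described above are streams of raw gaze samples, and a common first processing step is segmenting them into fixations and saccades. A standard way to do this is a velocity-threshold (I-VT) detector; the sketch below assumes a 1,000 Hz signal of gaze positions in degrees, and the 30 deg/s threshold is a conventional illustrative choice, not a value taken from GazeBase:

```python
# Minimal I-VT (velocity-threshold) fixation labeling for a 1,000 Hz
# monocular gaze signal in degrees. Threshold and layout are assumptions.

def ivt_fixation_samples(x_deg, y_deg, hz=1000, vel_thresh=30.0):
    """Label each sample True (fixation) if gaze velocity < vel_thresh deg/s."""
    labels = [True]  # first sample has no velocity estimate; assume fixation
    for i in range(1, len(x_deg)):
        dx = x_deg[i] - x_deg[i - 1]
        dy = y_deg[i] - y_deg[i - 1]
        vel = (dx * dx + dy * dy) ** 0.5 * hz  # degrees per second
        labels.append(vel < vel_thresh)
    return labels

# Five samples: steady gaze, then a fast saccade-like jump at the end
x = [10.00, 10.01, 10.01, 10.02, 12.00]
y = [5.00, 5.00, 5.01, 5.01, 6.50]
print(ivt_fixation_samples(x, y))  # [True, True, True, True, False]
```

Production parsers (such as the EyeLink instrument's own real-time parser mentioned in the abstract) use more elaborate criteria, but the velocity-threshold idea is the common core.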
  5. Studies of eye movements during source code reading have supported the idea that reading source code differs fundamentally from reading natural text. This paper analyzed an existing dataset of natural language and source code eye movement data using the E-Z Reader model of eye movement control. The results show that the E-Z Reader model can be used with natural text and with source code, where it provides good predictions of eye movement duration. This result is confirmed by comparing model predictions to eye-movement data from this experiment and calculating the correlation score for each metric. Finally, it was found that gaze duration is influenced by token frequency in both code and natural text. The frequency effect is less pronounced on first fixation duration and single fixation duration. An eye movement control model for source code reading may open the door for tools in education and industry to enhance program comprehension.
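The frequency effect discussed in the abstract above is typically quantified as the slope of gaze duration against log token frequency: a negative slope means rarer tokens get longer gazes. As a hedged sketch (the least-squares formulation is standard, but the toy data are invented, not the study's):

```python
# Estimate the token-frequency effect as the least-squares slope of gaze
# duration (ms) on log token frequency. Toy data are illustrative only.
import math

def freq_effect_slope(log_freqs, durations_ms):
    """Ordinary least-squares slope of durations_ms on log_freqs."""
    n = len(log_freqs)
    mx = sum(log_freqs) / n
    my = sum(durations_ms) / n
    num = sum((x - mx) * (y - my) for x, y in zip(log_freqs, durations_ms))
    den = sum((x - mx) ** 2 for x in log_freqs)
    return num / den

# Toy data: rarer tokens (lower log frequency) receive longer gaze durations
log_f = [math.log(5), math.log(50), math.log(500), math.log(5000)]
dur = [320.0, 290.0, 262.0, 231.0]
print(freq_effect_slope(log_f, dur))  # negative slope: a frequency effect
```

Fitting this slope per reader is one simple way the acquisition of the frequency effect could be tracked as novices gain experience.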