In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, and participants) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines for any study involving an eye tracker. We compare this empirical foundation to five existing reporting guidelines and to a database of 207 published eye-tracking studies. We find that reporting guidelines vary substantially and do not match actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on empirical research (Section “An empirically based minimal reporting guideline”).
Feasibility of Longitudinal Eye-Gaze Tracking in the Workplace
Eye movements provide a window into cognitive processes, but much of the research harnessing these data has been confined to the laboratory. We address whether eye gaze can be passively, reliably, and privately recorded in real-world environments across extended timeframes using commercial-off-the-shelf (COTS) sensors. We recorded eye-gaze data from a COTS tracker embedded in participants' (N=20) work environments at pseudorandom intervals across a two-week period. We found that valid samples were recorded approximately 30% of the time, despite calibrating the eye tracker only once and placing no other restrictions on participants. The number of valid samples decreased over days, with the degree of decrease dependent on contextual variables (i.e., frequency of video conferencing) and individual-difference attributes (e.g., sleep quality and multitasking ability). Participants reported that the sensors did not change or impact their work. Our findings suggest the potential for collecting eye-gaze data in authentic environments.
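The valid-sample rate described above can be tallied per day from a log of timestamped validity flags. A minimal sketch, assuming ISO-format timestamps and boolean validity flags; the study's actual sensor output format is not specified here:

```python
# Hypothetical sketch: daily proportion of valid gaze samples from a log of
# (timestamp, is_valid) records. The record format is an assumption, not the
# study's actual data schema.
from collections import defaultdict
from datetime import date, datetime

def daily_valid_rate(samples):
    """Map each calendar day to its fraction of valid samples."""
    counts = defaultdict(lambda: [0, 0])  # day -> [n_valid, n_total]
    for ts, is_valid in samples:
        day = datetime.fromisoformat(ts).date()
        counts[day][0] += int(is_valid)
        counts[day][1] += 1
    return {day: n_valid / n_total
            for day, (n_valid, n_total) in counts.items()}

log = [("2022-03-01T09:00:00", True),
       ("2022-03-01T09:00:01", False),
       ("2022-03-02T10:15:00", True)]
rates = daily_valid_rate(log)
```

Tracking this rate per day is one way to observe the decline in valid samples over time that the study reports.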
- Award ID(s): 1920510
- PAR ID: 10349941
- Date Published:
- Journal Name: Proceedings of the ACM on Human-Computer Interaction
- Volume: 6
- Issue: ETRA
- ISSN: 2573-0142
- Page Range / eLocation ID: 1 to 21
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Evaluating programming proficiency has become more relevant as the demand for coding skills has increased. Current methods, such as questionnaires or interviews, lack intuition, flexibility, and real-time capabilities. In our work, we investigate eye-gaze behavior as an estimate for skill assessment. Specifically, we conducted a study (N=14) using an eye tracker to analyze participants' ability to understand source code by presenting them with a series of programs. We evaluated their eye movements using common eye-tracking metrics and identified mutual task-solving strategies among the participants. While we cannot relate these indicators to programming proficiency directly, this study serves as an evaluation of real-time methods for assessing programming proficiency.
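As an illustration of the "common eye-tracking metrics" mentioned above, a small sketch computing fixation count and mean fixation duration from pre-detected fixation events; the event format (start_ms, end_ms, x, y) is an assumption, not the study's data schema:

```python
# Illustrative only: two common eye-tracking metrics (fixation count and
# mean fixation duration) computed from pre-detected fixation events.
# Each event is a hypothetical (start_ms, end_ms, x, y) tuple.
def fixation_metrics(fixations):
    durations = [end - start for start, end, _, _ in fixations]
    if not durations:
        return {"count": 0, "mean_duration_ms": 0.0}
    return {"count": len(durations),
            "mean_duration_ms": sum(durations) / len(durations)}

events = [(0, 250, 100, 200),    # durations: 250 ms,
          (300, 500, 120, 210),  # 200 ms,
          (550, 900, 400, 80)]   # and 350 ms
metrics = fixation_metrics(events)
```

Metrics like these are typically computed after a separate fixation-detection step (e.g., dispersion- or velocity-based event classification) has segmented the raw gaze stream.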
- Improved vergence and accommodation via Purkinje image tracking with multiple cameras for AR glasses
We present a personalized, comprehensive eye-tracking solution based on tracking higher-order Purkinje images, suited explicitly for eyeglasses-style AR and VR displays. Existing eye-tracking systems for near-eye applications are typically designed to work in an on-axis configuration and rely on pupil center and corneal reflections (PCCR) to estimate gaze with an accuracy of only about 0.5° to 1°. They are often expensive, bulky in form factor, and fail to estimate monocular accommodation, which is crucial for focus adjustment within AR glasses. Our system independently measures binocular vergence and monocular accommodation using higher-order Purkinje reflections from the eye, extending PCCR-based methods. We demonstrate that these reflections are sensitive to both gaze rotation and lens accommodation, and we model the Purkinje images' behavior in simulation. We also design and fabricate a user-customized eye tracker using inexpensive off-the-shelf cameras and LEDs. We use an end-to-end convolutional neural network (CNN) to calibrate the eye tracker for the individual user, allowing robust and simultaneous estimation of vergence and accommodation. Experimental results show that our solution, specifically catering to individual users, outperforms state-of-the-art methods for vergence and depth estimation, achieving accuracies of 0.3782° and 1.108 cm, respectively.
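For readers unfamiliar with the quantity being estimated, binocular vergence is the angle between the two eyes' gaze directions. A hedged illustration of that geometry using standard vector math; this is not the paper's Purkinje-based pipeline:

```python
# Illustration only: vergence angle (in degrees) between two unit gaze
# direction vectors, one per eye, computed from their dot product.
import math

def vergence_deg(left_dir, right_dir):
    dot = sum(a * b for a, b in zip(left_dir, right_dir))
    norms = (math.sqrt(sum(a * a for a in left_dir))
             * math.sqrt(sum(b * b for b in right_dir)))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norms))))

# Eyes rotated symmetrically inward by 2° each -> 4° total vergence.
left = (math.sin(math.radians(2)), 0.0, math.cos(math.radians(2)))
right = (math.sin(math.radians(-2)), 0.0, math.cos(math.radians(-2)))
angle = vergence_deg(left, right)  # ≈ 4.0
```

Larger vergence angles correspond to nearer fixation depths, which is why vergence (together with accommodation) can serve as a depth estimate.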
- Fogt, Nick (Ed.)
Primates can rapidly detect potential predators and modify their behavior based on the level of risk. The gaze direction of predators is one feature that primates can use to assess risk levels: recognition of a predator's direct stare indicates to prey that it has been detected and that the level of risk is relatively high. Predation has likely shaped visual attention in primates to quickly assess the level of risk, but we know little about the constellation of low-level (e.g., contrast, color) and higher-order (e.g., category membership, perceived threat) visual features that primates use to do so. We therefore presented human and chimpanzee (Pan troglodytes) participants with photographs of potential predators (lions) and prey (impala) while we recorded their overt attention with an eye tracker. The gaze of the predators and prey was either directed or averted. We found that both humans and chimpanzees visually fixated the eyes of predators more than those of prey. In addition, they directed the most attention toward the eyes of directed (rather than averted) predators. Humans, but not chimpanzees, gazed at the eyes of the predators and prey more than at other features. Importantly, low-level visual features of the predators and prey did not provide a good explanation of the observed gaze patterns.
- Abstract: This manuscript presents GazeBase, a large-scale longitudinal dataset containing 12,334 monocular eye-movement recordings captured from 322 college-aged participants. Participants completed a battery of seven tasks in two contiguous sessions during each round of recording: (1) a fixation task, (2) a horizontal saccade task, (3) a random oblique saccade task, (4) a reading task, (5/6) free viewing of cinematic video, and (7) a gaze-driven gaming task. Nine rounds of recording were conducted over a 37-month period, with participants in each subsequent round recruited exclusively from prior rounds. All data were collected using an EyeLink 1000 eye tracker at a 1,000 Hz sampling rate, with a calibration and validation protocol performed before each task to ensure data quality. Due to its large number of participants and longitudinal nature, GazeBase is well suited for exploring research hypotheses in eye-movement biometrics, along with other applications of machine learning to eye-movement signal analysis. Classification labels produced by the instrument's real-time parser are provided for a subset of GazeBase, along with pupil area.
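A minimal sketch of one sanity check this kind of dataset invites: estimating a recording's effective sampling rate from its sample timestamps and comparing it to the nominal 1,000 Hz. Millisecond timestamps are an assumption here, not GazeBase's documented file layout:

```python
# Illustrative check: estimate the effective sampling rate of a recording
# from its timestamps (assumed to be in milliseconds).
def estimated_rate_hz(timestamps_ms):
    if len(timestamps_ms) < 2:
        raise ValueError("need at least two samples")
    span_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return (len(timestamps_ms) - 1) / span_s

ts = list(range(0, 5000))     # 5,000 samples spaced exactly 1 ms apart
rate = estimated_rate_hz(ts)  # 1000.0 Hz for perfectly regular 1 ms spacing
```

A rate well below nominal would indicate dropped samples, which matters for downstream analyses such as saccade velocity estimation.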