Title: Eye Movement and Pupil Measures: A Review
Our subjective visual experience involves a complex interaction between our eyes, our brain, and the surrounding world. This interaction gives us the sense of sight: color, stereopsis, distance, pattern recognition, motor coordination, and more. The increasing ubiquity of gaze-aware technology brings with it the ability to track gaze and pupil measures with varying degrees of fidelity. With this in mind, a review that considers the various gaze measures becomes increasingly relevant, especially given our ability to make sense of these signals under different spatio-temporal sampling capacities. In this paper, we selectively review prior work on eye movements and pupil measures. We first describe the main oculomotor events studied in the literature and their characteristics exploited by different measures. Next, we review various eye movement and pupil measures from the prior literature. Finally, we discuss our observations based on applications of these measures, the benefits and practical challenges of these measures, and our recommendations on future eye-tracking research directions.
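The oculomotor events the review describes are commonly segmented from the raw gaze signal before any measure is computed. As a minimal sketch of one standard approach (not a method from the paper itself), the following implements a velocity-threshold (I-VT) classifier; the sampling rate and velocity threshold are illustrative assumptions.

```python
# Minimal I-VT (velocity-threshold) sketch: label each gaze sample as part of
# a fixation or a saccade based on point-to-point angular velocity.
# SAMPLE_RATE_HZ and SACCADE_THRESH_DEG_S are assumed values, not from the paper.
import math

SAMPLE_RATE_HZ = 250          # assumed eye-tracker sampling rate
SACCADE_THRESH_DEG_S = 30.0   # commonly used saccade-onset velocity threshold

def classify_ivt(gaze_deg):
    """gaze_deg: list of (x, y) gaze positions in degrees of visual angle.
    Returns one label per sample: 'fixation' or 'saccade'."""
    labels = ['fixation']  # first sample has no velocity estimate
    for (x0, y0), (x1, y1) in zip(gaze_deg, gaze_deg[1:]):
        # point-to-point angular velocity in deg/s
        vel = math.hypot(x1 - x0, y1 - y0) * SAMPLE_RATE_HZ
        labels.append('saccade' if vel > SACCADE_THRESH_DEG_S else 'fixation')
    return labels
```

Runs of 'fixation' labels can then be merged into fixation events, from which duration- and dispersion-based measures are derived.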
Award ID(s):
2045523
PAR ID:
10313299
Author(s) / Creator(s):
Date Published:
Journal Name:
Frontiers in Computer Science
Volume:
3
ISSN:
2624-9898
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Eye tracking has already made its way to current commercial wearable display devices, and is becoming increasingly important for virtual and augmented reality applications. However, existing model-based eye-tracking solutions are not capable of very accurate gaze-angle measurements, and may not be sufficient to solve challenging display problems such as pupil steering or eyebox expansion. In this paper, we argue that accurate detection and localization of the pupil in 3D space is a necessary intermediate step in model-based eye tracking. Existing methods and datasets either ignore evaluating the accuracy of 3D pupil localization or evaluate it only on synthetic data. To this end, we capture the first 3D pupil–gaze measurement dataset using a high-precision setup with head stabilization and release it as the first benchmark dataset to evaluate both 3D pupil localization and gaze tracking methods. Furthermore, we utilize an advanced eye model to replace the commonly used oversimplified eye model. Leveraging the eye model, we propose a novel 3D pupil localization method with a deep learning-based corneal refraction correction. We demonstrate that our method outperforms the state-of-the-art works by reducing the 3D pupil localization error by 47.5% and the gaze estimation error by 18.7%. Our dataset and codes can be found here: link.
  2. Algorithms for the estimation of gaze direction from mobile and video-based eye trackers typically involve tracking a feature of the eye that moves through the eye camera image in a way that covaries with the shifting gaze direction, such as the center or boundaries of the pupil. Tracking these features using traditional computer vision techniques can be difficult due to partial occlusion and environmental reflections. Although recent efforts to use machine learning (ML) for pupil tracking have demonstrated superior results when evaluated using standard measures of segmentation performance, little is known of how these networks may affect the quality of the final gaze estimate. This work provides an objective assessment of the impact of several contemporary ML-based methods for eye feature tracking when the subsequent gaze estimate is produced using either feature-based or model-based methods. Metrics include the accuracy and precision of the gaze estimate, as well as drop-out rate. 
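The gaze-quality metrics named in the abstract above have conventional definitions: accuracy is the mean angular offset between the gaze estimate and the known target, and precision is the RMS of successive inter-sample angular differences. A minimal sketch of both, with illustrative function names not taken from the paper:

```python
# Conventional gaze-quality metrics: accuracy (mean angular offset from the
# true target) and precision (RMS of sample-to-sample angular differences).
# All positions are in degrees of visual angle.
import math

def gaze_accuracy_deg(gaze, target):
    """Mean angular distance (deg) between gaze samples and the true target."""
    tx, ty = target
    return sum(math.hypot(x - tx, y - ty) for x, y in gaze) / len(gaze)

def gaze_precision_rms_deg(gaze):
    """RMS of successive inter-sample angular distances (deg)."""
    diffs = [math.hypot(x1 - x0, y1 - y0)
             for (x0, y0), (x1, y1) in zip(gaze, gaze[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

The drop-out rate the abstract also mentions is simply the fraction of samples for which the tracker returned no valid gaze estimate.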
  3. Researchers have been employing psycho-physiological measures to better understand program comprehension, for example, simultaneous fMRI and eye tracking to validate top-down comprehension models. In this paper, we argue that there is additional value in eye-tracking data beyond eye gaze: pupil dilation and blink rates may offer insights into programmers' cognitive load. However, the fMRI environment may influence pupil dilation and blink rates, which would diminish their informative value. We conducted a preliminary analysis of pupil dilation and blink rates from an fMRI experiment with 22 student participants. We conclude from our preliminary analysis that correcting for our fMRI environment is challenging but possible, such that we can use pupil dilation and blink rates to more reliably observe program comprehension.
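Pupil-dilation analyses of the kind described above typically apply a subtractive baseline correction, and blink rates are often estimated from tracker dropouts. A minimal sketch under those common conventions (the window length, sampling rate, and use of `None` for dropouts are illustrative assumptions, not the authors' pipeline):

```python
# Subtractive baseline correction for a pupil-diameter trace, plus a simple
# blink-rate estimate that treats runs of None (tracker dropout) as blinks.
# SAMPLE_RATE_HZ and baseline_n are assumed values.

SAMPLE_RATE_HZ = 60  # assumed sampling rate

def baseline_correct(pupil_mm, baseline_n=30):
    """Subtract the mean of the first baseline_n valid samples from every
    valid sample; None (blink/dropout) samples are preserved as None."""
    valid = [p for p in pupil_mm[:baseline_n] if p is not None]
    base = sum(valid) / len(valid)
    return [None if p is None else p - base for p in pupil_mm]

def blink_rate_per_min(pupil_mm):
    """Count each contiguous run of None samples as one blink, scaled to
    blinks per minute of recording."""
    blinks, in_blink = 0, False
    for p in pupil_mm:
        if p is None and not in_blink:
            blinks, in_blink = blinks + 1, True
        elif p is not None:
            in_blink = False
    minutes = len(pupil_mm) / SAMPLE_RATE_HZ / 60.0
    return blinks / minutes
```

Baseline correction makes dilation comparable across trials and participants, which is exactly the kind of normalization an fMRI-environment correction would build on.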
  4. Infants vary in their ability to follow others' gazes, but it is unclear how these individual differences emerge. We tested whether social motivation levels in early infancy predict later gaze following skills. We longitudinally tracked infants' (N = 82) gazes and pupil dilation while they observed videos of a woman looking into the camera simulating eye contact (i.e., mutual gaze) and then gazing toward one of two objects, at 2, 4, 6, 8, and 14 months of age. To improve measurement validity, we used confirmatory factor analysis to combine multiple observed measures to index the underlying constructs of social motivation and gaze following. Infants' social motivation—indexed by their speed of social orienting, duration of mutual gaze, and degree of pupil dilation during mutual gaze—was developmentally stable and positively predicted the development of gaze following—indexed by their proportion of time looking to the target object, first object look difference scores, and first face-to-object saccade difference scores—from 6 to 14 months of age. These findings suggest that infants' social motivation likely plays a role in the development of gaze following and highlight the use of a multi-measure approach to improve measurement sensitivity and validity in infancy research.
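Two of the observed gaze-following indices named in the abstract above have straightforward conventional forms; a minimal sketch (function names are illustrative, not the authors' code):

```python
# Conventional forms of two gaze-following indices: proportion of
# object-directed looking time that went to the cued target, and a
# first-look difference score (target-first minus distractor-first trials).

def proportion_to_target(target_ms, distractor_ms):
    """Share of object-directed looking time spent on the cued target."""
    return target_ms / (target_ms + distractor_ms)

def first_look_difference(n_first_target, n_first_distractor):
    """Trials whose first object look went to the target, minus those
    whose first look went to the distractor."""
    return n_first_target - n_first_distractor
```

Under a confirmatory factor analysis, indices like these load on a single latent gaze-following factor rather than being analyzed in isolation.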
  5.
    We present a personalized, comprehensive eye-tracking solution based on tracking higher-order Purkinje images, suited explicitly for eyeglasses-style AR and VR displays. Existing eye-tracking systems for near-eye applications are typically designed to work in an on-axis configuration and rely on pupil center and corneal reflections (PCCR) to estimate gaze with an accuracy of only about 0.5° to 1°. These systems are often expensive, bulky in form factor, and fail to estimate monocular accommodation, which is crucial for focus adjustment within AR glasses. Our system independently measures binocular vergence and monocular accommodation using higher-order Purkinje reflections from the eye, extending PCCR-based methods. We demonstrate that these reflections are sensitive to both gaze rotation and lens accommodation, and we model the Purkinje images' behavior in simulation. We also design and fabricate a user-customized eye tracker using cheap off-the-shelf cameras and LEDs. We use an end-to-end convolutional neural network (CNN) to calibrate the eye tracker for the individual user, allowing for robust and simultaneous estimation of vergence and accommodation. Experimental results show that our solution, specifically catering to individual users, outperforms state-of-the-art methods for vergence and depth estimation, achieving accuracies of 0.3782° and 1.108 cm, respectively.
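The link between binocular vergence and fixation depth that systems like the one above exploit is simple triangulation geometry. A minimal sketch of that relation (the interpupillary distance is an illustrative assumption, and this is the textbook geometry, not the paper's estimation pipeline):

```python
# Geometric relation between fixation depth and binocular vergence angle for
# a target straight ahead of the midpoint between the eyes.
# IPD_CM is an assumed interpupillary distance.
import math

IPD_CM = 6.3  # assumed interpupillary distance in cm

def vergence_deg(depth_cm):
    """Vergence angle (deg) when fixating a target depth_cm straight ahead."""
    return math.degrees(2.0 * math.atan((IPD_CM / 2.0) / depth_cm))

def fixation_depth_cm(vergence):
    """Invert the relation: recover fixation depth (cm) from vergence (deg)."""
    return (IPD_CM / 2.0) / math.tan(math.radians(vergence) / 2.0)
```

Because depth falls off steeply with vergence angle at far distances, small angular errors translate into large depth errors, which is why sub-degree vergence accuracy matters for depth estimation.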