Title: Eye Movement and Pupil Measures: A Review
Our subjective visual experience arises from complex interactions between our eyes, our brain, and the surrounding world. These interactions give us the sense of sight, color, stereopsis, distance, pattern recognition, motor coordination, and more. The increasing ubiquity of gaze-aware technology brings with it the ability to track gaze and pupil measures with varying degrees of fidelity. With this in mind, a review that considers the various gaze measures becomes increasingly relevant, especially given our ability to make sense of these signals under different spatio-temporal sampling capacities. In this paper, we selectively review prior work on eye movements and pupil measures. We first describe the main oculomotor events studied in the literature and the characteristics of each that different measures exploit. Next, we review various eye movement and pupil measures from the prior literature. Finally, we discuss our observations on the applications of these measures, the benefits and practical challenges they involve, and our recommendations for future eye-tracking research directions.
Award ID(s):
2045523
NSF-PAR ID:
10313299
Date Published:
Journal Name:
Frontiers in Computer Science
Volume:
3
ISSN:
2624-9898
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Infants vary in their ability to follow others’ gazes, but it is unclear how these individual differences emerge. We tested whether social motivation levels in early infancy predict later gaze following skills. We longitudinally tracked infants’ (N = 82) gazes and pupil dilation while they observed videos of a woman looking into the camera simulating eye contact (i.e., mutual gaze) and then gazing toward one of two objects, at 2, 4, 6, 8, and 14 months of age. To improve measurement validity, we used confirmatory factor analysis to combine multiple observed measures to index the underlying constructs of social motivation and gaze following. Infants’ social motivation—indexed by their speed of social orienting, duration of mutual gaze, and degree of pupil dilation during mutual gaze—was developmentally stable and positively predicted the development of gaze following—indexed by their proportion of time looking to the target object, first object look difference scores, and first face‐to‐object saccade difference scores—from 6 to 14 months of age. These findings suggest that infants’ social motivation likely plays a role in the development of gaze following and highlight the use of a multi‐measure approach to improve measurement sensitivity and validity in infancy research.
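    As a rough illustration of the multi-measure approach described above, the following is a minimal confirmatory factor analysis sketch in Python using the semopy library (one possible tool; the paper does not specify its software). The column names, file name, and single-timepoint simplification are hypothetical placeholders, not the authors' pipeline.

    ```python
    # Hypothetical CFA sketch: combine observed eye-tracking measures into
    # latent constructs (social motivation, gaze following), then estimate
    # the structural path between them. NOT the authors' code or data.
    import pandas as pd
    from semopy import Model

    data = pd.read_csv("infant_measures.csv")  # hypothetical per-infant measures

    desc = """
    social_motivation =~ orienting_speed + mutual_gaze_duration + pupil_dilation
    gaze_following    =~ prop_target_looking + first_look_diff + first_saccade_diff
    gaze_following ~ social_motivation
    """

    model = Model(desc)
    model.fit(data)
    print(model.inspect())  # factor loadings and the structural path estimate
    ```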

     
  2. Background

    In Physical Human–Robot Interaction (pHRI), the need to learn the robot’s motor-control dynamics is associated with increased cognitive load. Eye-tracking metrics can help understand the dynamics of fluctuating mental workload over the course of learning.

    Objective

    The aim of this study was to test the sensitivity of eye-tracking measures to variations in task difficulty, their reliability, and their ability to predict performance in physical human–robot collaboration tasks involving an industrial robot for object co-manipulation.

    Methods

    Participants (9M, 9F) learned to co-perform a virtual pick-and-place task with a bimanual robot over multiple trials. The joint stiffness of the robot was manipulated to increase motor-coordination demands. The psychometric properties of eye-tracking measures and their ability to predict performance were investigated.

    Results

    Stationary Gaze Entropy and pupil diameter were the most reliable and sensitive measures of the workload associated with changes in task difficulty and learning. Increased task difficulty was more likely to result in a robot-monitoring strategy. Eye-tracking measures predicted success or failure on each trial with 70% sensitivity and 71% accuracy.
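    For readers unfamiliar with the measure, here is a minimal sketch of how Stationary Gaze Entropy is commonly computed: the Shannon entropy of the distribution of fixation locations over a spatial grid. The grid size, screen dimensions, and synthetic data are assumptions for illustration, not the study's pipeline.

    ```python
    import numpy as np

    def stationary_gaze_entropy(fix_x, fix_y, screen_w=1920, screen_h=1080, n_bins=8):
        """Shannon entropy (bits) of fixation counts over an n_bins x n_bins grid."""
        hist, _, _ = np.histogram2d(
            fix_x, fix_y, bins=n_bins, range=[[0, screen_w], [0, screen_h]]
        )
        p = hist.ravel() / hist.sum()  # stationary probability of each grid cell
        p = p[p > 0]                   # drop empty cells (0 * log 0 := 0)
        return -np.sum(p * np.log2(p))

    # Synthetic example: dispersed gaze yields high entropy; gaze concentrated
    # on one region (e.g., a robot-monitoring strategy) yields low entropy.
    rng = np.random.default_rng(0)
    dispersed = stationary_gaze_entropy(rng.uniform(0, 1920, 500), rng.uniform(0, 1080, 500))
    focused = stationary_gaze_entropy(rng.normal(960, 50, 500), rng.normal(540, 50, 500))
    print(f"dispersed: {dispersed:.2f} bits, focused: {focused:.2f} bits")
    ```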

    Conclusion

    The sensitivity and reliability of the eye-tracking measures were acceptable, although the values were lower than those observed in cognitive domains. Measures of gaze behavior indicative of visual monitoring strategies were most sensitive to task-difficulty manipulations and should be explored further for the pHRI domain, where motor control and internal-model formation are likely to be strong contributors to workload.

    Application

    Future collaborative robots could adapt to the human operator's cognitive state and skill level, as measured using eye-tracking indices of workload and visual attention.

     
  3. Background

    Simulation has revolutionized teaching and learning. However, traditional manikins are limited in their ability to exhibit emotions, movements, and interactive eye gaze. As a result, students struggle with immersion and may be unable to authentically relate to the patient.

    Intervention

    We developed a new type of patient simulator called the Physical-Virtual Patient (PVP), which combines the physicality of manikins with the richness of dynamic visuals. The PVP uses spatial Augmented Reality to rear-project dynamic imagery (e.g., facial expressions, ptosis, pupil reactions) onto a semi-transparent physical shell. The shell occupies space and matches the dimensions of a human head.

    Methods

    We compared two groups of third-semester nursing students (N=59) from a baccalaureate program using a between-participant design: one group interacted with a traditional high-fidelity manikin and the other with the more realistic PVP head. The learners had to perform a neurological assessment. We measured authenticity, urgency, and learning.

    Results

    Learners had a more realistic encounter with the PVP patient (p=0.046) and were more engaged with the PVP condition than with the manikin in terms of authenticity of the encounter and cognitive strategies. The PVP provoked a higher sense of urgency (p=0.002). The PVP group also showed greater learning than the manikin group, as measured by pre- and post-simulation scores (p=0.027).

    Conclusion

    The realism of the visuals in the PVP increases authenticity and engagement, which results in a greater sense of urgency and greater overall learning.
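    As a side note on the kind of between-participant comparison reported above, here is a minimal sketch in Python; the arrays are synthetic placeholders, and the exact tests used by the authors are not specified here.

    ```python
    # Hypothetical between-groups comparison of learning gains
    # (post- minus pre-simulation scores). NOT the study's data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    manikin_gain = rng.normal(loc=1.0, scale=1.0, size=30)  # synthetic placeholder
    pvp_gain = rng.normal(loc=1.8, scale=1.0, size=29)      # synthetic placeholder

    t, p = stats.ttest_ind(pvp_gain, manikin_gain)
    print(f"t = {t:.2f}, p = {p:.3f}")
    ```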
  4. Researchers have been employing psycho-physiological measures to better understand program comprehension, for example, simultaneous fMRI and eye tracking to validate top-down comprehension models. In this paper, we argue that there is additional value in eye-tracking data beyond eye gaze: pupil dilation and blink rates may offer insights into programmers' cognitive load. However, the fMRI environment may influence pupil dilation and blink rates, which would diminish their informative value. We conducted a preliminary analysis of the pupil dilation and blink rates from an fMRI experiment with 22 student participants. We conclude from this preliminary analysis that correcting for the fMRI environment is challenging but possible, so that pupil dilation and blink rates can be used to observe program comprehension more reliably.
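    To make the two signals concrete, here is a minimal sketch of deriving baseline-corrected pupil dilation and blink rate from raw eye-tracker samples; the sampling rate, the blink heuristic (runs of lost pupil samples), and the baseline window are all assumptions, not the paper's pipeline.

    ```python
    import numpy as np

    def pupil_dilation(pupil, baseline_n=120):
        """Subtractive baseline correction against the first baseline_n samples."""
        return pupil - np.nanmean(pupil[:baseline_n])

    def blink_rate(pupil, hz=60.0, min_gap=3):
        """Blinks per minute, counting each run of >= min_gap consecutive
        NaN samples (pupil signal loss) as one blink."""
        lost = np.isnan(pupil).astype(int)
        edges = np.diff(np.concatenate(([0], lost, [0])))  # run boundaries
        starts, ends = np.where(edges == 1)[0], np.where(edges == -1)[0]
        blinks = np.sum((ends - starts) >= min_gap)
        return blinks / (len(pupil) / hz / 60.0)
    ```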
  5. We present a personalized, comprehensive eye-tracking solution based on tracking higher-order Purkinje images, suited explicitly for eyeglasses-style AR and VR displays. Existing eye-tracking systems for near-eye applications are typically designed to work in an on-axis configuration and rely on pupil center and corneal reflections (PCCR) to estimate gaze with an accuracy of only about 0.5° to 1°. They are often expensive, bulky in form factor, and fail to estimate monocular accommodation, which is crucial for focus adjustment within AR glasses. Our system independently measures binocular vergence and monocular accommodation using higher-order Purkinje reflections from the eye, extending PCCR-based methods. We demonstrate that these reflections are sensitive to both gaze rotation and lens accommodation, and we model the Purkinje images' behavior in simulation. We also design and fabricate a user-customized eye tracker using inexpensive off-the-shelf cameras and LEDs. We use an end-to-end convolutional neural network (CNN) to calibrate the eye tracker for the individual user, allowing for robust and simultaneous estimation of vergence and accommodation. Experimental results show that our solution, specifically catering to individual users, outperforms state-of-the-art methods for vergence and depth estimation, achieving accuracies of 0.3782° and 1.108 cm, respectively.
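    As an illustration of the end-to-end calibration idea, here is a minimal PyTorch sketch of a CNN that regresses vergence and accommodation from an eye-camera image containing Purkinje reflections; the architecture, input size, and output units are hypothetical, not the authors' network.

    ```python
    import torch
    import torch.nn as nn

    class PurkinjeGazeNet(nn.Module):
        """Hypothetical regressor: eye-camera image -> (vergence, accommodation)."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(64, 2)  # [vergence (deg), accommodation depth (cm)]

        def forward(self, x):
            return self.head(self.features(x).flatten(1))

    model = PurkinjeGazeNet()
    frame = torch.randn(1, 1, 96, 160)  # hypothetical grayscale eye-camera crop
    vergence, accommodation = model(frame).unbind(dim=1)
    ```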