Title: Estimating Perceptual Depth Changes with Eye Vergence and Interpupillary Distance using an Eye Tracker in Virtual Reality
Virtual Reality (VR) technology has advanced to include eye tracking, enabling novel research such as investigating how our visual system coordinates eye movements with changes in perceptual depth. The purpose of this study was to examine whether eye tracking could track perceptual depth changes during a visual discrimination task. We derived two depth-dependent variables from eye tracker data: eye vergence angle (EVA) and interpupillary distance (IPD). As hypothesized, our results revealed that shifting gaze from near to far depth significantly decreased EVA and increased IPD, while the opposite pattern was observed when shifting from far to near. Importantly, the amount of change in these variables tracked closely with relative changes in perceptual depth, supporting the hypothesis that eye tracker data may be used to infer real-time changes in perceptual depth in VR. Our method could serve as a new tool to adaptively render information based on depth and improve the VR user experience.
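The two derived measures follow directly from the per-eye gaze rays that VR eye trackers commonly expose. As a minimal sketch (not the paper's code; the function and its inputs are hypothetical names, not any SDK's API), EVA can be computed as the angle between the two gaze direction vectors and IPD as the distance between the two eye positions:

```python
import numpy as np

def eva_and_ipd(left_origin, left_dir, right_origin, right_dir):
    """Sketch: derive eye vergence angle (EVA, degrees) and interpupillary
    distance (IPD, meters) from per-eye gaze rays. Names are illustrative."""
    # IPD: distance between the two eye (pupil) positions.
    ipd = np.linalg.norm(np.asarray(right_origin) - np.asarray(left_origin))
    # EVA: angle between the normalized gaze direction vectors.
    l = np.asarray(left_dir) / np.linalg.norm(left_dir)
    r = np.asarray(right_dir) / np.linalg.norm(right_dir)
    eva = np.degrees(np.arccos(np.clip(np.dot(l, r), -1.0, 1.0)))
    return eva, ipd

# Example: eyes 63 mm apart converging on a point 0.5 m straight ahead.
left_o, right_o = np.array([-0.0315, 0.0, 0.0]), np.array([0.0315, 0.0, 0.0])
target = np.array([0.0, 0.0, 0.5])
eva, ipd = eva_and_ipd(left_o, target - left_o, right_o, target - right_o)
print(f"EVA ~ {eva:.2f} deg, IPD = {ipd * 1000:.1f} mm")
```

On this geometry the vergence angle shrinks as the target moves farther away, which is the near-to-far EVA decrease the abstract describes.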
Award ID(s):
1937565
PAR ID:
10390194
Author(s) / Creator(s):
; ; ;
Editor(s):
Blascheck, Tanja; Bradshaw, Jessica; Vrzakova, Hana
Date Published:
Journal Name:
ACM Symposium on Eye Tracking Research and Applications
Page Range / eLocation ID:
1 to 7
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Immersive, interactive virtual reality (VR) experiences rely on eye tracking data for a variety of applications. However, eye trackers assume that the user's eyes move in a coordinated way. We investigate how the violation of this assumption impacts the performance and subjective experience of users with strabismus and amblyopia. Our investigation follows a case study approach by analyzing in depth the qualitative and quantitative data collected during an interactive VR game by a small number of users with these visual impairments. Our findings reveal the ways in which assumptions about the default functioning of the eye can discourage or even exclude otherwise enthusiastic users from immersive VR. This study thus opens a new frontier for eye tracking research and practice. 
  2. Interpupillary distance (IPD) is the most important parameter for creating user-specific stereo parallax, which in turn is crucial for correct depth perception. This is why contemporary Head-Mounted Displays (HMDs) offer adjustable lenses to adapt to users' individual IPDs. However, today's Video See-Through Augmented Reality (VST AR) HMDs use fixed camera placements to reconstruct the stereoscopic view of a user's environment. This leads to a potential mismatch between individual IPD settings and the fixed Inter-Camera Distance (ICD), which can cause perceptual incongruencies, limiting the usability and, potentially, the applicability of VST AR in depth-sensitive use cases. To investigate this incongruency between IPD and ICD, we conducted a 2 × 3 mixed-factor user study using a near-field, open-loop reaching task comparing distance judgments in Virtual Reality (VR) and VST AR. We also investigated changes in reaching performance via perceptual calibration by incorporating a feedback phase between pre- and post-phase conditions, with a particular focus on the influence of IPD-ICD differences. Our Linear Mixed Model (LMM) analysis showed a significant difference between VR and VST AR, an effect of IPD-ICD mismatch, and a combined effect of both factors. However, subjective measures showed no effect, underlining the subconscious nature of VST AR perception. This novel insight and its consequences are discussed specifically for depth perception tasks in AR, eXtended Reality (XR), and potential use cases.
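The depth distortion such a mismatch produces can be approximated with first-order vergence geometry. This is an illustrative sketch under simplified assumptions (matched capture and display geometry), not the study's model, and `perceived_depth` is a hypothetical helper:

```python
import math

def perceived_depth(true_depth_m, ipd_m, icd_m):
    """Sketch: if a point at true_depth_m is captured by cameras icd_m
    apart, find the depth at which eyes ipd_m apart reproduce the same
    vergence angle (first-order approximation, not the study's model)."""
    theta = 2 * math.atan(icd_m / (2 * true_depth_m))   # capture vergence angle
    return ipd_m / (2 * math.tan(theta / 2))            # matching viewer depth

# A fixed 60 mm ICD headset worn by a 66 mm IPD user: near objects
# would appear slightly farther away than they are.
print(f"{perceived_depth(0.5, ipd_m=0.066, icd_m=0.060):.3f} m")  # ~0.550 m
```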
  3. As virtual reality (VR) garners more attention for eye tracking research, knowledge of the accuracy and precision of head-mounted display (HMD) based eye trackers becomes increasingly necessary. It is tempting to rely on manufacturer-provided information about the accuracy and precision of an eye tracker. However, unless data is collected under ideal conditions, these values seldom align with on-site metrics. Therefore, best practices dictate that accuracy and precision should be measured and reported for each study. To address this issue, we provide a novel open-source suite for rigorously measuring accuracy and precision with a variety of HMD-based eye trackers. The tool is customizable without altering the source code, and code changes allow for further adaptation. The outputs are available in real time and easy to interpret, making eye tracking in VR more approachable for all users.
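A common way to quantify the two metrics (assumed here; the suite's own definitions may differ) is accuracy as the mean angular offset from a known fixation target and precision as the root-mean-square of sample-to-sample angular displacement:

```python
import numpy as np

def accuracy_and_precision(gaze_dirs, target_dir):
    """Sketch of standard angular metrics over one fixation: accuracy as
    the mean offset from the target, precision as RMS inter-sample movement.
    These are assumed conventions, not necessarily the suite's."""
    g = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
    t = target_dir / np.linalg.norm(target_dir)
    offsets = np.degrees(np.arccos(np.clip(g @ t, -1.0, 1.0)))
    accuracy = offsets.mean()
    steps = np.degrees(np.arccos(np.clip((g[:-1] * g[1:]).sum(axis=1), -1.0, 1.0)))
    precision_rms = np.sqrt((steps ** 2).mean())
    return accuracy, precision_rms

# Demo: 500 noisy gaze samples while fixating a target straight ahead.
rng = np.random.default_rng(0)
gaze = np.array([0.0, 0.0, 1.0]) + rng.normal(scale=0.01, size=(500, 3))
print(accuracy_and_precision(gaze, np.array([0.0, 0.0, 1.0])))
```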
  4. Emerging Virtual Reality (VR) displays with embedded eye trackers are becoming commodity hardware (e.g., HTC Vive Pro Eye). Eye-tracking data can be utilized for several purposes, including gaze monitoring, privacy protection, and user authentication/identification. Identifying users is an integral part of many applications due to security and privacy concerns. In this paper, we explore methods and eye-tracking features that can be used to identify users. Prior VR researchers have explored machine learning on motion-based data (such as body motion, head tracking, eye tracking, and hand tracking data) to identify users. Such systems usually require an explicit VR task and many features to train the machine learning model for user identification. We propose a system that identifies users using minimal eye-gaze-based features without designing any identification-specific tasks. We collected gaze data from an educational VR application and tested our system with two machine learning (ML) models, random forest (RF) and k-nearest neighbors (kNN), and two deep learning (DL) models, convolutional neural networks (CNN) and long short-term memory (LSTM). Our results show that ML and DL models could identify users with over 98% accuracy using only six simple eye-gaze features. We discuss our results, their implications for security and privacy, and the limitations of our work.
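The classification step can be sketched with scikit-learn; the six features and the data below are synthetic stand-ins, not the paper's dataset or its exact feature set:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy data: rows = gaze windows, columns = six simple eye-gaze features
# (e.g., fixation/saccade statistics; placeholder features, not the
# paper's exact six). Each of six "users" gets a shifted distribution.
rng = np.random.default_rng(0)
y = np.repeat(np.arange(6), 100)                  # six users, 100 windows each
X = rng.normal(size=(600, 6)) + y[:, None] * 0.5  # user-specific offsets

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"identification accuracy: {clf.score(X_te, y_te):.2%}")
```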
  5. We introduce SearchGazer, a web-based eye tracker for remote web search studies using common webcams already present in laptops and some desktop computers. SearchGazer is a pure JavaScript library that infers the gaze behavior of searchers in real time. The eye tracking model self-calibrates by watching searchers interact with the search pages and trains a mapping of eye features to gaze locations and search page elements on the screen. Contrary to typical eye tracking studies in information retrieval, this approach does not require the purchase of any additional specialized equipment, and can be done remotely in a user's natural environment, leading to cheaper and easier visual attention studies. While SearchGazer is not intended to be as accurate as specialized eye trackers, it is able to replicate many of the research findings of three seminal information retrieval papers: two that used eye tracking devices, and one that used the mouse cursor as a restricted focus viewer. Charts and heatmaps from those original papers are plotted side by side with SearchGazer results. While the main results are similar, there are some notable differences, which we hypothesize derive from improvements in the ranking technologies used by current search engines and from the diligence of remote users. As part of this paper, we also release SearchGazer as a library that can be integrated into any search page.
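The self-calibration idea (pair the current eye-feature vector with a known screen location whenever the searcher interacts, e.g., clicks, then learn a feature-to-gaze mapping) can be sketched as a simple regression. This toy uses synthetic data and ridge regression, not SearchGazer's actual JavaScript model or features:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Toy sketch: each interaction pairs an eye-feature vector (e.g., pupil
# patch pixels) with the known screen point the user clicked; a ridge
# regressor then maps features to (x, y) gaze coordinates.
rng = np.random.default_rng(1)
features = rng.normal(size=(200, 12))             # eye features per interaction
true_map = rng.normal(size=(12, 2))               # hidden feature-to-screen map
click_xy = features @ true_map + rng.normal(scale=5.0, size=(200, 2))

model = Ridge(alpha=1.0).fit(features, click_xy)  # self-calibration step
print(model.predict(features[:1]))                # inferred gaze (x, y)
```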