

Title: Eccentricity effects on blur and depth perception
Foveation and (de)focus are two important visual factors in the design of near-eye displays. Foveation can reduce computational load by lowering display detail toward the visual periphery, while focus cues can reduce the vergence-accommodation conflict, lessening visual discomfort when using near-eye displays. We performed two psychophysical experiments to investigate the relationship between foveation and focus cues. The first study measured blur discrimination sensitivity as a function of visual eccentricity, where we found discrimination thresholds significantly lower than previously reported. The second study measured depth discrimination thresholds, where we found a clear dependency on visual eccentricity. We discuss the study results and suggest further investigation.
Award ID(s): 1650499
PAR ID: 10218809
Author(s) / Creator(s):
Date Published:
Journal Name: Optics Express
Volume: 28
Issue: 5
ISSN: 1094-4087
Page Range / eLocation ID: 6734-6739
Format(s): Medium: X
Sponsoring Org: National Science Foundation
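The abstract above reports blur and depth discrimination thresholds but does not spell out the psychophysical procedure. As an illustration of how such thresholds are commonly estimated, here is a minimal 2-down/1-up adaptive staircase sketch in Python; the simulated observer, starting level, step size, and reversal count are all hypothetical, not values from the paper.

```python
import random

def two_down_one_up(simulated_threshold, start=1.0, step=0.1,
                    n_reversals=8, seed=0):
    """Estimate a discrimination threshold with a 2-down/1-up staircase.

    `simulated_threshold` stands in for a real observer: stimulus levels
    above it tend to yield 'correct' responses. Two correct responses in
    a row make the task harder (level down); one error makes it easier
    (level up). The staircase converges near the 70.7%-correct point,
    and the threshold estimate is the mean of the last few reversals.
    """
    rng = random.Random(seed)
    level, correct_streak, direction = start, 0, 0
    reversals = []
    while len(reversals) < n_reversals:
        # Hypothetical noisy observer: compare level against threshold
        correct = level + rng.gauss(0, 0.05) > simulated_threshold
        if correct:
            correct_streak += 1
            if correct_streak == 2:      # two correct in a row -> harder
                correct_streak = 0
                if direction == +1:      # descent after ascent: reversal
                    reversals.append(level)
                direction = -1
                level = max(level - step, 0.0)
        else:                            # one error -> easier
            correct_streak = 0
            if direction == -1:          # ascent after descent: reversal
                reversals.append(level)
            direction = +1
            level += step
    return sum(reversals[-4:]) / 4
```

Running the same staircase at several simulated eccentricities (each with its own threshold) would reproduce the kind of threshold-vs-eccentricity curve the study measures.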
More Like this
  1. Observers routinely make errors in almost any visual search task. In previous online experiments, we found that indiscriminately highlighting all item positions in a noisy search display reduces errors. In the present paper, we conducted two eye-tracking studies to investigate the mechanics of this error reduction: does cueing direct attention to previously overlooked regions, or does it enhance attention/processing at cued locations? Displays were presented twice. In Experiment 1, for half of the displays the cue was only presented on the first copy (Cue-noCue), and for the other half only on the second copy (noCue-Cue). Cueing successfully reduced errors but did not significantly affect RTs. This contrasts with the online experiment, where the cue increased RTs while reducing errors. In Experiment 2, we replicated the design of the online experiment by splitting the displays into noCue-noCue and noCue-Cue pairs. We now found that the cue reduced errors but increased RTs on trials with high-contrast targets. The eye-tracking data show that participants fixated closer to items and that fixation durations were shorter in cued displays. The smaller fixation-item distance reduced search errors (trials on which observers never fixated the target) for low-contrast targets, and the remaining low-contrast errors appeared to be recognition errors, in which observers looked at the target but quickly looked away. Taken together, these results suggest that errors were reduced because the cues directed attention to overlooked regions rather than enhancing processing at the cued areas.
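The error taxonomy in abstract 1 (search errors, where the target is never fixated, versus recognition errors, where it is fixated but still missed) can be sketched as a simple classifier over fixation data. The data layout and the 50-pixel radius below are assumptions for illustration, not parameters from the study.

```python
from math import hypot

def classify_miss(fixations, target_xy, radius_px=50):
    """Classify a missed-target trial by whether the target was fixated.

    fixations: list of (x, y, duration_ms) tuples in screen pixels.
    If no fixation landed within `radius_px` of the target, the miss is
    a 'search error' (the target was never inspected); otherwise it is
    a 'recognition error' (looked at but not identified). The radius is
    an illustrative cutoff, not a value from the study.
    """
    on_target = [
        (x, y, dur) for x, y, dur in fixations
        if hypot(x - target_xy[0], y - target_xy[1]) <= radius_px
    ]
    if not on_target:
        return "search error"
    # Brief on-target glances match the "looked at the target but
    # quickly looked away" recognition failures described above.
    return "recognition error"
```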
  2. Blascheck, Tanja; Bradshaw, Jessica; Vrzakova, Hana (Eds.)
    Virtual Reality (VR) technology has advanced to include eye tracking, allowing novel research, such as investigating how our visual system coordinates eye movements with changes in perceptual depth. The purpose of this study was to examine whether eye tracking could track perceptual depth changes during a visual discrimination task. We derived two depth-dependent variables from eye-tracker data: eye vergence angle (EVA) and interpupillary distance (IPD). As hypothesized, our results revealed that shifting gaze from near to far depth significantly decreased EVA and increased IPD, while the opposite pattern was observed when shifting from far to near. Importantly, the amount of change in these variables tracked closely with relative changes in perceptual depth, supporting the hypothesis that eye-tracker data may be used to infer real-time changes in perceptual depth in VR. Our method could be used as a new tool to adaptively render information based on depth and improve the VR user experience.
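The two depth-dependent variables in abstract 2 can be computed from binocular eye-tracker output roughly as follows. The assumed data layout (a unit 3D gaze-direction vector and a 3D pupil-center position per eye) is illustrative, not the authors' actual pipeline.

```python
import math

def eye_vergence_angle(gaze_left, gaze_right):
    """Angle in degrees between the two eyes' unit gaze-direction vectors.

    Gaze converging on a near point yields a larger angle; shifting to a
    far point drives the vectors toward parallel, shrinking the angle —
    the near-to-far EVA decrease reported in the abstract.
    """
    dot = sum(l * r for l, r in zip(gaze_left, gaze_right))
    dot = max(-1.0, min(1.0, dot))  # clamp for numeric safety
    return math.degrees(math.acos(dot))

def interpupillary_distance(pupil_left, pupil_right):
    """Euclidean distance between the two 3D pupil-center positions."""
    return math.dist(pupil_left, pupil_right)
```

For example, two eyes each rotated 5 degrees inward toward a near fixation point give an EVA of 10 degrees, while parallel gaze vectors (fixation at optical infinity) give an EVA of 0.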
  3. Understanding how individuals focus and perform visual searches during collaborative tasks can help improve user engagement. Eye-tracking measures provide informative cues for such understanding. This article presents A-DisETrac, an advanced analytic dashboard for distributed eye tracking. It uses off-the-shelf eye trackers to monitor multiple users in parallel, computes both traditional and advanced gaze measures in real time, and displays them on an interactive dashboard. Using two pilot studies, the system was evaluated in terms of user experience and utility, and compared with existing work. Moreover, the system was used to study how advanced gaze measures such as the ambient-focal coefficient K and the real-time index of pupillary activity relate to collaborative behavior. It was observed that the time a group took to complete a puzzle was related to its ambient visual scanning behavior, as quantified by coefficient K: groups that took more time showed more scanning behavior. User experience questionnaire results suggest that the dashboard provides a comparatively good user experience.
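The ambient-focal coefficient K mentioned in abstract 3 is commonly defined (following Krejtz et al.) per fixation as the z-scored fixation duration minus the z-scored amplitude of the saccade that follows it, with z-scores taken over the whole trial; averaging within a time window gives K for that window. A minimal sketch with hypothetical input data:

```python
from statistics import mean, pstdev

def coefficient_k_series(fix_durations, saccade_amplitudes):
    """Per-fixation ambient-focal coefficients K_i.

    fix_durations: duration of each fixation that is followed by a
    saccade; saccade_amplitudes: amplitude of that following saccade
    (same length, paired elementwise). K_i > 0 indicates focal viewing
    (long fixation, short following saccade); K_i < 0 indicates ambient
    scanning (short fixation, long saccade). By construction the mean
    over the whole trial is near zero, so K is read as a time-varying
    signal or averaged over windows of interest.
    """
    mu_d, sd_d = mean(fix_durations), pstdev(fix_durations)
    mu_a, sd_a = mean(saccade_amplitudes), pstdev(saccade_amplitudes)
    return [
        (d - mu_d) / sd_d - (a - mu_a) / sd_a
        for d, a in zip(fix_durations, saccade_amplitudes)
    ]
```

A run of negative K_i values marks the kind of ambient scanning episode the study links to longer puzzle-completion times.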
  4. Drifting student attention is a common problem in educational environments. We demonstrate eight attention-restoring visual cues, displayed when eye tracking detects that a student's attention has shifted away from critical objects. These cues include novel aspects and variations of standard cues that performed well in prior work on visual guidance. Our cues are integrated into an offshore oil-rig training system. While students participate in the training, we can compare the cues in terms of performance and student preference, while also observing the impact of eye tracking. We demonstrate experiment software with which users can compare the various cues and tune selected parameters for visual quality and effectiveness.
  5. Holographic displays promise to deliver unprecedented display capabilities in augmented reality applications, featuring a wide field of view, a wide color gamut, high spatial resolution, and depth cues, all in a compact form factor. While emerging holographic display approaches have been successful in achieving large étendue and high image quality as seen by a camera, the large étendue also reveals a problem that makes existing displays impractical: the sampling of the holographic field by the eye pupil. Existing methods have not investigated this issue due to the lack of displays with large enough étendue, and, as such, they suffer from severe artifacts with varying eye pupil size and location. We show that the holographic field as sampled by the eye pupil is highly varying for existing display setups, and we propose pupil-aware holography, which maximizes perceptual image quality irrespective of the size, location, and orientation of the eye pupil in a near-eye holographic display. We validate the proposed approach both in simulation and on a prototype holographic display, and show that our method eliminates severe artifacts and significantly outperforms existing approaches.