Observers routinely make errors in almost any visual search task. In previous online experiments, we found that indiscriminately highlighting all item positions in a noisy search display reduces errors. In the present paper, we conducted two eye-tracking studies to investigate the mechanics of this error reduction: does cueing direct attention to previously overlooked regions, or does it enhance attention and processing at cued locations? Displays were presented twice. In Experiment 1, for half of the displays the cue was presented only on the first copy (Cue-noCue), and for the other half only on the second copy (noCue-Cue). Cueing successfully reduced errors but did not significantly affect RTs. This contrasts with the online experiment, where the cue increased RTs while reducing errors. In Experiment 2, we replicated the design of the online experiment by splitting the displays into noCue-noCue and noCue-Cue pairs. We now found that the cue reduced errors but increased RTs on trials with high-contrast targets. The eye-tracking data show that participants fixated closer to items and that fixation durations were shorter in cued displays. The smaller fixation-item distance reduced search errors (trials on which observers never fixated the target) for low-contrast targets, and the remaining low-contrast errors appeared to be recognition errors, in which observers looked at the target but quickly looked away. Taken together, these results suggest that errors were reduced because the cues directed attention more effectively to overlooked regions rather than enhancing processing at cued locations.
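As an illustration of the error classification described above (search errors vs. recognition errors), here is a minimal Python sketch for analyzing a miss trial from fixation data. The distance and dwell-time thresholds (`fixation_radius_deg`, `recognition_dwell_ms`) are hypothetical placeholders for illustration, not values reported in the paper.

```python
import numpy as np

def classify_miss(fixations, target_xy, fixation_radius_deg=1.5,
                  recognition_dwell_ms=250):
    """Classify a miss trial as a 'search' or 'recognition' error.

    fixations: array of (x_deg, y_deg, duration_ms) rows, one per fixation.
    target_xy: (x_deg, y_deg) position of the missed target.
    The two thresholds are illustrative assumptions, not the paper's values.
    """
    fixations = np.asarray(fixations, dtype=float)
    dists = np.hypot(fixations[:, 0] - target_xy[0],
                     fixations[:, 1] - target_xy[1])
    on_target = dists <= fixation_radius_deg

    if not on_target.any():
        return "search error"        # target was never fixated
    if fixations[on_target, 2].max() < recognition_dwell_ms:
        return "recognition error"   # target fixated, but only briefly
    return "other"                   # fixated long enough, yet still missed
```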
Eccentricity effects on blur and depth perception
Foveation and (de)focus are two important visual factors in designing near-eye displays. Foveation can reduce computational load by lowering display detail towards the visual periphery, while focal cues can reduce vergence-accommodation conflict, thereby lessening visual discomfort when using near-eye displays. We performed two psychophysical experiments to investigate the relationship between foveation and focus cues. The first study measured blur discrimination sensitivity as a function of visual eccentricity, where we found discrimination thresholds significantly lower than previously reported. The second study measured depth discrimination thresholds, where we found a clear dependency on visual eccentricity. We discuss the study results and suggest further investigation.
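The abstract does not spell out the threshold-estimation procedure, but a common approach in such psychophysical studies is to fit a psychometric function to the discrimination data at each eccentricity and read off the threshold at a criterion performance level. The Python sketch below assumes a two-alternative forced-choice design, a cumulative-Gaussian fit, and a 75%-correct criterion; all of these are illustrative assumptions, not details taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def fit_threshold(blur_increments, prop_correct, criterion=0.75):
    """Fit a cumulative-Gaussian psychometric function to 2AFC
    discrimination data and return the increment at the criterion level.

    blur_increments: tested blur increments (e.g., in diopters)
    prop_correct: proportion correct at each increment
    """
    def pf(x, mu, sigma):
        # guess rate 0.5 for a 2AFC task; no lapse term in this sketch
        return 0.5 + 0.5 * norm.cdf(x, loc=mu, scale=sigma)

    (mu, sigma), _ = curve_fit(pf, blur_increments, prop_correct,
                               p0=[np.median(blur_increments), 1.0])
    # invert the fitted function at the criterion performance level
    return mu + sigma * norm.ppf((criterion - 0.5) / 0.5)
```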
- Award ID(s):
- 1650499
- PAR ID:
- 10135676
- Publisher / Repository:
- Optical Society of America
- Date Published:
- Journal Name:
- Optics Express
- Volume:
- 28
- Issue:
- 5
- ISSN:
- 1094-4087; OPEXFF
- Format(s):
- Medium: X
- Size(s):
- Article No. 6734
- Sponsoring Org:
- National Science Foundation
More Like this
-
Blascheck, Tanja; Bradshaw, Jessica; Vrzakova, Hana (Eds.) Virtual Reality (VR) technology has advanced to include eye tracking, allowing novel research, such as investigating how our visual system coordinates eye movements with changes in perceptual depth. The purpose of this study was to examine whether eye tracking could track perceptual depth changes during a visual discrimination task. We derived two depth-dependent variables from eye tracker data: eye vergence angle (EVA) and interpupillary distance (IPD); a sketch of how such variables might be derived from binocular gaze data appears after this list. As hypothesized, our results revealed that shifting gaze from near to far depth significantly decreased EVA and increased IPD, while the opposite pattern was observed when shifting from far to near. Importantly, the amount of change in these variables tracked closely with relative changes in perceptual depth, and supported the hypothesis that eye tracker data may be used to infer real-time changes in perceptual depth in VR. Our method could be used as a new tool to adaptively render information based on depth and improve the VR user experience.
-
Understanding how individuals focus and perform visual searches during collaborative tasks can help improve user engagement. Eye tracking measures provide informative cues for such understanding. This article presents A-DisETrac, an advanced analytic dashboard for distributed eye tracking. It uses off-the-shelf eye trackers to monitor multiple users in parallel, compute both traditional and advanced gaze measures in real time, and display them on an interactive dashboard. Using two pilot studies, the system was evaluated in terms of user experience and utility, and compared with existing work. Moreover, the system was used to study how advanced gaze measures such as the ambient-focal coefficient K and the real-time index of pupillary activity relate to collaborative behavior (a sketch of the standard coefficient K computation appears after this list). It was observed that the time a group takes to complete a puzzle is related to its ambient visual scanning behavior, as quantified by coefficient K: groups that spent more time showed more scanning behavior. User experience questionnaire results suggest that the dashboard provides a comparatively good user experience.
-
Recent studies have documented substantial variability among typical listeners in how gradiently they categorize speech sounds, and this variability in categorization gradience may link to how listeners weight different cues in the incoming signal. The present study tested the relationship between categorization gradience and cue weighting across two sets of English contrasts, each varying orthogonally in two acoustic dimensions. Participants performed a four‐alternative forced‐choice identification task in a visual world paradigm while their eye movements were monitored. We found that (a) greater categorization gradience derived from behavioral identification responses corresponds to larger secondary cue weights derived from eye movements; (b) the relationship between categorization gradience and secondary cue weighting is observed across cues and contrasts, suggesting that categorization gradience may be a consistent within‐individual property in speech perception; and (c) listeners who showed greater categorization gradience tend to adopt a buffered processing strategy, especially when cues arrive asynchronously in time.
-
Drifting student attention is a common problem in educational environments. We demonstrate 8 attention-restoring visual cues for display when eye tracking detects that student attention shifts away from critical objects. These cues include novel aspects and variations of standard cues that performed well in prior work on visual guidance. Our cues are integrated into an offshore training system on an oil rig. While students participate in training on the oil rig, we can compare our various cues in terms of performance and student preference, while also observing the impact of eye tracking. We demonstrate experiment software with which users can compare various cues and tune selected parameters for visual quality and effectiveness.
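For the VR depth study listed above, eye vergence angle and interpupillary distance could be derived from binocular eye-tracker output roughly as follows. This is a minimal Python sketch under assumed data conventions (a 3D gaze direction vector and a 3D pupil position per eye); it is not the study's actual analysis code.

```python
import numpy as np

def eye_vergence_angle(left_dir, right_dir):
    """Angle (degrees) between left- and right-eye gaze direction vectors.

    Gaze directions are 3D vectors as reported by a VR eye tracker;
    larger vergence angles correspond to nearer fixation depths.
    """
    l = np.asarray(left_dir, float)
    r = np.asarray(right_dir, float)
    cos_a = np.dot(l, r) / (np.linalg.norm(l) * np.linalg.norm(r))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

def interpupillary_distance(left_pupil_pos, right_pupil_pos):
    """Euclidean distance between the two pupil positions (input units)."""
    return float(np.linalg.norm(np.asarray(left_pupil_pos, float)
                                - np.asarray(right_pupil_pos, float)))
```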
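The ambient-focal coefficient K mentioned in the A-DisETrac summary is commonly computed as the mean difference between z-scored fixation durations and the z-scored amplitudes of the saccades that follow them (Krejtz et al.). The sketch below follows that standard definition; it is an assumption-laden illustration, not code taken from the A-DisETrac system.

```python
import numpy as np

def coefficient_k(fix_durations, saccade_amplitudes):
    """Ambient-focal coefficient K: mean difference between the z-scored
    duration of fixation i and the z-scored amplitude of saccade i+1.
    K > 0 suggests focal viewing, K < 0 suggests ambient scanning.

    fix_durations: durations of fixations 1..n
    saccade_amplitudes: amplitudes of the saccades following fixations 1..n-1
    """
    d = np.asarray(fix_durations, float)[:-1]      # pair fixation i with saccade i+1
    a = np.asarray(saccade_amplitudes, float)[:len(d)]
    zd = (d - d.mean()) / d.std(ddof=1)
    za = (a - a.mean()) / a.std(ddof=1)
    return float(np.mean(zd - za))
```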
