Title: Using Microsaccades to Estimate Task Difficulty During Visual Search of Layered Surfaces
We develop an approach that uses microsaccade dynamics to estimate the task difficulty (cognitive load) imposed by visual search of a layered surface. Previous studies provide converging evidence that task difficulty and cognitive load can influence microsaccade activity, and we corroborate this notion. Specifically, we explore this relationship during visual search for features embedded in a terrain-like surface, with the eyes allowed to move freely during the task. We make two relevant contributions. First, we validate an approach to distinguishing between the ambient and focal phases of visual search. We show that this spectrum of visual behavior can be quantified by a single previously reported estimator, known as Krejtz's K coefficient. Second, we use ambient/focal segments based on K as a moderating factor for microsaccade analysis in response to task difficulty. We find that during the focal phase of visual search (a) microsaccade magnitude increases significantly, and (b) microsaccade rate decreases significantly, with increased task difficulty. We conclude that the combined use of K and microsaccade analysis may help build effective tools that indicate the level of cognitive activity within a task while the task is being performed.
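The K coefficient referenced in the abstract contrasts each fixation's duration with the amplitude of the saccade that follows it; positive values indicate focal viewing and negative values indicate ambient viewing. The sketch below illustrates the published formula, assuming fixation durations and follow-on saccade amplitudes have already been extracted from the eye-tracking record; it is not the authors' analysis code.

```python
import numpy as np

def coefficient_k(fixation_durations, saccade_amplitudes):
    """Krejtz's K coefficient: mean difference between the z-scored
    duration of each fixation and the z-scored amplitude of the saccade
    that follows it.  K > 0 suggests focal viewing; K < 0 suggests
    ambient viewing.

    fixation_durations: durations of fixations (e.g., in ms)
    saccade_amplitudes: amplitudes of the saccades following each
        fixation (e.g., in degrees); one-to-one pairing is assumed.
    """
    d = np.asarray(fixation_durations, dtype=float)
    a = np.asarray(saccade_amplitudes, dtype=float)
    n = min(len(d), len(a))          # pair each fixation with the next saccade
    d, a = d[:n], a[:n]
    zd = (d - d.mean()) / d.std(ddof=1)
    za = (a - a.mean()) / a.std(ddof=1)
    return float(np.mean(zd - za))
```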
Award ID(s): 1748380
PAR ID: 10096512
Author(s) / Creator(s):
Date Published:
Journal Name: IEEE Transactions on Visualization and Computer Graphics
Volume: 14
Issue: 8
ISSN: 1077-2626
Page Range / eLocation ID: 1 to 1
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1.
    Changes in task demands can have delayed adverse impacts on performance. This phenomenon, known as the workload history effect, is especially of concern in dynamic work domains where operators manage fluctuating task demands. The existing workload history literature does not paint a consistent picture of how these effects manifest, prompting research to consider measures that are informative about the operator's process. One promising measure is visual attention patterns, due to their informativeness about various cognitive processes. To explore their ability to explain workload history effects, participants completed a task in an unmanned aerial vehicle command and control testbed in which workload transitioned both gradually and suddenly. The participants' performance and visual attention patterns were studied over time to identify workload history effects. The eye-tracking analysis used a recently developed eye-tracking metric called coefficient K, which indicates whether visual attention is more focal or ambient. The performance results showed workload history effects, but these depended on the workload level, the time elapsed, and the performance measure. The eye-tracking analysis suggested that performance suffered when focal attention was deployed during low workload, which was an unexpected finding. Taken together, these results suggest that unexpected visual attention patterns can impact performance both immediately and over time. Further research is needed; however, this work shows the value of including a real-time visual attention measure, such as coefficient K, as a means to understand how the operator manages varying task demands in complex work environments.
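To track how attention shifts between ambient and focal modes as workload changes, coefficient K is typically computed over a sliding window of consecutive fixations. A brief sketch under assumed window and step sizes follows; it is an illustration, not the study's analysis code.

```python
import numpy as np

def windowed_k(fix_durations, sacc_amplitudes, window=20, step=5):
    """Coefficient K in sliding windows of consecutive fixations, so
    shifts between ambient (K < 0) and focal (K > 0) attention can be
    tracked over time.  Window and step sizes are illustrative choices;
    z-scoring here uses the whole recording's mean and SD."""
    d = np.asarray(fix_durations, dtype=float)
    a = np.asarray(sacc_amplitudes, dtype=float)
    n = min(len(d), len(a))
    zd = (d[:n] - d[:n].mean()) / d[:n].std(ddof=1)
    za = (a[:n] - a[:n].mean()) / a[:n].std(ddof=1)
    k_i = zd - za                     # per-fixation contribution to K
    return [float(k_i[s:s + window].mean())
            for s in range(0, n - window + 1, step)]
```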
  2. Abstract Unconscious neural activity has been shown to precede both motor and cognitive acts. In the present study, we investigated the neural antecedents of overt attention during visual search, where subjects make voluntary saccadic eye movements to search a cluttered stimulus array for a target item. Building on studies of both overt self-generated motor actions (Lau et al., 2004; Soon et al., 2008) and self-generated cognitive actions (Bengson et al., 2014; Soon et al., 2013), we hypothesized that brain activity prior to the onset of a search array would predict the direction of the first saccade during unguided visual search. Because spatial attention and gaze are coordinated during visual search, cognitive and motor actions are coupled in this task. A well-established finding in fMRI studies of willed action is that neural antecedents of the intention to make a motor act (e.g., reaching) can be identified seconds before the action occurs. Studies of the volitional control of covert spatial attention in EEG have shown that predictive brain activity is limited to only a few hundred milliseconds before a voluntary shift of covert spatial attention. In the present study, the visual search task and stimuli were designed so that subjects could not predict the onset of the search array. Perceptual task difficulty was high, such that subjects could not locate the target using covert attention alone, thus requiring overt shifts of attention (saccades) to carry out the visual search. If the first saccade following array onset in unguided visual search shares mechanisms with willed shifts of covert attention, we expected predictive EEG alpha-band activity (8-12 Hz) immediately prior to the array onset (within 1 s) (Bengson et al., 2014; Nadra et al., 2023). Alternatively, if it follows the principles of willed motor actions, predictive neural signals should be reflected in broadband EEG activity (Libet et al., 1983) and would likely emerge earlier (Soon et al., 2008). Applying support vector machine decoding, we found that the direction of the first saccade in an unguided visual search could be predicted up to two seconds preceding the search array's onset in the broadband but not the alpha-band EEG. These findings suggest that self-directed eye movements in visual search emerge from early preparatory neural activity more akin to willed motor actions than to covert willed attention. This highlights a distinct role for unconscious neural dynamics in shaping visual search behavior.
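The decoding analysis described here (linear SVM classification of pre-array EEG, compared between broadband and alpha-band signals) can be sketched with standard Python tooling. The sampling rate, filter order, epoch layout, and cross-validation scheme below are assumptions for illustration, not the study's pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FS = 250  # assumed EEG sampling rate (Hz)

def bandpass(epochs, low, high, fs=FS):
    """Zero-phase Butterworth band-pass along the time axis of
    (trials, channels, samples) EEG epochs."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, epochs, axis=-1)

def decode_saccade_direction(epochs, labels, band=None):
    """Cross-validated linear SVM decoding of first-saccade direction
    (e.g., left vs. right) from pre-array EEG epochs.
    band=(8, 12) restricts the signal to alpha; band=None keeps the
    broadband signal."""
    x = bandpass(epochs, *band) if band else epochs
    x = x.reshape(len(x), -1)                 # flatten channels x time
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    return cross_val_score(clf, x, labels, cv=5).mean()
```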
  3. A large portion of the cost of any software lies in the time developers spend understanding a program's source code before any changes can be undertaken. Measuring program comprehension is not a trivial task. In fact, different studies use self-reported and various psycho-physiological measures as proxies. In this research, we propose a methodology using functional Near Infrared Spectroscopy (fNIRS) and eye-tracking devices as an objective measure of program comprehension that allows researchers to conduct studies in environments close to real-world settings, at the identifier level of granularity. We validate our methodology and apply it to study the impact of lexical, structural, and readability issues on developers' cognitive load during bug localization tasks. Our study involves 25 undergraduate and graduate students and 21 metrics. Results show that the existence of lexical inconsistencies in the source code significantly increases the cognitive load experienced by participants, not only on identifiers involved in the inconsistencies but also throughout the entire code snippet. We did not find statistical evidence that structural inconsistencies increase the average cognitive load that participants experience; however, both types of inconsistencies result in lower performance in terms of time and success rate. Finally, we observe that self-reported task difficulty, cognitive load, and fixation duration do not correlate and appear to be measuring different aspects of task difficulty.
  4. A novel eye-tracked measure of pupil diameter oscillation is derived as an indicator of cognitive load. The new metric, termed the Low/High Index of Pupillary Activity (LHIPA), is able to discriminate cognitive load (vis-à-vis task difficulty) in several experiments where the Index of Pupillary Activity (IPA) fails to do so. The rationale for the LHIPA is tied to the functioning of the human autonomic nervous system, yielding a hybrid measure based on the ratio of low/high frequencies of pupil oscillation. The paper's contribution is twofold. First, full documentation is provided for the calculation of the LHIPA. As with the IPA, researchers can apply this metric to their own experiments where a measure of cognitive load is of interest. Second, the robustness of the LHIPA is shown in analysis of three experiments: a restrictive fixed-gaze number counting task, a less restrictive fixed-gaze n-back task, and an applied eye-typing task.
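The published LHIPA is computed from a wavelet decomposition of the pupil diameter signal; as a rough illustration of the underlying idea, namely a ratio of low- to high-frequency pupil oscillation, the sketch below uses a simple spectral proxy. The cutoff frequencies are arbitrary assumptions, and this is not the paper's algorithm.

```python
import numpy as np

def lf_hf_pupil_ratio(pupil_diameter, fs, split_hz=0.5, lo=0.05, hi=2.0):
    """Simplified low/high frequency power ratio of pupil-diameter
    oscillation.

    NOTE: this is an illustrative spectral proxy, not the published
    LHIPA (which is wavelet-based); the band boundaries are assumed
    values chosen for illustration only.
    """
    x = np.asarray(pupil_diameter, dtype=float)
    x = x - x.mean()                              # remove DC component
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    lf = power[(freqs >= lo) & (freqs < split_hz)].sum()
    hf = power[(freqs >= split_hz) & (freqs <= hi)].sum()
    return lf / hf if hf > 0 else np.inf
```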
  5. Abstract The enhancement hypothesis suggests that deaf individuals are more vigilant to visual emotional cues than hearing individuals. The present eye-tracking study examined ambient–focal visual attention when encoding affect from dynamically changing emotional facial expressions. Deaf (n = 17) and hearing (n = 17) individuals watched emotional facial expressions that in 10-s animations morphed from a neutral expression to one of happiness, sadness, or anger. The task was to recognize emotion as quickly as possible. Deaf participants tended to be faster than hearing participants in affect recognition, but the groups did not differ in accuracy. In general, happy faces were more accurately and more quickly recognized than faces expressing anger or sadness. Both groups demonstrated longer average fixation duration when recognizing happiness in comparison to anger and sadness. Deaf individuals directed their first fixations less often to the mouth region than the hearing group. During the last stages of emotion recognition, deaf participants exhibited more focal viewing of happy faces than negative faces. This pattern was not observed among hearing individuals. The analysis of visual gaze dynamics, switching between ambient and focal attention, was useful in studying the depth of cognitive processing of emotional information among deaf and hearing individuals. 