Abstract Real-world work environments require operators to perform multiple tasks with continual support from an automated system. Eye movement is often used as a surrogate measure of operator attention, yet conventional summary measures such as percent dwell time do not capture dynamic transitions of attention in a complex visual workspace. This study analyzed eye movement data collected in a controlled MATB-II task environment using gaze transition entropy analysis. In the study, human subjects performed a compensatory tracking task, a system monitoring task, and a communication task concurrently. The results indicate that both gaze transition entropy and stationary gaze entropy, measures of randomness in eye movements, decreased when the compensatory tracking task required more continuous monitoring. The findings imply that gaze transition entropy consistently reflects the attention allocation of operators performing dynamic operational tasks.
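The two entropy measures named in this abstract have standard definitions over a first-order Markov model of fixation transitions between areas of interest (AOIs). The sketch below computes both from a sequence of AOI labels; the function name and the example labels are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from collections import Counter

def gaze_entropies(aoi_sequence):
    """Stationary and transition gaze entropy (bits) from a sequence of
    fixated AOI labels, modeled as a first-order Markov chain."""
    aois = sorted(set(aoi_sequence))
    idx = {a: i for i, a in enumerate(aois)}
    n = len(aois)

    # Count AOI-to-AOI transitions between consecutive fixations.
    counts = np.zeros((n, n))
    for src, dst in zip(aoi_sequence[:-1], aoi_sequence[1:]):
        counts[idx[src], idx[dst]] += 1

    # Row-normalize to transition probabilities p_ij (zero rows stay zero).
    row_sums = counts.sum(axis=1, keepdims=True)
    p = np.divide(counts, row_sums, out=np.zeros_like(counts),
                  where=row_sums > 0)

    # pi_i approximated by the empirical proportion of fixations per AOI.
    freq = Counter(aoi_sequence)
    pi = np.array([freq[a] for a in aois], dtype=float)
    pi /= pi.sum()

    # Stationary gaze entropy: H_s = -sum_i pi_i log2 pi_i
    h_s = float(-np.sum(pi[pi > 0] * np.log2(pi[pi > 0])))

    # Gaze transition entropy: H_t = -sum_i pi_i sum_j p_ij log2 p_ij
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log2(p), 0.0)
    h_t = float(-np.sum(pi * plogp.sum(axis=1)))
    return h_s, h_t

# Hypothetical fixation stream over tracking (T), monitoring (M), comms (C)
h_s, h_t = gaze_entropies(list("TMTMTCTMTTCM"))
```

Estimating the stationary distribution from empirical fixation proportions, rather than from the eigenvector of the transition matrix, is a common simplification in applied gaze-entropy work.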
Implementing mobile eye tracking in psychological research: A practical guide
Abstract Eye tracking provides direct, temporally and spatially sensitive measures of eye gaze. It can capture visual attention patterns from infancy through adulthood. However, commonly used screen-based eye tracking (SET) paradigms are limited in their depiction of how individuals process information as they interact with the environment in “real life”. Mobile eye tracking (MET) records participant-perspective gaze in the context of active behavior. Recent technological developments in MET hardware enable researchers to capture egocentric vision as early as infancy and across the lifespan. However, challenges remain in MET data collection, processing, and analysis. The present paper aims to provide an introduction and practical guide to starting researchers in the field to facilitate the use of MET in psychological research with a wide range of age groups. First, we provide a general introduction to MET. Next, we briefly review MET studies in adults and children that provide new insights into attention and its roles in cognitive and socioemotional functioning. We then discuss technical issues relating to MET data collection and provide guidelines for data quality inspection, gaze annotations, data visualization, and statistical analyses. Lastly, we conclude by discussing the future directions of MET implementation. Open-source programs for MET data quality inspection, data visualization, and analysis are shared publicly.
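To make the data-quality inspection step concrete, here is a minimal sketch of one common screen: the proportion of valid, confident gaze samples in a recording. The field names and threshold are assumptions for illustration; the paper's own shared open-source programs are the authoritative reference.

```python
import numpy as np

def tracking_ratio(gaze_x, gaze_y, confidence, conf_threshold=0.8):
    """Proportion of samples with a valid, confident gaze estimate --
    a basic quality screen before annotation or analysis."""
    gaze_x = np.asarray(gaze_x, dtype=float)
    gaze_y = np.asarray(gaze_y, dtype=float)
    confidence = np.asarray(confidence, dtype=float)
    valid = (~np.isnan(gaze_x) & ~np.isnan(gaze_y)
             & (confidence >= conf_threshold))
    return float(valid.mean())
```

Recordings falling below a preregistered tracking-ratio cutoff would typically be flagged for manual inspection rather than dropped automatically.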
- Award ID(s): 1941449
- PAR ID: 10533520
- Publisher / Repository: Springer Science + Business Media
- Journal Name: Behavior Research Methods
- Volume: 56
- Issue: 8
- ISSN: 1554-3528
- Pages: 8269-8288
- Sponsoring Org: National Science Foundation
More Like this
-
With improved portability and affordability, eye-tracking devices have facilitated an expanding range of cycling experiments aimed at understanding cycling behavior and potential risks. Given the complexity of cyclists' visual behavior and gaze measurements, we provide a comprehensive review with three key aims: 1) the adoption and interpretation of the various gaze metrics derived from cycling experiments, 2) a summary of those experiments' findings, and 3) the identification of areas for future research. A systematic review of three databases yielded thirty-five articles that met our inclusion criteria. Our results show that cycling experiments with eye tracking allow analysis of the cyclist's viewpoint and of reactions to the built environment, road conditions, navigation behavior, and mental workload and/or stress levels. Our review suggests substantial variation in research objectives and in the consequent selection of eye-tracking devices, experimental designs, and the gaze metrics used and interpreted. A variety of general gaze metrics and Area of Interest (AOI) gaze measurements are applied to infer cyclists' mental workload/stress levels and attention allocation, respectively. The diversity of gaze metrics reported in the literature makes cross-study comparisons difficult. Areas for future research, especially the potential integration with computer vision, are also discussed.
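As one illustration of the AOI-based measurements this review discusses, the sketch below computes percent dwell time per AOI from labeled fixations. The function and its inputs are hypothetical, not taken from any reviewed study.

```python
def aoi_dwell_percentages(fixation_aois, fixation_durations):
    """Percent dwell time per AOI -- a common AOI-based metric used to
    infer a cyclist's attention allocation across scene regions."""
    totals = {}
    for aoi, dur in zip(fixation_aois, fixation_durations):
        totals[aoi] = totals.get(aoi, 0.0) + dur
    grand = sum(totals.values())
    return {aoi: 100.0 * t / grand for aoi, t in totals.items()}

# e.g., fixations on road surface (R), traffic (T), and signage (S), in ms
shares = aoi_dwell_percentages(["R", "T", "R", "S", "R"],
                               [320, 180, 410, 150, 290])
```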
-
Accurate eye tracking is crucial for gaze-dependent research, but calibrating eye trackers in subjects who cannot follow instructions, such as human infants and nonhuman primates, presents a challenge. Traditional calibration methods rely on verbal instructions, which are ineffective for these populations. To address this, researchers often use attention-grabbing stimuli in known locations; however, existing software for video-based calibration is often proprietary and inflexible. We introduce an extension to the open-source toolbox Titta (a software package integrating desktop Tobii eye trackers with PsychToolbox experiments) to facilitate custom video-based calibration. The extension offers a flexible platform for attracting attention, calibrating with flexible point selection, and validating the calibration. The toolbox has been refined through extensive use with chimpanzees, baboons, and macaques, demonstrating its effectiveness across species. Our adaptive calibration and validation procedures provide a standardized method for achieving more accurate gaze tracking across diverse species.
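Validating a calibration typically reduces to measuring the angular offset between known target positions and recorded gaze. The sketch below illustrates that computation under a flat-screen, fixed-viewing-distance assumption; it is independent of Titta's own validation routines, and all parameter names are illustrative.

```python
import numpy as np

def angular_error_deg(target_px, gaze_px, px_per_cm, viewing_cm):
    """Angular offset (degrees of visual angle) between a validation
    target and gaze samples, both in screen pixel coordinates.
    Assumes a flat screen viewed from a fixed distance."""
    # Euclidean offset in pixels, converted to centimeters on screen
    d_px = np.linalg.norm(np.asarray(gaze_px, dtype=float)
                          - np.asarray(target_px, dtype=float), axis=-1)
    d_cm = d_px / px_per_cm
    # Visual angle subtended by the offset at the viewing distance
    return np.degrees(np.arctan2(d_cm, viewing_cm))

# e.g., mean accuracy at one target for a hypothetical desktop setup
# acc = angular_error_deg((960, 540), gaze_samples_px,
#                         px_per_cm=38.0, viewing_cm=65.0).mean()
```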
-
Abstract Humans detect faces efficiently from a young age. Face detection is critical for infants to identify and learn from relevant social stimuli in their environments. Faces with eye contact are an especially salient stimulus, and attention to the eyes in infancy is linked to the emergence of later sociality. Despite the importance of both of these early social skills (attending to faces and attending to the eyes), surprisingly little is known about how they interact. We used eye tracking to explore whether eye contact influences infants' face detection. Longitudinally, we examined 2-, 4-, and 6-month-olds' (N = 65) visual scanning of complex image arrays with human and animal faces varying in eye contact and head orientation. Across all ages, infants displayed superior detection of faces with eye contact; however, this effect varied as a function of species and head orientation. Infants were more attentive to human than animal faces and were more sensitive to eye and head orientation for human faces than for animal faces. Unexpectedly, human faces with both averted heads and averted eyes received the most attention. This pattern may reflect the early emergence of gaze following (the ability to look where another individual looks), which begins to develop around this age. Infants may be especially interested in averted-gaze faces, providing early scaffolding for joint attention. This study represents the first systematic investigation of infants' attention patterns to faces varying in their attentional states. Together, these findings suggest that infants develop early, functionally specialized detection of conspecific faces.
-
In the realm of virtual reality (VR) research, the synergy of methodological advancements, technical innovation, and novel applications is paramount. Our work encapsulates these facets in the context of spatial ability assessments conducted within a VR environment. This paper presents a comprehensive, integrated framework of VR, eye tracking, and electroencephalography (EEG), which combines measurement of participants' behavioral performance with simultaneous collection of time-stamped eye-tracking and EEG data, enabling us to understand how spatial ability is affected under certain conditions and whether such conditions demand increased attention and mental allocation. This framework encompasses the measurement of participants' gaze patterns (e.g., fixations and saccades), EEG data (e.g., Alpha, Beta, Gamma, and Theta wave patterns), and psychometric and behavioral test performance. On the technical front, we utilized the Unity 3D game engine as the core for running our spatial ability tasks by simulating altered conditions of space exploration. We simulated two types of space exploration conditions: (1) a microgravity condition in which participants' idiotropic (body) axis is statically and dynamically misaligned with their visual axis; and (2) a Martian terrain condition that offers a visual frame of reference (FOR) but only limited and unfamiliar landmark objects. We specifically targeted assessing human spatial ability and spatial perception. To assess spatial ability, we digitized the behavioral tests of the Purdue Spatial Visualization Test: Rotations (PSVT:R), the Mental Cutting Test (MCT), and the Perspective Taking Ability (PTA) test and integrated them into the VR settings to evaluate participants' spatial visualization, spatial relations, and spatial orientation ability, respectively. For spatial perception, we applied digitized versions of size and distance perception tests to measure participants' subjective perception of size and distance. A suite of C# scripts orchestrated the VR experience, enabling real-time data collection and synchronization. This technical innovation includes the integration of data streams from diverse sources, such as VIVE controllers, eye-tracking devices, and EEG hardware, to ensure a cohesive and comprehensive dataset. A pivotal challenge in our research was synchronizing data from EEG, eye tracking, and VR tasks to facilitate comprehensive analysis. To address this challenge, we employed the Unity interface of the OpenSync library, a tool designed to unify disparate data sources in the fields of psychology and neuroscience. This approach ensures that all collected measures share a common time reference, enabling meaningful analysis of participant performance, gaze behavior, and EEG activity. The Unity-based system seamlessly incorporates task parameters, participant data, and VIVE controller inputs, providing a versatile platform for conducting assessments in diverse domains. Finally, we were able to collect synchronized measurements of participants' scores on the behavioral tests of spatial ability and spatial perception, their gaze data, and their EEG data. In this paper, we present the whole process of combining the eye-tracking and EEG workflows into the VR settings and collecting the relevant measurements. We believe that our work not only advances the state of the art in spatial ability assessments but also underscores the potential of virtual reality as a versatile tool in cognitive research, therapy, and rehabilitation.
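The synchronization challenge described above reduces to expressing every stream on one time base. The following sketch shows the basic idea, resampling a timestamped stream onto a reference clock by linear interpolation; it is a simplified stand-in for illustration, not the OpenSync API, and the variable names are assumptions.

```python
import numpy as np

def align_to_common_clock(ref_t, stream_t, stream_v):
    """Resample a timestamped data stream onto a reference timeline by
    linear interpolation -- the core idea behind giving EEG, eye-tracking,
    and task-event streams a shared time reference.

    stream_t must be monotonically increasing; ref_t samples outside its
    range are clamped to the stream's endpoint values by np.interp."""
    return np.interp(ref_t, stream_t, stream_v)

# e.g., resample pupil diameter onto the EEG clock before joint analysis
# pupil_on_eeg_clock = align_to_common_clock(eeg_t, pupil_t, pupil_diam)
```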
