Title: An oculomotor signature as a fraud-resistant tool for biometric verification
Instant access to personal data is a double-edged sword that has transformed society: it enhances convenience and interpersonal interaction through social media, while also making us all more vulnerable to identity theft and cybercrime. The need for hack-resistant biometric authentication is greater than ever. Previous studies have demonstrated that eye movements differ between individuals, so the characterization of eye movements might provide a highly secure and convenient approach to personal identification, because eye movements are generated by the owner’s living brain in real time and are therefore extremely difficult for hackers to imitate. To study the potential of eye movements as a biometric tool, we characterized the eye movements of 18 participants. We examined an entire battery of oculomotor behaviors, including the unconscious eye movements that occur during ocular fixation; this resulted in a high-precision oculomotor signature that can identify individuals. We show that one-versus-one machine learning classification, applied with a nearest-neighbor statistic, yielded an accuracy of >99% based on ~25-minute sessions during which participants executed fixations, visual pursuits, free viewing of images, etc. Even when we examined only the ~3 minutes of the fixation task by itself, discrimination accuracy was higher than 96%. When we further split the fixation data randomly into 30-second chunks, we obtained a remarkably high accuracy of 92%. Because eye trackers provide improved spatial and temporal resolution with each new generation, we expect that both the accuracy and the minimum sample duration necessary for reliable oculomotor biometric verification can be further optimized.
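As a rough illustration of the classification scheme described in the abstract, the sketch below runs a one-versus-one nearest-neighbor comparison on synthetic oculomotor feature vectors. The feature set, data shapes, and use of scikit-learn are assumptions for illustration only, not the authors' actual pipeline.

```python
import numpy as np
from itertools import combinations
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical setup: each row of X is a feature vector summarizing one
# recording chunk (e.g., fixational drift velocity, microsaccade rate,
# saccade peak-velocity/amplitude slope); y holds the participant ID.
rng = np.random.default_rng(0)
n_participants, chunks_per_participant, n_features = 18, 6, 12
X = rng.normal(size=(n_participants * chunks_per_participant, n_features))
y = np.repeat(np.arange(n_participants), chunks_per_participant)

def one_versus_one_nn_accuracy(X, y, k=1):
    """Average pairwise (one-versus-one) nearest-neighbor accuracy."""
    scores = []
    for a, b in combinations(np.unique(y), 2):
        mask = np.isin(y, [a, b])           # keep only this pair of participants
        clf = KNeighborsClassifier(n_neighbors=k)
        scores.append(cross_val_score(clf, X[mask], y[mask], cv=3).mean())
    return float(np.mean(scores))

print(f"mean pairwise accuracy: {one_versus_one_nn_accuracy(X, y):.3f}")
```

Averaging cross-validated accuracy over all participant pairs mirrors the one-versus-one setup; shortening the chunks (e.g., to 30 seconds of fixation data) would simply change how each feature vector is computed.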
Award ID(s):
1523614 1734887 0726113
PAR ID:
10430782
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
Society for Neuroscience Annual Meeting 2018
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Visual working memory (VWM) representations interact with attentional guidance, but there is controversy over whether multiple VWM items simultaneously influence attentional guidance. Extant studies relied on continuous variables like response times, which can obscure capture, especially if VWM representations cycle through interactive and non-interactive states. Previous conflicting findings regarding guidance under high working memory (WM) load may be due to the use of noisier response time measures that mix capture and non-capture trials. Thus, we employed an oculomotor paradigm to characterize discrete attentional capture events under both high and low VWM load. Participants held one or two colors in memory, then executed a saccade to a target disk. On some trials, a distractor (sometimes VWM-matching) appeared simultaneously with the target. Eye movements were more frequently directed to a VWM-matching than a non-matching distractor in both load conditions. However, oculomotor capture by a VWM-matching distractor occurred less frequently under high than under low load. These results suggest that attention is automatically guided toward items matching only one of two colors held in memory at a time; items in VWM may thus cycle between attention-guiding and non-guiding states when more than one item is held in memory and the task does not require that multiple items be maintained in an active, attention-guiding state.
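A minimal sketch of the capture-rate measure described in item 1, assuming simulated trial outcomes: the proportion of trials on which the first saccade is directed to the distractor, split by distractor type and VWM load. The probabilities and trial counts below are invented for illustration.

```python
import numpy as np

# Hypothetical oculomotor-capture analysis: each trial records whether the
# first saccade landed on the distractor rather than the target disk.
rng = np.random.default_rng(4)
n_trials = 120

def capture_rate(p_capture):
    """Simulate trials and return the proportion captured by the distractor."""
    return rng.binomial(1, p_capture, n_trials).mean()

rates = {
    ("matching", "low load"): capture_rate(0.35),
    ("matching", "high load"): capture_rate(0.22),
    ("non-matching", "low load"): capture_rate(0.10),
    ("non-matching", "high load"): capture_rate(0.10),
}
for (distractor, load), rate in rates.items():
    print(f"{distractor:>12} distractor, {load}: capture rate = {rate:.2f}")
```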
  2. Ribeiro, Haroldo V. (Ed.)
    Reading is a complex cognitive process that involves primary oculomotor function and high-level activities like attentional focus and language processing. When we read, our eyes are driven by primary physiological functions while responding to language-processing demands. In fact, the eyes perform discontinuous twofold movements, namely, successive long jumps (saccades) interspersed with small steps (fixations) in which the gaze “scans” confined locations. It is only during the fixations that information is effectively captured for brain processing. Since individuals can express similar as well as entirely different opinions about a given text, it is expected that the form, content and style of a text could induce different eye-movement patterns among people. A question that naturally arises is whether these individual behaviours are correlated, so that eye-tracking while reading can be used as a proxy for a text’s subjective properties. Here we perform a set of eye-tracking experiments with a group of individuals reading different types of texts, including children’s stories, randomly generated word sequences and excerpts from literary works. In parallel, an extensive Internet survey was conducted to categorize these texts in terms of their complexity and coherence, considering a large number of respondents spanning different ages, genders and levels of education. The computational analysis of the fixation maps obtained from the subjects’ gaze trajectories for a given text reveals that the average “magnetization” of the fixation configurations correlates strongly with the complexity reported in the survey. Moreover, we perform a thermodynamic analysis using the Pairwise Maximum-Entropy method and find that coherent texts were closer to their corresponding “critical points” than non-coherent ones, suggesting that different texts may induce distinct cohesive reading activities.
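A minimal sketch of the “magnetization” statistic mentioned in item 2, under assumed data: each reader's fixation map is reduced to Ising-like ±1 variables over text regions, and the mean spin is the magnetization. The grid size, threshold, and simulated counts are assumptions; the pairwise maximum-entropy fit itself is not reproduced here.

```python
import numpy as np

# Hypothetical fixation data: region i of a text gets spin s_i = +1 if it
# received at least one fixation from a given reader, and -1 otherwise.
rng = np.random.default_rng(1)
n_readers, n_regions = 30, 100
fixation_counts = rng.poisson(lam=0.8, size=(n_readers, n_regions))

spins = np.where(fixation_counts > 0, 1, -1)       # Ising-like variables
magnetization_per_reader = spins.mean(axis=1)      # m = (1/N) * sum_i s_i
average_magnetization = magnetization_per_reader.mean()

print(f"average magnetization across readers: {average_magnetization:.3f}")
```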
  3. Background: Upper limb proprioceptive impairments are common after stroke and affect daily function. Recent work has shown that stroke survivors have difficulty using visual information to improve proprioception. It is unclear how eye movements are impacted to guide action of the arm after stroke. Here, we aimed to understand how upper limb proprioceptive impairments impact eye movements in individuals with stroke. Methods: Control (N = 20) and stroke participants (N = 20) performed a proprioceptive matching task with upper limb and eye movements. A KINARM exoskeleton with eye tracking was used to assess limb and eye kinematics. The upper limb was passively moved by the robot and participants matched the location with either an arm or an eye movement. Accuracy was measured as the difference between the passive robot movement location and the active limb match (Hand End-Point Error) or the active eye movement match (Eye End-Point Error). Results: We found that individuals with stroke had significantly larger Hand (2.1×) and Eye (1.5×) End-Point Errors compared with controls. Further, we found that proprioceptive errors of the hand and eye were highly correlated in stroke participants (r = .67, P = .001), a relationship not observed for controls. Conclusions: Eye movement accuracy declined as a function of proprioceptive impairment of the more-affected limb, which was used as a proprioceptive reference. The inability to use proprioceptive information of the arm to coordinate eye movements suggests that disordered proprioception impacts the integration of sensory information across modalities. These results have important implications for how vision is used to actively guide limb movement during rehabilitation.
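A minimal sketch of the end-point error analysis described in item 3, on simulated data: errors are Euclidean distances between the passively presented location and the active hand or eye match, then correlated across participants. Positions, noise levels, and trial counts are assumptions for illustration.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical matching task: the robot moves the limb to a target position;
# the participant matches it with either a hand movement or an eye movement.
# Positions are 2D (x, y) in centimetres.
rng = np.random.default_rng(2)
n_participants, n_trials = 20, 24
target = rng.uniform(-10, 10, size=(n_participants, n_trials, 2))
hand_match = target + rng.normal(scale=2.0, size=target.shape)
eye_match = target + rng.normal(scale=1.5, size=target.shape)

# End-point error: distance between passive target location and active match,
# averaged over trials for each participant.
hand_error = np.linalg.norm(hand_match - target, axis=-1).mean(axis=1)
eye_error = np.linalg.norm(eye_match - target, axis=-1).mean(axis=1)

r, p = pearsonr(hand_error, eye_error)
print(f"hand vs. eye end-point error correlation: r = {r:.2f}, p = {p:.3f}")
```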
  4.
    Metric learning is a valuable technique for enabling the ongoing enrollment of new users within biometric systems. While this approach has been heavily employed for other biometric modalities such as facial recognition, applications to eye movements have only recently been explored. This manuscript further investigates the application of metric learning to eye movement biometrics. Three multilayer perceptron networks are trained to embed feature vectors describing three classes of eye movements: fixations, saccades, and post-saccadic oscillations. The networks are validated on a dataset containing eye movement traces of 269 subjects recorded during a reading task. The proposed algorithm is benchmarked against a previously introduced statistical biometric approach. Although mean equal error rate (EER) was higher than with the benchmark method, the proposed technique demonstrated lower dispersion in EER across the four test folds considered herein.
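A minimal sketch of one embedding network of the kind described in item 4, assuming PyTorch and arbitrary layer sizes: a multilayer perceptron maps a per-event feature vector to a normalized embedding suitable for distance-based (metric-learning) comparison. The feature count, layer widths, and embedding dimension are illustrative, not taken from the paper, which trains one such network per event class.

```python
import torch
import torch.nn as nn

class EventEmbeddingMLP(nn.Module):
    """Embeds a feature vector describing one eye-movement event."""
    def __init__(self, n_features: int = 32, embedding_dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, embedding_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # L2-normalize so embeddings can be compared with cosine or Euclidean
        # distance, as is common in metric-learning setups.
        return nn.functional.normalize(self.net(x), dim=-1)

model = EventEmbeddingMLP()
example = torch.randn(8, 32)   # 8 hypothetical fixation feature vectors
print(model(example).shape)    # torch.Size([8, 16])
```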
  5. Almost 400 years ago, Rubens copied Titian's The Fall of Man, albeit with important changes. Rubens altered Titian's original composition in numerous ways, including by changing the gaze directions of the depicted characters and adding a striking red parrot to the painting. Here, we quantify the impact of Rubens's choices on the viewer's gaze behavior. We displayed digital copies of Rubens's and Titian's artworks—as well as a version of Rubens's painting with the parrot digitally removed—on a computer screen while recording the eye movements produced by observers during free visual exploration of each image. To assess the effects of Rubens's changes to Titian's composition, we directly compared multiple gaze parameters across the different images. We found that participants gazed at Eve's face more frequently in Rubens's painting than in Titian's. In addition, gaze positions were more tightly focused for the former than for the latter, consistent with different allocations of viewer interest. We also investigated how gaze fixation on Eve's face affected the perceptual visibility of the parrot in Rubens's composition and how the parrot's presence versus its absence impacted gaze dynamics. Taken together, our results demonstrate that Rubens's critical deviations from Titian's painting have powerful effects on viewers’ oculomotor behavior. 
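A minimal sketch of two gaze parameters of the kind compared in item 5, on simulated fixations: the proportion of fixations falling inside a face region of interest (ROI) and the overall dispersion of gaze positions. The ROI bounds, image size, and fixation data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
fixations = rng.uniform([0, 0], [1920, 1080], size=(200, 2))   # (x, y) in pixels
face_roi = {"x0": 800, "x1": 950, "y0": 200, "y1": 380}        # hypothetical face ROI

inside = ((fixations[:, 0] >= face_roi["x0"]) & (fixations[:, 0] <= face_roi["x1"]) &
          (fixations[:, 1] >= face_roi["y0"]) & (fixations[:, 1] <= face_roi["y1"]))
roi_proportion = inside.mean()

# Dispersion: mean distance of each fixation from the centroid of all fixations.
dispersion = np.linalg.norm(fixations - fixations.mean(axis=0), axis=1).mean()

print(f"fixations on face ROI: {roi_proportion:.1%}, dispersion: {dispersion:.1f} px")
```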