

Title: The gaze of a social monkey is perceptible to conspecifics and predators but not prey
Eye gaze is an important source of information for animals, implicated in communication, cooperation, hunting and antipredator behaviour. Gaze perception and its cognitive underpinnings are much studied in primates, but the specific features that are used to estimate gaze can be difficult to isolate behaviourally. We photographed 13 laboratory-housed tufted capuchin monkeys (Sapajus [Cebus] apella) to quantify chromatic and achromatic contrasts between their iris, pupil, sclera and skin. We used colour vision models to quantify the degree to which capuchin eye gaze is discriminable to capuchins, their predators and their prey. We found that capuchins, regardless of their colour vision phenotype, as well as their predators, were capable of effectively discriminating capuchin gaze across ecologically relevant distances. Their prey, in contrast, were not capable of discriminating capuchin gaze, even under relatively ideal conditions. These results suggest that specific features of primate eyes can influence gaze perception, both within and across species.
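The abstract does not specify the exact colour vision model used, but analyses of this kind are commonly based on the receptor-noise limited model (Vorobyev & Osorio 1998), which expresses the chromatic contrast between two eye regions in just-noticeable differences (JNDs). The sketch below is only an illustration of that calculation; the quantum catches, Weber fraction and receptor densities are placeholder values, not data from the study.

```python
import numpy as np

def chromatic_contrast_jnd(q1, q2, weber, densities):
    """Receptor-noise limited chromatic contrast between two stimuli, in JNDs.

    q1, q2    : quantum catches of each receptor class for the two stimuli
    weber     : Weber fraction of the most abundant receptor class
    densities : relative abundance of each receptor class
    """
    q1, q2 = np.asarray(q1, float), np.asarray(q2, float)
    densities = np.asarray(densities, float)
    # Receptor signals under Weber-Fechner (log) transduction
    df = np.log(q1) - np.log(q2)
    # Noise per receptor class scales inversely with the square root of its relative density
    e = weber / np.sqrt(densities / densities.max())
    if len(df) == 2:  # dichromatic phenotype
        return abs(df[0] - df[1]) / np.sqrt(e[0] ** 2 + e[1] ** 2)
    if len(df) == 3:  # trichromatic phenotype
        num = (e[0] * (df[2] - df[1])) ** 2 + (e[1] * (df[2] - df[0])) ** 2 + (e[2] * (df[1] - df[0])) ** 2
        den = (e[0] * e[1]) ** 2 + (e[0] * e[2]) ** 2 + (e[1] * e[2]) ** 2
        return np.sqrt(num / den)
    raise ValueError("expected 2 or 3 receptor classes")

# Illustrative (made-up) quantum catches for sclera vs. iris under a trichromatic
# capuchin phenotype; real values would come from calibrated photographs.
sclera = [0.42, 0.38, 0.30]
iris = [0.20, 0.15, 0.08]
print(chromatic_contrast_jnd(sclera, iris, weber=0.05, densities=[1, 2, 2]))
```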
Award ID(s):
1926327
NSF-PAR ID:
10339120
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the Royal Society B: Biological Sciences
Volume:
289
Issue:
1976
ISSN:
0962-8452
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Visual attention to facial features is an important way that group-living primate species gain knowledge about others. However, where this attention is focused on the face is influenced by contextual and social features, and emerging evidence in Pan species suggests that oxytocin, a hormone involved in forming and maintaining affiliative bonds among members of the same group, influences social attention as measured by eye gaze. Specifically, bonobos tend to focus on conspecifics’ eyes when viewing two-dimensional images, whereas chimpanzees focus more on the edges of the face. Moreover, exogenous oxytocin, which was hypothesized to increase eye contact in both species, instead enhanced this existing difference. We follow up on this to (1) determine the degree to which this Pan pattern generalizes across highly social, cooperative non-ape primates and (2) explore how exogenously administered vs. endogenously released oxytocin affects this behavior. To do so, we tracked gaze direction on a computerized social categorization task using conspecific faces in tufted capuchin monkeys (Sapajus [Cebus] apella) after (1) exogenously administering intranasal oxytocin using a nebulizer or (2) inducing an endogenous increase in oxytocin using fur-rubbing, a method previously validated to increase oxytocin in capuchins. Overall, we did not find a general tendency in the capuchins to look toward the eyes or mouth, but we found that oxytocin was related to looking behavior toward these regions, albeit not in a straightforward way. Considering the frequency of looking per trial, monkeys were more likely to look at the eye region in the fur-rubbing condition than in either the saline or exogenous oxytocin condition. However, in terms of duration of looking during trials in which they did look at the eye region, monkeys spent significantly less time looking at the eyes in both oxytocin conditions than in the saline condition. These results suggest that oxytocin did not necessarily enhance eye looking in capuchins, which is consistent with the results from Pan species, and that endogenous and exogenous oxytocin may differ in how they affect the allocation of social attention.
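To make the two looking measures concrete, the toy example below separates (1) the probability of looking at the eye region at least once per trial from (2) dwell time on the eyes restricted to trials containing at least one eye look. The numbers are invented and only illustrate the distinction drawn in the abstract.

```python
import pandas as pd

# Toy gaze records: one row per trial, recording whether the monkey looked at the
# eye region and for how long (seconds); condition labels follow the study design.
trials = pd.DataFrame({
    "condition": ["saline"] * 4 + ["exogenous_OT"] * 4 + ["fur_rubbing"] * 4,
    "looked_at_eyes": [1, 0, 1, 0,  1, 0, 0, 1,  1, 1, 1, 0],
    "eye_dwell_s":    [0.9, 0.0, 1.1, 0.0,  0.4, 0.0, 0.0, 0.5,  0.6, 0.5, 0.7, 0.0],
})

# Measure 1: probability of looking at the eye region at least once per trial.
freq = trials.groupby("condition")["looked_at_eyes"].mean()

# Measure 2: dwell time on the eyes, restricted to trials with at least one eye look.
dwell = trials[trials["looked_at_eyes"] == 1].groupby("condition")["eye_dwell_s"].mean()

print(pd.DataFrame({"p_look_per_trial": freq, "mean_dwell_given_look_s": dwell}))
```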
  2. Abstract Chimpanzee (Pan troglodytes) sclera appear much darker than the white sclera of human eyes, to such a degree that the direction of chimpanzee gaze may be concealed from conspecifics. Recent debate surrounding this topic has produced mixed results, with some evidence suggesting that (1) primate gaze is indeed concealed from their conspecifics, and (2) gaze colouration is among the suite of traits that distinguish uniquely social and cooperative humans from other primates (the cooperative eye hypothesis). Using a visual modelling approach that properly accounts for species-specific vision, we reexamined this topic to estimate the extent to which chimpanzee eye colouration is discriminable. We photographed the faces of captive chimpanzees and quantified the discriminability of their pupil, iris, sclera, and surrounding skin. We considered biases of cameras, lighting conditions, and commercial photography software, along with primate visual acuity, colour sensitivity, and discrimination ability. Our visual modelling of chimpanzee eye colouration suggests that chimpanzee gaze is visible to conspecifics at a range of distances (within approximately 10 m) appropriate for many species-typical behaviours. We also found that chimpanzee gaze is discriminable to the visual system of primates that chimpanzees prey upon, Colobus monkeys. Chimpanzee sclera colour does not effectively conceal gaze, and we discuss this result with regard to the cooperative eye hypothesis, the evolution of primate eye colouration, and methodological best practices for future primate visual ecology research.
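The abstract notes that discriminability was assessed across viewing distances with primate visual acuity taken into account. Independently of the colour modelling, the distance constraint can be illustrated with simple geometry: a feature's physical size and viewing distance determine the visual angle it subtends, which can be compared against a viewer's acuity limit. In the sketch below the acuity values, the exposed-sclera width and the one-cycle resolvability criterion are all rough assumptions for illustration, not values from the study.

```python
import math

def subtended_angle_deg(size_cm, distance_m):
    """Visual angle (degrees) subtended by a feature of size_cm viewed from distance_m."""
    return math.degrees(2 * math.atan((size_cm / 100) / (2 * distance_m)))

def is_resolvable(size_cm, distance_m, acuity_cpd):
    """Rough heuristic: a feature is resolvable if it spans at least one full cycle
    of the viewer's maximum resolvable spatial frequency (acuity in cycles/degree)."""
    return subtended_angle_deg(size_cm, distance_m) >= 1.0 / acuity_cpd

# Illustrative acuity values (cycles per degree); published estimates vary by study.
viewers = {"chimpanzee": 30.0, "colobus monkey": 25.0}
exposed_sclera_cm = 0.7  # assumed width of the exposed sclera region

for name, cpd in viewers.items():
    for d in (1, 5, 10, 20):
        print(f"{name:>15} at {d:>2} m: resolvable={is_resolvable(exposed_sclera_cm, d, cpd)}")
```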
  3. Abstract

    Objective. Reorienting is central to how humans direct attention to different stimuli in their environment. Previous studies typically employ well-controlled paradigms with limited eye and head movements to study the neural and physiological processes underlying attention reorienting. Here, we aim to better understand the relationship between gaze and attention reorienting using a naturalistic virtual reality (VR)-based target detection paradigm. Approach. Subjects were navigated through a city and instructed to count the number of targets that appeared on the street. Subjects performed the task in a fixed condition with no head movement and in a free condition where head movements were allowed. Electroencephalography (EEG), gaze and pupil data were collected. To investigate how neural and physiological reorienting signals are distributed across different gaze events, we used hierarchical discriminant component analysis (HDCA) to identify EEG- and pupil-based discriminating components. Mixed-effects general linear models (GLMs) were used to determine the correlation between these discriminating components and the timing of the different gaze events. HDCA was also used to combine EEG, pupil and dwell time signals to classify reorienting events. Main results. In both EEG and pupil, dwell time contributes most significantly to the reorienting signals. However, when dwell times were orthogonalized against other gaze events, the distributions of the reorienting signals differed across the two modalities, with EEG reorienting signals leading the pupil reorienting signals. We also found that the hybrid classifier that integrates EEG, pupil and dwell time features detects the reorienting signals in both the fixed (AUC = 0.79) and the free (AUC = 0.77) conditions. Significance. We show that the neural and ocular reorienting signals are distributed differently across gaze events when a subject is immersed in VR, but nevertheless can be captured and integrated to classify target vs. distractor objects to which the human subject orients.
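    HDCA is not spelled out in the abstract, but it is commonly implemented as a two-stage linear pipeline: one discriminator per time window across channels, followed by a second-level classifier over the window scores, evaluated with ROC AUC. The sketch below illustrates that structure on synthetic epoched data; the array shapes, the injected class effect and the use of logistic regression are assumptions for demonstration, not the study's pipeline (pupil or dwell-time features could be appended at the second stage).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for epoched EEG: (trials, channels, samples); label 1 = reorienting target.
n_trials, n_channels, n_samples, n_windows = 200, 32, 250, 10
X = rng.normal(size=(n_trials, n_channels, n_samples))
y = rng.integers(0, 2, n_trials)
X[y == 1, :8, 100:150] += 0.4  # inject a weak class-dependent deflection

def hdca_features(X, n_windows):
    """Stage 1 preprocessing: average each channel within consecutive time windows,
    yielding an array of shape (trials, windows, channels)."""
    wins = np.array_split(np.arange(X.shape[2]), n_windows)
    return np.stack([X[:, :, w].mean(axis=2) for w in wins], axis=1)

F = hdca_features(X, n_windows)
Xtr, Xte, ytr, yte = train_test_split(F, y, test_size=0.3, random_state=0, stratify=y)

# Stage 1: one linear discriminator per time window (channels -> interest score).
window_models = [LogisticRegression(max_iter=1000).fit(Xtr[:, w, :], ytr) for w in range(n_windows)]
scores_tr = np.column_stack([m.decision_function(Xtr[:, w, :]) for w, m in enumerate(window_models)])
scores_te = np.column_stack([m.decision_function(Xte[:, w, :]) for w, m in enumerate(window_models)])

# Stage 2: combine window scores into a single reorienting-event classifier.
combiner = LogisticRegression(max_iter=1000).fit(scores_tr, ytr)
print("AUC:", roc_auc_score(yte, combiner.decision_function(scores_te)))
```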

     
  4. Li-Jessen, Nicole Yee-Key (Ed.)
    The Earable device is a behind-the-ear wearable originally developed to measure cognitive function. Since Earable measures electroencephalography (EEG), electromyography (EMG), and electrooculography (EOG), it may also have the potential to objectively quantify facial muscle and eye movement activities relevant in the assessment of neuromuscular disorders. As an initial step to developing a digital assessment in neuromuscular disorders, a pilot study was conducted to determine whether the Earable device could be utilized to objectively measure facial muscle and eye movements intended to be representative of Performance Outcome Assessments (PerfOs), with tasks designed to model clinical PerfOs, referred to as mock-PerfO activities. The specific aims of this study were: to determine whether the Earable raw EMG, EOG, and EEG signals could be processed to extract features describing these waveforms; to determine Earable feature data quality, test-retest reliability, and statistical properties; to determine whether features derived from Earable could be used to distinguish between various facial muscle and eye movement activities; and to determine which features and feature types are important for mock-PerfO activity level classification. A total of N = 10 healthy volunteers participated in the study. Each study participant performed 16 mock-PerfO activities, including talking, chewing, swallowing, eye closure, gazing in different directions, puffing cheeks, chewing an apple, and making various facial expressions. Each activity was repeated four times in the morning and four times at night. A total of 161 summary features were extracted from the EEG, EMG, and EOG bio-sensor data. Feature vectors were used as input to machine learning models to classify the mock-PerfO activities, and model performance was evaluated on a held-out test set. Additionally, a convolutional neural network (CNN) was used to classify low-level representations of the raw bio-sensor data for each task, and model performance was correspondingly evaluated and compared directly to feature classification performance. Model prediction accuracy was used to quantitatively assess the Earable device's classification ability. Study results indicate that Earable can potentially quantify different aspects of facial and eye movements and may be used to differentiate mock-PerfO activities. Specifically, Earable was found to differentiate talking, chewing, and swallowing tasks from other tasks with observed F1 scores >0.9. While EMG features contribute to classification accuracy for all tasks, EOG features are important for classifying gaze tasks. Finally, we found that analysis with summary features outperformed a CNN for activity classification. We believe Earable may be used to measure cranial muscle activity relevant for neuromuscular disorder assessment. Classification performance of mock-PerfO activities with summary features enables a strategy for detecting disease-specific signals relative to controls, as well as the monitoring of intra-subject treatment responses. Further testing is needed to evaluate the Earable device in clinical populations and clinical development settings.
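    The 161 summary features and the specific models used in the study are not enumerated in this abstract, so the sketch below only illustrates the general workflow described: windowed summary features from multi-channel bio-sensor data, a classifier trained on a held-out split, and per-activity F1 scores. The feature definitions, activity labels, classifier choice and synthetic data are placeholders, not the study's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

ACTIVITIES = ["talking", "chewing", "swallowing", "gaze_left", "gaze_right", "cheek_puff"]

def summary_features(window):
    """Toy summary statistics of one bio-sensor window (channels x samples):
    per-channel mean, standard deviation, and mean absolute first difference."""
    return np.concatenate([window.mean(axis=1),
                           window.std(axis=1),
                           np.abs(np.diff(window, axis=1)).mean(axis=1)])

# Synthetic stand-in for segmented EEG/EMG/EOG windows: 6 channels x 500 samples per repetition.
X, y = [], []
for label, act in enumerate(ACTIVITIES):
    for _ in range(80):
        w = rng.normal(size=(6, 500)) + 0.3 * label  # crude class-dependent offset
        X.append(summary_features(w))
        y.append(label)
X, y = np.array(X), np.array(y)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
preds = clf.predict(Xte)
for act, f1 in zip(ACTIVITIES, f1_score(yte, preds, average=None)):
    print(f"{act:>12}: F1 = {f1:.2f}")
```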
  5. Abstract Background Many snakes are low-energy predators that use crypsis to ambush their prey. Most of these species feed very infrequently, are sensitive to the presence of larger vertebrates, such as humans, and spend large portions of their lifetime hidden. This makes direct observation of feeding behaviour challenging, and previous methodologies developed for documenting predation behaviours of free-ranging snakes have critical limitations. Animal-borne accelerometers have been increasingly used by ecologists to quantify activity and moment-to-moment behaviour of free-ranging animals, but their application in snakes has been limited to documenting basic behavioural states (e.g., active vs. non-active). High-frequency accelerometry can provide new insight into the behaviour of this important group of predators, and here we propose a new method to quantify key aspects of the feeding behaviour of three species of viperid snakes (Crotalus spp.) and assess the transferability of classification models across those species. Results We used open-source software to create species-specific models that classified locomotion, stillness, predatory striking, and prey swallowing with high precision, accuracy, and recall. In addition, we identified a low-cost, reliable, non-invasive attachment method for accelerometry devices to be placed anteriorly on snakes, as is likely necessary for accurately classifying distinct behaviours in these species. However, species-specific models had low transferability in our cross-species comparison. Conclusions Overall, our study demonstrates the strong potential for using accelerometry to document critical feeding behaviours in snakes that are difficult to observe directly. Furthermore, we provide an ‘end-to-end’ template for identifying important behaviours involved in the foraging ecology of viperids using high-frequency accelerometry. We highlight a method of attachment of accelerometers, a technique to simulate feeding events in captivity, and a model selection procedure using biologically relevant window sizes in open-access software for analyzing acceleration data (AcceleRater). Although we were unable to obtain a generalized model across species, if more data are incorporated from snakes across different body sizes and different contexts (i.e., moving through natural habitat), general models could potentially be developed that have higher transferability.
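    The study performed model selection in the web-based AcceleRater tool, which is not reproduced here; the sketch below only illustrates the generic workflow of classifying behaviours from windowed tri-axial acceleration features (per-axis means, standard deviations and overall dynamic body acceleration). The sampling rate, window length, behaviour labels and synthetic data are assumptions for illustration, not the study's settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
BEHAVIOURS = ["still", "locomotion", "strike", "swallow"]
FS = 50  # assumed sampling rate (Hz)

def window_features(acc):
    """Summary features for one window of tri-axial acceleration (samples x 3):
    per-axis mean (static), per-axis standard deviation, and ODBA."""
    static = acc.mean(axis=0)
    dynamic = acc - static
    odba = np.abs(dynamic).sum(axis=1).mean()
    return np.concatenate([static, acc.std(axis=0), [odba]])

# Synthetic stand-in for labelled 2-second windows (real labels would come from
# simulated feeding trials in captivity, as described in the study).
X, y = [], []
for label, beh in enumerate(BEHAVIOURS):
    for _ in range(100):
        acc = rng.normal(scale=0.1 + 0.3 * label, size=(2 * FS, 3))
        X.append(window_features(acc))
        y.append(label)
X, y = np.array(X), np.array(y)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
model = RandomForestClassifier(n_estimators=300, random_state=0).fit(Xtr, ytr)
print(classification_report(yte, model.predict(Xte), target_names=BEHAVIOURS))
```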