The human face conveys a wealth of information, including traits, states, and intentions. Just as fundamentally, the face also signals the humanity of a person. In the current research we report two experiments providing evidence that disruptions of configural face encoding affect the temporal dynamics of categorization during attempts to distinguish human from non-human faces. Specifically, the present experiments utilize mouse-tracking and find that face inversion elicits confusion between human and non-human categories early in the processing of human faces. This work affords the first examination of how facial inversion affects the dynamic processes underlying categorization of human and non-human faces.
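Mouse-tracking captures this early category confusion by measuring how far a response trajectory bends toward the competing response option. A minimal sketch of one standard index, maximum deviation, follows; the data and code are illustrative only, not the authors' analysis pipeline:

```python
import numpy as np

def max_deviation(xs, ys):
    """Maximum perpendicular deviation of a mouse trajectory from the
    straight line joining its start and end points, a standard
    mouse-tracking index of attraction toward a competing category."""
    p0 = np.array([xs[0], ys[0]], dtype=float)
    p1 = np.array([xs[-1], ys[-1]], dtype=float)
    line = p1 - p0
    pts = np.stack([np.asarray(xs, float), np.asarray(ys, float)], axis=1) - p0
    # 2-D cross product = signed perpendicular distance times |line|
    dists = np.abs(pts[:, 0] * line[1] - pts[:, 1] * line[0]) / np.linalg.norm(line)
    return dists.max()

# A direct movement deviates by zero; a curved one does not.
straight = max_deviation([0, 1, 2], [0, 1, 2])
curved = max_deviation([0, 2, 2], [0, 0, 2])
```

Larger maximum deviation on inverted-face trials would reflect the kind of early human/non-human confusion reported above.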
Inversion Reduces Sensitivity to Complex Emotions in Eye Regions
Inferring humans’ complex emotions is challenging but can be done with surprisingly limited emotion signals, including the eyes alone. Here, we test for a role of lower-level perceptual processes in such sensitivity using the well-validated Reading the Mind in the Eyes task. Over three experiments, we manipulated configural processing to show that it contributes to sensitivity to complex emotion from human eye regions. Specifically, inversion, a well-established manipulation affecting configural processing, undermined sensitivity to complex emotions in eye regions (Experiments 1-3). Inversion also undermined sensitivity to nonmentalistic information from human eye regions (gender; Experiment 2) but did not affect sensitivity to attributes of nonhuman animals (Experiment 3). Taken together, the current findings provide evidence for the novel hypothesis that configural processing facilitates sensitivity to complex emotions conveyed by the eyes via the broader extraction of socially relevant information.
- Award ID(s): 1748461
- PAR ID: 10396095
- Date Published:
- Journal Name: Social Cognition
- Volume: 40
- Issue: 3
- ISSN: 0278-016X
- Page Range / eLocation ID: 302 to 315
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Reading is a highly complex learned skill in which humans move their eyes three to four times every second in response to visual and cognitive processing. The consensus view is that the details of these rapid eye-movement decisions—which part of a word to target with a saccade—are determined solely by low-level oculomotor heuristics. But maximally efficient saccade targeting would be sensitive to ongoing word identification, sending the eyes farther into a word the farther its identification has already progressed. Here, using a covert text-shifting paradigm, we showed just such a statistical relationship between saccade targeting in reading and trial-to-trial variability in cognitive processing. This result suggests that, rather than relying purely on heuristics, the human brain has learned to optimize eye movements in reading even at the fine-grained level of character-position targeting, reflecting efficiency-based sensitivity to ongoing cognitive processing.
- Meta-analytic techniques for mining the neuroimaging literature continue to exert an impact on our conceptualization of functional brain networks contributing to human emotion and cognition. Traditional theories regarding the neurobiological substrates contributing to affective processing are shifting from regional- towards more network-based heuristic frameworks. To elucidate differential brain network involvement linked to distinct aspects of emotion processing, we applied an emergent meta-analytic clustering approach to the extensive body of affective neuroimaging results archived in the BrainMap database. Specifically, we performed hierarchical clustering on the modeled activation maps from 1,747 experiments in the affective processing domain, resulting in five meta-analytic groupings of experiments demonstrating whole-brain recruitment. Behavioral inference analyses conducted for each of these groupings suggested dissociable networks supporting: (1) visual perception within primary and associative visual cortices, (2) auditory perception within primary auditory cortices, (3) attention to emotionally salient information within insular, anterior cingulate, and subcortical regions, (4) appraisal and prediction of emotional events within medial prefrontal and posterior cingulate cortices, and (5) induction of emotional responses within amygdala and fusiform gyri. These meta-analytic outcomes are consistent with a contemporary psychological model of affective processing in which emotionally salient information from perceived stimuli is integrated with previous experiences to engender a subjective affective response. This study highlights the utility of using emergent meta-analytic methods to inform and extend psychological theories and suggests that emotions are manifest as the eventual consequence of interactions between large-scale brain networks.
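The core computation described in that abstract, hierarchical clustering over per-experiment activation maps cut into five groupings, can be sketched as follows. The data are toy random matrices standing in for BrainMap activation maps, and SciPy is assumed to be available:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Toy stand-in for modeled activation maps: one row per experiment,
# one column per voxel (random values, NOT actual BrainMap data).
maps = rng.random((60, 500))

# Ward linkage over the per-experiment maps, then cut the dendrogram
# so at most five groupings remain, mirroring the five meta-analytic
# clusters reported in the abstract.
Z = linkage(maps, method="ward")
labels = fcluster(Z, t=5, criterion="maxclust")
```

Each experiment's cluster label could then feed a behavioral inference analysis of the kind the study describes.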
- Pedagogical agents are animated characters embedded within an e-learning environment to facilitate learning. With the growing understanding of the complex interplay between emotions and cognition, there is a need to design agents that can provide believable simulated emotional interactions with the learner. Best practices from the animation industry could be used to improve the believability of the agents. A well-known best practice is that the movements of limbs/torso/head play the most important role in conveying the character's emotion, followed by eyes/face and lip sync, respectively, in a long/medium shot. The researchers' study tested the validity of this best practice using statistical methods. It investigated the contribution of 3 body channels (torso/limbs/head, face, speech) to the expression of 5 emotions (happiness, sadness, anger, fear, surprise) in a stylized agent in a full body shot. Findings confirm the biggest contributor to the perceived believability of the animated emotion is the character's body, followed by face and speech respectively, across 4 out of 5 emotions.
- We present a System for Processing In-situ Bio-signal Data for Emotion Recognition and Sensing (SPIDERS), a low-cost, wireless, glasses-based platform for continuous in-situ monitoring of a user's facial expressions (apparent emotions) and real emotions. We present algorithms to provide four core functions (eye shape and eyebrow movements, pupillometry, zygomaticus muscle movements, and head movements), using the bio-signals acquired from three non-contact sensors (IR camera, proximity sensor, IMU). SPIDERS distinguishes between different classes of apparent and real emotion states based on the aforementioned four bio-signals. We prototype advanced functionalities, including facial expression detection and real emotion classification, with a landmarks- and optical-flow-based facial expression detector that leverages changes in a user's eyebrows and eye shapes to achieve up to 83.87% accuracy, as well as a pupillometry-based real emotion classifier with higher accuracy than other low-cost wearable platforms that use sensors requiring skin contact. SPIDERS costs less than $20 to assemble and can continuously run for up to 9 hours before recharging. We demonstrate that SPIDERS is a truly wireless and portable platform that has the capability to impact a wide range of applications, where knowledge of the user's emotional state is critical.
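The eyebrow/eye-shape channel in that system reduces to estimating frame-to-frame facial motion. As a self-contained illustration only (not the SPIDERS optical-flow pipeline, which is not reproduced in this listing), a global vertical shift between two grayscale frames can be recovered with phase correlation in pure NumPy:

```python
import numpy as np

def vertical_shift(frame1, frame2):
    """Estimate the global vertical shift (in pixels) mapping frame1
    onto frame2 via phase correlation. Negative = upward motion."""
    F1 = np.fft.fft2(frame1)
    F2 = np.fft.fft2(frame2)
    cross = F2 * np.conj(F1)
    # Normalize to the cross-power spectrum; the inverse FFT peaks
    # at the displacement between the two frames.
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, _ = np.unravel_index(corr.argmax(), corr.shape)
    n = frame1.shape[0]
    # Wrap large indices around to negative (upward) shifts.
    return dy - n if dy > n // 2 else dy

rng = np.random.default_rng(1)
frame1 = rng.random((64, 64))
frame2 = np.roll(frame1, -2, axis=0)  # simulate a 2-pixel upward movement
shift = vertical_shift(frame1, frame2)
```

A sustained negative shift in an eyebrow region would correspond to the kind of eyebrow-raise signal the platform's detector consumes.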