Abstract Faces are among the most important visual stimuli that humans perceive in everyday life. While an extensive literature has examined emotional processing and social evaluations of faces, most studies have examined either topic using unimodal approaches. In this review, we promote the use of multimodal cognitive neuroscience approaches to study these processes, using two lines of research as examples: ambiguity in facial expressions of emotion and social trait judgment of faces. In the first set of studies, we identified an event-related potential that signals emotion ambiguity using electroencephalography, and we found convergent neural responses to emotion ambiguity using functional neuroimaging and single-neuron recordings. In the second set of studies, we discuss how different neuroimaging and personality-dimensional approaches together provide new insights into social trait judgments of faces. In both sets of studies, we provide an in-depth comparison between neurotypicals and people with autism spectrum disorder. We offer a computational account of the behavioral and neural markers that distinguish face processing between the two groups. Finally, we suggest new practices for studying the emotional processing and social evaluations of faces. All data discussed in the case studies of this review are publicly available.
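One common way to operationalize emotion ambiguity in this line of work is to morph faces between expression anchors (e.g., fear and happiness) and quantify ambiguity from the distribution of categorical judgments each face elicits. The minimal Python sketch below illustrates an entropy-based version of this idea; the specific measure is an illustrative assumption, not necessarily the review's exact definition.

```python
import numpy as np

def response_entropy(choice_counts):
    """Shannon entropy (bits) of categorical emotion judgments for one face.

    Hypothetical operationalization: a face judged 'fearful' on every trial
    has entropy 0 (unambiguous); an even fear/happy split has entropy 1.
    """
    p = np.asarray(choice_counts, dtype=float)
    p = p / p.sum()          # normalize counts to a probability distribution
    p = p[p > 0]             # drop zero-probability categories (0 * log 0 := 0)
    return -np.sum(p * np.log2(p))

print(response_entropy([20, 0]))   # 0.0 -> unambiguous face
print(response_entropy([10, 10]))  # 1.0 -> maximally ambiguous face
```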
Functional connectivity between the amygdala and prefrontal cortex underlies processing of emotion ambiguity
Abstract Processing facial expressions of emotion draws on a distributed brain network. In particular, judging ambiguous facial emotions involves coordination between multiple brain areas. Here, we applied multimodal functional connectivity analysis to achieve a network-level understanding of the neural mechanisms underlying perceptual ambiguity in facial expressions. We found directional effective connectivity between the amygdala, dorsomedial prefrontal cortex (dmPFC), and ventromedial PFC, supporting both bottom-up affective processes for ambiguity representation/perception and top-down cognitive processes for ambiguity resolution/decision. Direct recordings from human neurosurgical patients showed that the responses of amygdala and dmPFC neurons were modulated by the level of emotion ambiguity, and amygdala neurons responded earlier than dmPFC neurons, reflecting a bottom-up process for ambiguity processing. We further found parietal-frontal coherence and delta-alpha cross-frequency coupling involved in encoding emotion ambiguity. We replicated the EEG coherence result in independent experiments and further showed that this coherence was modulated. EEG source connectivity revealed that the dmPFC exerted top-down regulation over activity in other brain regions. Lastly, we showed altered behavioral responses in neuropsychiatric patients who may have dysfunctions in amygdala-PFC functional connectivity. Together, using multimodal experimental and analytical approaches, we have delineated a neural network that underlies processing of emotion ambiguity.
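As an illustration of the two spectral measures named above, the sketch below computes alpha-band coherence between two channels and a Canolty-style delta-phase/alpha-amplitude modulation index. The band edges, filter settings, and synthetic signals are assumptions for demonstration, not the paper's actual analysis pipeline.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert, coherence

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase band-pass filter (second-order sections for stability)."""
    sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def delta_alpha_coupling(x, fs):
    """Canolty-style modulation index: delta (1-4 Hz) phase to alpha (8-12 Hz) amplitude."""
    delta_phase = np.angle(hilbert(bandpass(x, 1.0, 4.0, fs)))
    alpha_amp = np.abs(hilbert(bandpass(x, 8.0, 12.0, fs)))
    return np.abs(np.mean(alpha_amp * np.exp(1j * delta_phase)))

def alpha_band_coherence(x, y, fs):
    """Mean magnitude-squared coherence between two channels within the alpha band."""
    f, cxy = coherence(x, y, fs=fs, nperseg=int(2 * fs))
    band = (f >= 8.0) & (f <= 12.0)
    return cxy[band].mean()

# Synthetic signals standing in for a parietal and a frontal EEG channel.
fs = 250.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
parietal = np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
frontal = np.sin(2 * np.pi * 10 * t + 0.5) + rng.standard_normal(t.size)

print(alpha_band_coherence(parietal, frontal, fs))
print(delta_alpha_coupling(parietal, fs))
```

The second-order-sections filter form is used here because narrow low-frequency bands (e.g., 1-4 Hz at a 250 Hz sampling rate) can be numerically unstable in transfer-function form.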
- Award ID(s): 1945230
- PAR ID: 10485151
- Publisher / Repository: Nature
- Date Published:
- Journal Name: Translational Psychiatry
- Volume: 13
- Issue: 1
- ISSN: 2158-3188
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Abstract Face perception is a fundamental aspect of human social interaction, yet most research on this topic has focused on single modalities and specific aspects of face perception. Here, we present a comprehensive multimodal dataset for examining facial emotion perception and judgment. This dataset includes EEG data from 97 unique neurotypical participants across 8 experiments, fMRI data from 19 neurotypical participants, single-neuron data from 16 neurosurgical patients (22 sessions), eye tracking data from 24 neurotypical participants, behavioral and eye tracking data from 18 participants with ASD and 15 matched controls, and behavioral data from 3 rare patients with focal bilateral amygdala lesions. Notably, participants from all modalities performed the same task. Overall, this multimodal dataset provides a comprehensive exploration of facial emotion perception, emphasizing the importance of integrating multiple modalities to gain a holistic understanding of this complex cognitive process. The dataset serves as a key missing link between the human neuroimaging and neurophysiology literatures and facilitates the study of neuropsychiatric populations.
Abstract Most research in the field of affective computing has focused on detecting and classifying human emotions through electroencephalogram (EEG) or facial expressions. Designing multimedia content to evoke certain emotions has largely been guided by manual ratings provided by users. Here we present insights from the correlation of affective features across three modalities: affective multimedia content, EEG, and facial expressions. Interestingly, low-level audio-visual features, such as the contrast and homogeneity of the video and the tone of the audio in the movie clips, are most correlated with changes in facial expressions and EEG. We also identify the regions of the human face and the brain (in addition to the EEG frequency bands) that are most representative of affective responses. Computational modeling across the three modalities showed a high correlation between features from these regions and user-reported affective labels. Finally, the correlations between different layers of convolutional neural networks, with EEG and face images as input, provide insights into human affect. Together, these findings will assist in (1) designing more effective multimedia content to engage or influence viewers, (2) understanding the brain/body biomarkers of affect, and (3) developing new brain-computer interfaces as well as facial-expression-based algorithms to read viewers' emotional responses.
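To make the feature-correlation idea concrete, here is a minimal sketch that computes two of the low-level visual features named above (contrast and homogeneity, via a gray-level co-occurrence matrix) and correlates one of them with per-clip affect ratings. The frames, ratings, and GLCM parameter choices are hypothetical stand-ins, not the study's actual stimuli or measures.

```python
import numpy as np
from scipy.stats import pearsonr
from skimage.feature import graycomatrix, graycoprops

def frame_texture_features(frame_gray):
    """GLCM contrast and homogeneity for one 8-bit grayscale frame."""
    glcm = graycomatrix(frame_gray, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    return (graycoprops(glcm, "contrast")[0, 0],
            graycoprops(glcm, "homogeneity")[0, 0])

# Synthetic stand-ins: one representative grayscale frame per movie clip,
# plus a hypothetical per-clip self-reported arousal rating (1-9 scale).
rng = np.random.default_rng(1)
frames = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(20)]
ratings = rng.uniform(1, 9, size=20)

contrast = np.array([frame_texture_features(f)[0] for f in frames])
r, p = pearsonr(contrast, ratings)
print(f"contrast vs. arousal: r={r:.2f}, p={p:.3f}")
```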
Chronic exposure to uncontrollable stress causes loss of spines and dendrites in the prefrontal cortex (PFC), a recently evolved brain region that provides top-down regulation of thought, action, and emotion. PFC neurons generate top-down goals through recurrent excitatory connections on spines. This persistent firing is the foundation for higher cognition, including working memory and abstract thought. However, exposure to acute uncontrollable stress drives high levels of catecholamine release in the PFC, which activates feedforward calcium-cAMP signaling pathways that open nearby potassium channels, rapidly weakening synaptic connectivity and reducing persistent firing. Chronic stress exposure can further exacerbate these signaling events, leading to loss of spines and marked cognitive impairment. In this review, we discuss how stress signaling mechanisms can lead to spine loss, including changes in BDNF-mTORC1 signaling, calcium homeostasis, actin dynamics, and mitochondrial actions that engage glial removal of spines through inflammatory signaling. Stress signaling events may be amplified in PFC spines by cAMP magnification of internal calcium release. As PFC dendritic spine loss is a feature of many cognitive disorders, understanding how stress affects the structure and function of the PFC will help inform strategies for treatment and prevention.
Abstract Faces are salient social stimuli that attract a stereotypical pattern of eye movements. The human amygdala and hippocampus are involved in various aspects of face processing; however, it remains unclear how they encode the content of fixations when viewing faces. To answer this question, we employed single-neuron recordings with simultaneous eye tracking while participants viewed natural face stimuli. We found a class of neurons in the human amygdala and hippocampus that encoded salient facial features such as the eyes and mouth. With a control experiment using non-face stimuli, we further showed that this feature selectivity was specific to faces. We also found another population of neurons that differentiated saccades to the eyes vs. the mouth. Population decoding confirmed these results and further revealed the temporal dynamics of facial feature coding. Interestingly, we found that the amygdala and hippocampus played different roles in encoding facial features. Lastly, we revealed two functional roles of feature-selective neurons: (1) they encoded the salient region for face recognition, and (2) they were related to perceived social trait judgments. Together, our results link eye movements with neural face processing and provide important mechanistic insights into human face perception.
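A minimal sketch of the population-decoding step described above: a cross-validated classifier predicting the fixation target (eyes vs. mouth) from pseudo-population spike counts. All data here are synthetic, and the label scheme and choice of decoder are illustrative assumptions rather than the study's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic pseudo-population: spike counts for n_trials fixations x n_neurons.
rng = np.random.default_rng(2)
n_trials, n_neurons = 200, 50
labels = rng.integers(0, 2, size=n_trials)         # 0 = eyes, 1 = mouth (hypothetical)
rates = rng.poisson(5, size=(n_trials, n_neurons)).astype(float)
rates[labels == 1, :10] += 3                       # a small feature-selective subpopulation

# 5-fold cross-validated decoding of fixation target from population activity.
decoder = LogisticRegression(max_iter=1000)
scores = cross_val_score(decoder, rates, labels, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```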