
This content will become publicly available on December 1, 2024

Title: Functional connectivity between the amygdala and prefrontal cortex underlies processing of emotion ambiguity

Processing facial expressions of emotion draws on a distributed brain network. In particular, judging ambiguous facial emotions involves coordination between multiple brain areas. Here, we applied multimodal functional connectivity analysis to achieve a network-level understanding of the neural mechanisms underlying perceptual ambiguity in facial expressions. We found directional effective connectivity between the amygdala, dorsomedial prefrontal cortex (dmPFC), and ventromedial PFC, supporting both bottom-up affective processes for ambiguity representation/perception and top-down cognitive processes for ambiguity resolution/decision. Direct recordings from human neurosurgical patients showed that the responses of amygdala and dmPFC neurons were modulated by the level of emotion ambiguity, and that amygdala neurons responded earlier than dmPFC neurons, reflecting the bottom-up process of ambiguity processing. We further found that parietal-frontal coherence and delta-alpha cross-frequency coupling were involved in encoding emotion ambiguity. We replicated the EEG coherence result in independent experiments and further showed how this coherence is modulated. EEG source connectivity revealed that the dmPFC exerts top-down regulation over activity in other brain regions. Lastly, we showed altered behavioral responses in neuropsychiatric patients who may have dysfunctional amygdala-PFC connectivity. Together, using multimodal experimental and analytical approaches, we have delineated a neural network that underlies the processing of emotion ambiguity.
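
As an illustration of the two spectral measures named above (inter-regional coherence and delta-alpha cross-frequency coupling), the sketch below computes magnitude-squared coherence between two simulated channels and a Canolty-style phase-amplitude modulation index. This is a minimal toy analysis on synthetic data, not the authors' pipeline; the sampling rate, band limits, and simulated "parietal"/"frontal" signals are all assumptions made for illustration.

```python
import numpy as np
from scipy.signal import butter, coherence, hilbert, sosfiltfilt

fs = 250  # Hz, assumed sampling rate
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)

# Two simulated channels sharing a 10 Hz (alpha) component plus independent noise
shared_alpha = np.sin(2 * np.pi * 10 * t)
parietal = shared_alpha + 0.5 * rng.standard_normal(t.size)
frontal = shared_alpha + 0.5 * rng.standard_normal(t.size)

# Magnitude-squared coherence; the shared alpha component yields high coherence at 10 Hz
f, Cxy = coherence(parietal, frontal, fs=fs, nperseg=2 * fs)
coh_10 = Cxy[np.argmin(np.abs(f - 10.0))]

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

# Delta-phase to alpha-amplitude coupling: Canolty-style modulation index,
# |mean(amplitude * exp(i * phase))|, on a single channel
delta_phase = np.angle(hilbert(bandpass(frontal, 1, 4, fs)))
alpha_amp = np.abs(hilbert(bandpass(frontal, 8, 12, fs)))
mi = np.abs(np.mean(alpha_amp * np.exp(1j * delta_phase)))
```

In real analyses the raw modulation index is compared against a surrogate distribution (e.g., from time-shifted phase series) to assess significance, since its magnitude alone is not interpretable.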

Journal Name: Translational Psychiatry
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

    Faces are among the most important visual stimuli that humans perceive in everyday life. While an extensive literature has examined emotional processing and social evaluations of faces, most studies have examined either topic using unimodal approaches. In this review, we promote the use of multimodal cognitive neuroscience approaches to study these processes, using two lines of research as examples: ambiguity in facial expressions of emotion and social trait judgment of faces. In the first set of studies, we identified an event-related potential that signals emotion ambiguity using electroencephalography, and we found convergent neural responses to emotion ambiguity using functional neuroimaging and single-neuron recordings. In the second set of studies, we discuss how different neuroimaging and personality-dimensional approaches together provide new insights into social trait judgments of faces. In both sets of studies, we provide an in-depth comparison between neurotypicals and people with autism spectrum disorder, and we offer a computational account of the behavioral and neural markers of the differences in facial processing between the two groups. Finally, we suggest new practices for studying the emotional processing and social evaluations of faces. All data discussed in the case studies of this review are publicly available.

  2. Abstract

    Face perception is a fundamental aspect of human social interaction, yet most research on this topic has focused on single modalities and specific aspects of face perception. Here, we present a comprehensive multimodal dataset for examining facial emotion perception and judgment. This dataset includes EEG data from 97 unique neurotypical participants across 8 experiments, fMRI data from 19 neurotypical participants, single-neuron data from 16 neurosurgical patients (22 sessions), eye tracking data from 24 neurotypical participants, behavioral and eye tracking data from 18 participants with autism spectrum disorder (ASD) and 15 matched controls, and behavioral data from 3 rare patients with focal bilateral amygdala lesions. Notably, participants from all modalities performed the same task. Overall, this multimodal dataset provides a comprehensive exploration of facial emotion perception, emphasizing the importance of integrating multiple modalities to gain a holistic understanding of this complex cognitive process. This dataset serves as a key missing link between the human neuroimaging and neurophysiology literatures, and facilitates the study of neuropsychiatric populations.

  3. Abstract

    Goal-directed behavior is dependent on neuronal activity in the prefrontal cortex (PFC) and extended frontostriatal circuitry. Stress and stress-related disorders are associated with impaired frontostriatal-dependent cognition. Our understanding of the neural mechanisms that underlie stress-related cognitive impairment is limited, with the majority of prior research focused on the PFC. To date, the actions of stress across cognition-related frontostriatal circuitry are unknown. To address this gap, the current studies examined the effects of acute noise-stress on the spiking activity of neurons and local field potential oscillatory activity within the dorsomedial PFC (dmPFC) and dorsomedial striatum (dmSTR) in rats engaged in a test of spatial working memory. Stress robustly suppressed responses of both dmPFC and dmSTR neurons strongly tuned to key task events (delay, reward). Additionally, stress strongly suppressed delay-related, but not reward-related, theta and alpha spectral power within, and synchrony between, the dmPFC and dmSTR. These observations provide the first demonstration that stress disrupts the neural coding and functional connectivity of key task events, particularly delay, within cognition-supporting dorsomedial frontostriatal circuitry. These results suggest that stress-related degradation of neural coding within both the PFC and striatum likely contributes to the cognition-impairing effects of stress.

  4. Abstract

    Most of the research in the field of affective computing has focused on detecting and classifying human emotions through electroencephalogram (EEG) or facial expressions. Designing multimedia content to evoke certain emotions has been largely motivated by manual ratings provided by users. Here we present insights from the correlation of affective features between three modalities: affective multimedia content, EEG, and facial expressions. Interestingly, low-level audio-visual features, such as the contrast and homogeneity of the video and the tone of the audio in the movie clips, are most correlated with changes in facial expressions and EEG. We also detect the regions of the human face and the brain (in addition to the EEG frequency bands) that are most representative of affective responses. Computational modeling across the three modalities showed a high correlation between features from these regions and user-reported affective labels. Finally, the correlation between different layers of convolutional neural networks, with EEG and face images as input, provides insights into human affect. Together, these findings will assist in (1) designing more effective multimedia content to engage or influence viewers, (2) understanding the brain/body biomarkers of affect, and (3) developing newer brain-computer interfaces as well as facial-expression-based algorithms to read the emotional responses of viewers.
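
    The core analysis this abstract describes, correlating a low-level content feature (e.g., video contrast) with EEG and facial responses across clips, can be sketched as below. All data here are simulated and every variable name is a hypothetical stand-in for illustration; the actual study's features and preprocessing are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical per-clip feature: mean visual contrast of each movie clip
    n_clips = 40
    contrast = rng.uniform(0.2, 0.8, n_clips)

    # Simulated responses that partly track contrast (illustrative only)
    eeg_alpha_power = 0.6 * contrast + 0.2 * rng.standard_normal(n_clips)
    face_activation = 0.5 * contrast + 0.3 * rng.standard_normal(n_clips)

    def pearson_r(x, y):
        """Pearson correlation coefficient between two 1-D arrays."""
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        return float(np.mean(x * y))

    # Feature-to-response correlations, one per modality
    r_eeg = pearson_r(contrast, eeg_alpha_power)
    r_face = pearson_r(contrast, face_activation)
    ```

    In practice such correlations are computed for many features and channels at once, so multiple-comparison correction (e.g., FDR) is applied before interpreting any single coefficient.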

  5. Chronic exposure to uncontrollable stress causes loss of spines and dendrites in the prefrontal cortex (PFC), a recently evolved brain region that provides top-down regulation of thought, action, and emotion. PFC neurons generate top-down goals through recurrent excitatory connections on spines, and this persistent firing is the foundation for higher cognition, including working memory and abstract thought. However, exposure to acute uncontrollable stress drives high levels of catecholamine release in the PFC, which activates feedforward calcium-cAMP signaling pathways that open nearby potassium channels, rapidly weakening synaptic connectivity and reducing persistent firing. Chronic stress exposure can further exacerbate these signaling events, leading to spine loss and marked cognitive impairment. In this review, we discuss how stress signaling mechanisms can lead to spine loss, including changes to BDNF-mTORC1 signaling, calcium homeostasis, actin dynamics, and mitochondrial actions that engage glial removal of spines through inflammatory signaling. Stress signaling events may be amplified in PFC spines because cAMP magnifies internal calcium release. As PFC dendritic spine loss is a feature of many cognitive disorders, understanding how stress affects the structure and function of the PFC will help inform strategies for treatment and prevention.