

Title: Self-Conscious Emotions and the Right Fronto-Temporal and Right Temporal Parietal Junction
For more than two decades, research on both clinical and non-clinical populations has suggested a key role for specific brain regions in the regulation of self-conscious emotions. Both the expression and the interpretation of self-conscious emotions are thought to be critical in humans for action planning and response, communication, learning, parenting, and most social encounters. Empathy, guilt, jealousy, shame, and pride are all categorized as self-conscious emotions, and all are crucial components of one's sense of self. Abundant evidence points to right fronto-temporal involvement in the integration of the cognitive processes underlying the expression of these emotions. Numerous regions within the right hemisphere have been identified, including the right temporal parietal junction (rTPJ), the orbitofrontal cortex (OFC), and the inferior parietal lobule (IPL). In this review, we examine patient cases alongside clinical and non-clinical studies, highlight the brain regions pivotal to the right-hemispheric dominance observed in the neural correlates of self-conscious emotions, and discuss the potential role that self-conscious emotions play in evolution.
Award ID(s):
1909824
NSF-PAR ID:
10395750
Author(s) / Creator(s):
Date Published:
Journal Name:
Brain Sciences
Volume:
12
Issue:
2
ISSN:
2076-3425
Page Range / eLocation ID:
138
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Non-conscious processing of human memory has traditionally been difficult to measure objectively and thus to understand. A prior study of hippocampal amnesia patients (N = 3) and healthy controls (N = 6) used a novel procedure for capturing neural correlates of implicit memory with event-related potentials (ERPs): old and new items were equated for varying levels of memory awareness, with hippocampal-dependent ERP differences observed from 400 to 800 ms over bilateral parietal regions. The current investigation sought to address the limitations of that study by increasing the sample of healthy subjects (N = 54), applying new controls for construct validity, and developing an improved, open-source tool for automated analysis of the procedure used to equate levels of memory awareness. Results faithfully reproduced the prior ERP findings of parietal effects, which a series of systematic control analyses confirmed were neither driven by nor contaminated by explicit memory. Implicit memory effects extended from 600 to 1000 ms, localized to right parietal sites. These ERP effects were behaviorally relevant and specific in predicting implicit memory response times, and were topographically dissociable from other traditional ERP measures of implicit memory (misses vs. correct rejections), which instead occurred over left parietal regions. The results suggest, first, that equating for reported awareness of memory strength is a valid, powerful new method for revealing neural correlates of non-conscious human memory and, second, via the behavioral correlations, that these implicit effects reflect a pure form of priming, whereas misses represent fluency leading to the subjective experience of familiarity.
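The core ERP measure described above — a difference wave between conditions, summarized as mean amplitude in a late time window at parietal sites — can be sketched as follows. This is an illustrative Python example on synthetic data; the sampling rate, channel layout, trial counts, and injected effect size are assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                             # sampling rate (Hz), assumed
t = np.arange(-0.2, 1.2, 1 / fs)     # epoch from -200 ms to 1200 ms

# Synthetic trials: (trials, channels, samples); channel 0 = right parietal.
old = rng.normal(0, 1, (40, 4, t.size))
new = rng.normal(0, 1, (40, 4, t.size))
old[:, 0, (t >= 0.6) & (t <= 1.0)] += 1.5   # inject a late positive effect

# ERP = trial average; the implicit-memory effect is the old-minus-new
# difference wave, summarized as mean amplitude in the 600-1000 ms window.
erp_old = old.mean(axis=0)
erp_new = new.mean(axis=0)
diff = erp_old - erp_new

win = (t >= 0.6) & (t <= 1.0)
effect = diff[0, win].mean()         # right-parietal window-mean amplitude
print(f"old-new effect at right parietal site: {effect:.2f} (a.u.)")
```

In a real analysis the same window mean would be computed per subject and submitted to group statistics; the synthetic effect here simply makes the measure visible.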

     
  2. Introduction: Back pain is one of the most common causes of pain in the United States. Spinal cord stimulation (SCS) is an intervention for patients with chronic back pain (CBP). However, SCS decreases pain in only 58% of patients and relies on self-reported pain scores as outcome measures. A trial SCS device is temporarily implanted for seven days to help determine whether a permanent SCS is needed; patients who obtain a >50% reduction in pain from the trial stimulator are eligible for permanent implantation. However, self-reported measures reveal little about how mechanisms in the brain are altered. Other measurements of pain intensity, onset, medication, disability, depression, and anxiety have been used with machine learning to predict outcomes, with accuracies <70%. We aim to predict long-term SCS responders at 6 months using baseline resting EEG and machine learning. Materials and Methods: We obtained 10 minutes of resting electroencephalography (EEG) and pain questionnaires from nine participants with CBP at two time points: (1) pre-trial baseline and (2) six months after SCS permanent implant surgery. Subjects were designated as high or moderate responders based on the amount of pain relief provided by the long-term (post six months) SCS, with pain scored on a scale of 0-10, where 0 is no pain and 10 is intolerable. We used the resting EEG from baseline to predict long-term treatment outcome. Resting EEG data were fed through a pipeline for classification and for mapping dipole sources. EEG signals were preprocessed using the EEGLAB toolbox. Independent component analysis and dipole fitting were used to linearly unmix the signal and to map dipole sources in the brain. Spectral analysis was performed to obtain the frequency distribution of the signal. Each power band, delta (1-4 Hz), theta (4-8 Hz), alpha (8-13 Hz), beta (13-30 Hz), and gamma (30-100 Hz), as well as the entire spectrum (1-100 Hz), was used for classification.
Furthermore, dipole sources were ranked by classification feature weight to determine the significance of specific brain regions. We used support vector machines to predict pain outcomes. Results and Discussion: We found that the higher-frequency power bands provide an overall classification accuracy of 88.89%. Differences in power between moderate and high responders are seen in both the frontal and parietal regions for theta, alpha, beta, and the entire spectrum (Fig. 1). This can potentially be used to predict patient response to SCS. Conclusions: We found evidence of decreased power in theta, alpha, beta, and the entire spectrum in anterior regions of the parietal cortex and posterior regions of the frontal cortex between moderate and high responders, which can be used to predict long-term pain relief from SCS. Long-term treatment-outcome prediction using baseline EEG data has the potential to aid decision making about permanent surgery, allow trial periods to be forgone, and improve clinical efficiency by beginning to clarify the mechanism of action of SCS in the human brain.
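A minimal sketch of the band-power-plus-SVM portion of the pipeline described above, in Python on synthetic resting EEG. Welch spectral estimates stand in for the EEGLAB spectral analysis (the dipole-fitting and feature-ranking steps are omitted), and the sampling rate, recording length, channel count, and responder labels are illustrative assumptions:

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(1)
fs = 256                     # sampling rate (Hz), assumed
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 100)}

def band_powers(eeg):
    """Mean spectral power per band for one channel of resting EEG."""
    f, pxx = welch(eeg, fs=fs, nperseg=fs * 2)
    return [pxx[(f >= lo) & (f < hi)].mean() for lo, hi in bands.values()]

# Synthetic stand-in for nine subjects' baseline resting EEG (one channel,
# 60 s each); real data would be preprocessed and multi-channel.
X = np.array([band_powers(rng.normal(0, 1, fs * 60)) for _ in range(9)])
y = np.array([1, 0, 1, 0, 1, 1, 0, 0, 1])   # high (1) vs moderate (0) responder

# Leave-one-subject-out cross-validated SVM, appropriate for a small cohort.
acc = cross_val_score(SVC(kernel="linear"), X, y, cv=LeaveOneOut()).mean()
print(f"LOO accuracy: {acc:.2f}")
```

With a linear kernel, the fitted SVM's coefficients can be inspected per feature, which is the spirit of the feature-weight ranking the abstract describes for dipole sources.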
  3. Abstract

    Though the right hemisphere has been implicated in talker processing, it is thought to play a minimal role in phonetic processing, at least relative to the left hemisphere. Recent evidence suggests that the right posterior temporal cortex may support learning of phonetic variation associated with a specific talker. In the current study, listeners heard a male talker and a female talker, one of whom produced an ambiguous fricative in /s/-biased lexical contexts (e.g., epi?ode) and one who produced it in /ʃ/-biased contexts (e.g., friend?ip). Listeners in a behavioral experiment (Experiment 1) showed evidence of lexically guided perceptual learning, categorizing ambiguous fricatives in line with their previous experience. Listeners in an fMRI experiment (Experiment 2) showed differential phonetic categorization as a function of talker, allowing for an investigation of the neural basis of talker-specific phonetic processing, though they did not exhibit perceptual learning (likely due to characteristics of our in-scanner headphones). Searchlight analyses revealed that the patterns of activation in the right superior temporal sulcus (STS) contained information about who was talking and what phoneme they produced. We take this as evidence that talker information and phonetic information are integrated in the right STS. Functional connectivity analyses suggested that the process of conditioning phonetic identity on talker information depends on the coordinated activity of a left-lateralized phonetic processing system and a right-lateralized talker processing system. Overall, these results clarify the mechanisms through which the right hemisphere supports talker-specific phonetic processing.
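The searchlight logic used above — classify the condition of interest from local activation patterns, sliding a small neighborhood across the brain and mapping where accuracy exceeds chance — can be sketched on a toy one-dimensional "voxel" grid. This is an illustrative sketch, not the authors' pipeline; the sizes, classifier, and location of the informative patch are all invented:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_trials, n_voxels, radius = 80, 30, 2
y = rng.integers(0, 2, n_trials)          # e.g., which talker produced the sound

# Synthetic trial patterns with an informative patch around voxel 15.
X = rng.normal(0, 1, (n_trials, n_voxels))
X[:, 14:17] += y[:, None] * 1.5

# Searchlight: classify from each voxel's local neighborhood and map accuracy.
acc_map = np.zeros(n_voxels)
for v in range(n_voxels):
    lo, hi = max(0, v - radius), min(n_voxels, v + radius + 1)
    acc_map[v] = cross_val_score(SVC(kernel="linear"), X[:, lo:hi], y, cv=5).mean()

print("peak information at voxel", acc_map.argmax())
```

Real fMRI searchlights use 3-D spheres of voxels and permutation-based group statistics, but the per-neighborhood cross-validated classification is the same idea.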

     
  4. Abstract

    Rhythm perception depends on the ability to predict the onset of rhythmic events. Previous studies indicate beta band modulation is involved in predicting the onset of auditory rhythmic events (Fujioka et al., 2009, 2012; Snyder & Large, 2005). We sought to determine if similar processes are recruited for prediction of visual rhythms by investigating whether beta band activity plays a role in a modality-dependent manner for rhythm perception. We looked at electroencephalography time–frequency neural correlates of prediction using an omission paradigm with auditory and visual rhythms. By using omissions, we can separate predictive timing activity from stimulus-driven activity. We hypothesized that there would be modality-independent markers of rhythm prediction in induced beta band oscillatory activity, and our results support this hypothesis. We find induced and evoked predictive timing in both auditory and visual modalities. Additionally, we performed an exploratory independent-components-based spatial clustering analysis and describe all resulting clusters. This analysis reveals that there may be overlapping networks of predictive beta activity, based on common activation in parietal and right frontal regions; auditory-specific predictive beta in bilateral sensorimotor regions; and visually specific predictive beta in midline central and bilateral temporal/parietal regions. It also shows evoked predictive beta activity in the left sensorimotor region specific to auditory rhythms and implicates modality-dependent networks for auditory and visual rhythm perception.
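The induced/evoked distinction drawn above is commonly operationalized by subtracting the trial-averaged (phase-locked, evoked) response from each trial before computing oscillatory power; power that survives the subtraction is induced. A minimal Python sketch on synthetic beta-band data (sampling rate, trial count, and amplitudes are all assumed, and a Hilbert envelope stands in for a full time–frequency decomposition):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(2)
fs = 500
t = np.arange(0, 1.0, 1 / fs)

# Synthetic trials: a phase-locked (evoked) 20 Hz component plus a
# non-phase-locked (induced) 20 Hz component with random phase per trial.
trials = np.array([
    0.5 * np.sin(2 * np.pi * 20 * t)                          # evoked part
    + np.sin(2 * np.pi * 20 * t + rng.uniform(0, 2 * np.pi))  # induced part
    + rng.normal(0, 0.5, t.size)                              # noise
    for _ in range(60)])

def beta_power(x):
    """Beta-band (13-30 Hz) amplitude envelope via the Hilbert transform."""
    b, a = butter(4, [13, 30], btype="band", fs=fs)
    return np.abs(hilbert(filtfilt(b, a, x)))

erp = trials.mean(axis=0)                 # phase-locked activity survives averaging
evoked = beta_power(erp)                  # evoked power: average first, then power
induced = np.mean([beta_power(tr - erp) for tr in trials], axis=0)
```

Because the random-phase component cancels in the average, it appears only in the induced measure; the reverse subtraction removes it from each trial, which is why induced power here exceeds evoked power.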

     
  5. Abstract

    Modulation of vocal pitch is a key speech feature that conveys important linguistic and affective information. Auditory feedback is used to monitor and maintain pitch. We examined induced neural high gamma power (HGP) (65–150 Hz) using magnetoencephalography during pitch feedback control. Participants phonated into a microphone while hearing their auditory feedback through headphones. During each phonation, a single real-time 400 ms pitch shift was applied to the auditory feedback. Participants compensated by rapidly changing their pitch to oppose the pitch shifts. This behavioral change required coordination of the neural speech motor control network, including integration of auditory and somatosensory feedback to initiate changes in motor plans. We found increases in HGP across both hemispheres within 200 ms of pitch shifts, covering left sensory and right premotor, parietal, temporal, and frontal regions involved in sensory detection and processing of the pitch shift. Later responses to pitch shifts (200–300 ms) were right dominant, in parietal, frontal, and temporal regions. The timing of activity in these regions indicates their role in coordinating motor change and in detecting and processing the sensory consequences of this change. Subtracting out cortical responses during passive listening to recordings of the phonations isolated HGP increases specific to speech production, highlighting right parietal and premotor cortex and left posterior temporal cortex involvement in the motor response. Correlation of HGP with behavioral compensation demonstrated right frontal region involvement in modulating participants' compensatory responses. This study highlights the bihemispheric sensorimotor cortical network involved in auditory feedback-based control of vocal pitch. Hum Brain Mapp 37:1474–1485, 2016. © 2016 Wiley Periodicals, Inc.
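The passive-listening subtraction described above can be sketched as baseline-normalized high gamma power computed per condition and then differenced. An illustrative Python example on synthetic single-sensor data; the sampling rate, epoch, and injected response are assumptions, not values from the study:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(3)
fs = 1000
t = np.arange(-0.2, 0.5, 1 / fs)   # epoch around the pitch shift at t = 0

def hgp(x):
    """High gamma power (65-150 Hz): band-pass, Hilbert envelope, squared."""
    b, a = butter(4, [65, 150], btype="band", fs=fs)
    return np.abs(hilbert(filtfilt(b, a, x))) ** 2

# Synthetic single-sensor traces: extra broadband activity after the shift
# only in the speaking condition.
speak = rng.normal(0, 1, t.size)
post = (t > 0) & (t < 0.2)
speak[post] += rng.normal(0, 2, post.sum())
listen = rng.normal(0, 1, t.size)

# Normalize each condition to its pre-shift baseline, then subtract the
# passive-listening response to isolate production-specific HGP increases.
base = t < 0
speak_hgp = hgp(speak) / hgp(speak)[base].mean()
listen_hgp = hgp(listen) / hgp(listen)[base].mean()
production_specific = speak_hgp - listen_hgp
```

Real MEG analyses do this per source or sensor across many trials; the single synthetic trace just makes the normalize-then-subtract logic concrete.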

     