Neural, physiological, and behavioral signals synchronize between human subjects in a variety of settings. Multiple hypotheses have been proposed to explain this interpersonal synchrony, but it remains unclear under which conditions it arises, for which signals, or whether there is a common underlying mechanism. We hypothesized that cognitive processing of a shared stimulus is the source of synchrony between subjects, measured here as intersubject correlation (ISC). To test this, we presented informative videos to participants in attentive and distracted conditions and subsequently measured recall of the information. ISC was observed for electroencephalography, gaze position, pupil size, and heart rate, but not for respiration or head movements. The strength of correlation was co-modulated across the different signals, changed with attentional state, and predicted subsequent recall of information presented in the videos. There was robust within-subject coupling between brain, heart, and eyes, but not respiration or head movements. These results suggest that ISC is the result of effective cognitive processing and thus emerges only for those signals that exhibit a robust brain–body connection. While physiological and behavioral fluctuations may be driven by multiple features of the stimulus, correlation with other individuals is co-modulated by the level of attentional engagement with the stimulus.
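The abstract does not spell out how ISC is computed. A common leave-one-out formulation correlates each subject's time series with the average of all other subjects' series. The sketch below illustrates that formulation for any one-dimensional signal such as heart rate or pupil size; the function name and toy data are our own, and note that EEG ISC in this literature is usually computed with correlated component analysis rather than this simple form.

```python
import numpy as np

def intersubject_correlation(signals: np.ndarray) -> np.ndarray:
    """Leave-one-out ISC: correlate each subject's time series with
    the average of all other subjects' series.

    signals : (n_subjects, n_timepoints) array, one signal per subject
              (e.g., heart rate or pupil size) on a common time base.
    Returns one Pearson correlation per subject.
    """
    isc = np.empty(signals.shape[0])
    for i in range(signals.shape[0]):
        others = np.delete(signals, i, axis=0).mean(axis=0)
        isc[i] = np.corrcoef(signals[i], others)[0, 1]
    return isc

# Toy check: a shared stimulus-driven component plus subject-specific noise
rng = np.random.default_rng(0)
shared = rng.standard_normal(300)
data = 0.5 * shared + rng.standard_normal((10, 300))
print(intersubject_correlation(data).mean())  # clearly above zero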
- Award ID(s): 1660548
- Publication Date:
- NSF-PAR ID: 10380501
- Journal Name: PNAS Nexus
- Volume: 1
- Issue: 1
- ISSN: 2752-6542
- Publisher: Oxford University Press
- Sponsoring Org: National Science Foundation
More Like this
Heart rate has natural fluctuations that are typically ascribed to autonomic function. Recent evidence suggests that conscious processing can affect the timing of the heartbeat. We hypothesized that heart rate is modulated by conscious processing and is therefore dependent on attentional focus. To test this, we leveraged the observation that neural processes can be synchronized between subjects by presenting an identical narrative stimulus. As predicted, we find significant inter-subject correlation of the heartbeat (ISC-HR) when subjects are presented with an auditory or audiovisual narrative. Consistent with the conscious processing hypothesis, we find that ISC-HR is reduced when subjects are distracted from the narrative, and that higher heart rate synchronization predicts better recall of the narrative. Finally, patients with disorders of consciousness who listen to a story have lower ISC-HR than healthy individuals, and individual ISC-HR may predict a patient's prognosis. We conclude that heart rate fluctuations are partially driven by conscious processing, depend on attentional state, and may represent a simple metric to assess conscious state in unresponsive patients.
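For heart-rate ISC specifically, a continuous signal must first be derived from discrete heartbeats. A minimal sketch, assuming R-peak times have already been extracted from the ECG by some beat detector; the midpoint placement, linear interpolation, and 4 Hz output rate are illustrative choices, not the paper's pipeline.

```python
import numpy as np

def heart_rate_series(r_peaks_s, duration_s, fs_out=4.0):
    """Uniformly sampled instantaneous heart rate (bpm) from R-peak times.

    r_peaks_s : R-peak timestamps in seconds.
    Each inter-beat interval yields one rate value, placed at the
    interval midpoint and interpolated onto a common grid so that
    series from different subjects can be correlated directly.
    """
    r = np.asarray(r_peaks_s, dtype=float)
    rr = np.diff(r)                   # inter-beat intervals (s)
    bpm = 60.0 / rr                   # instantaneous rate
    t_mid = r[:-1] + rr / 2.0         # midpoint of each interval
    grid = np.arange(0.0, duration_s, 1.0 / fs_out)
    return np.interp(grid, t_mid, bpm)
```

The resulting uniformly sampled series from each subject can then be stacked into a matrix and passed to a leave-one-out ISC routine like the one sketched above.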
Vital signs (e.g., heart and respiratory rate) are indicative of health status. Efforts have been made to extract vital signs using radio frequency (RF) techniques (e.g., Wi-Fi, FMCW, UWB), which offer a non-contact solution for continuous and ubiquitous monitoring without users' cooperative effort. While RF-based vital signs monitoring is user-friendly, its robustness faces two challenges. On the one hand, the RF signal is modulated by the periodic chest wall displacement due to heartbeat and breathing in a nonlinear manner. It is inherently hard to identify the fundamental heart and respiratory rates (HR and RR) in the presence of their higher-order harmonics and the intermodulation between HR and RR, especially when their frequency bands overlap. On the other hand, inadvertent body movements may disturb and distort the RF signal, overwhelming the vital signs and thus hindering estimation of the physiological movements (i.e., heartbeat and breathing). In this paper, we propose DeepVS, a deep learning approach that addresses the aforementioned challenges of non-linearity and inadvertent movements for robust RF-based vital signs sensing in a unified manner. DeepVS combines 1D CNN and attention models to exploit local features and temporal correlations. Moreover, it leverages a two-stream scheme …
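The abstract names the ingredients, convolutions for local waveform features and attention for temporal correlations, but not the architecture. A minimal PyTorch sketch of that combination follows; the layer sizes, mean pooling, and two-rate output head are assumptions for illustration, not DeepVS's actual design.

```python
import torch
import torch.nn as nn

class VitalSignsNet(nn.Module):
    """Illustrative 1D-CNN + self-attention regressor: convolutions
    capture local waveform shape, attention captures longer-range
    temporal structure, and the head outputs (HR, RR) estimates."""
    def __init__(self, in_channels=1, d_model=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, d_model, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=7, padding=3),
            nn.ReLU(),
        )
        self.attn = nn.MultiheadAttention(d_model, num_heads=4,
                                          batch_first=True)
        self.head = nn.Linear(d_model, 2)  # heart rate, respiratory rate

    def forward(self, x):
        # x: (batch, channels, time) RF-derived displacement waveform
        h = self.conv(x).transpose(1, 2)   # -> (batch, time, d_model)
        h, _ = self.attn(h, h, h)          # temporal self-attention
        return self.head(h.mean(dim=1))    # pool over time -> (batch, 2)

print(VitalSignsNet()(torch.randn(8, 1, 512)).shape)  # torch.Size([8, 2])
```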
Emotion regulation can be characterized by different activities that attempt to alter an emotional response, whether behavioral, physiological, or neurological. The two most widely adopted strategies, cognitive reappraisal and expressive suppression, are explored in this study, specifically in the context of disgust. Study participants (N = 21) experienced disgust via video exposure and were instructed to either regulate their emotions or express them freely. If regulating, they were required to either cognitively reappraise or suppress their emotional experiences while viewing the videos. Video recordings of the participants' faces were taken during the experiment, and electrocardiogram (ECG), electromyography (EMG), and galvanic skin response (GSR) readings were also collected for further analysis. We compared the participants' behavioral (facial musculature movements) and physiological (GSR and heart rate) responses as they aimed to alter their emotional responses, and determined computationally that, when responding to disgust stimuli, the signals recorded during suppression and free expression were very similar, whereas those recorded during cognitive reappraisal were significantly different. Thus, in the context of this study and from a signal analysis perspective, we conclude that emotion regulation via cognitive reappraisal significantly alters participants' physiological responses to disgust, unlike regulation via suppression.
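The abstract does not state which similarity measure was used. One simple way to formalize "the signals recorded during suppression and free expression were very similar" is to correlate per-participant traces across conditions and compare the resulting similarity scores. A hypothetical sketch with synthetic data standing in for the z-scored GSR or heart-rate traces:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, t = 21, 200                                    # participants, samples
free = rng.standard_normal((n, t))                # free-expression traces
suppress = free + 0.3 * rng.standard_normal((n, t))   # similar to free
reappraise = rng.standard_normal((n, t))              # different from free

# Per-participant similarity of each regulation condition to free expression
r_sup = [np.corrcoef(f, s)[0, 1] for f, s in zip(free, suppress)]
r_rea = [np.corrcoef(f, r)[0, 1] for f, r in zip(free, reappraise)]

# Paired test across participants: is suppression reliably more similar
# to free expression than cognitive reappraisal is?
print(stats.ttest_rel(r_sup, r_rea))
```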
As we comprehend narratives, our attentional engagement fluctuates over time. Despite theoretical conceptions of narrative engagement as emotion-laden attention, little empirical work has characterized the cognitive and neural processes that comprise subjective engagement in naturalistic contexts or its consequences for memory. Here, we relate fluctuations in narrative engagement to patterns of brain coactivation and test whether neural signatures of engagement predict subsequent memory. In behavioral studies, participants continuously rated how engaged they were as they watched a television episode or listened to a story. Self-reported engagement was synchronized across individuals and driven by the emotional content of the narratives. In functional MRI datasets collected as different individuals watched the same show or listened to the same story, engagement drove neural synchrony, such that default mode network activity was more synchronized across individuals during more engaging moments of the narratives. Furthermore, models based on time-varying functional brain connectivity predicted evolving states of engagement across participants and independent datasets. The functional connections that predicted engagement overlapped with a validated neuromarker of sustained attention and predicted recall of narrative events. Together, our findings characterize the neural signatures of attentional engagement in naturalistic contexts and elucidate relationships among narrative engagement, sustained attention, and event memory.
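"Time-varying functional brain connectivity" is commonly operationalized as sliding-window correlations between regional time series. A minimal sketch of that feature computation, with the window and step sizes chosen arbitrarily rather than taken from the study:

```python
import numpy as np

def sliding_window_connectivity(ts: np.ndarray, win: int, step: int) -> np.ndarray:
    """Time-varying functional connectivity via sliding-window correlation.

    ts   : (n_timepoints, n_regions) regional fMRI time series.
    win  : window length in TRs; step : hop between window starts.
    Returns (n_windows, n_regions, n_regions) correlation matrices,
    the kind of features an engagement model could be trained on.
    """
    n_t, _ = ts.shape
    starts = range(0, n_t - win + 1, step)
    return np.stack([np.corrcoef(ts[s:s + win].T) for s in starts])

# Example: 300 TRs, 20 regions, 60-TR windows hopping by 10 TRs
fc = sliding_window_connectivity(np.random.randn(300, 20), win=60, step=10)
print(fc.shape)  # (25, 20, 20)
```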
Objective. Reorienting is central to how humans direct attention to different stimuli in their environment. Previous studies typically employ well-controlled paradigms with limited eye and head movements to study the neural and physiological processes underlying attention reorienting. Here, we aim to better understand the relationship between gaze and attention reorienting using a naturalistic virtual reality (VR)-based target detection paradigm.

Approach. Subjects were navigated through a city and instructed to count the number of targets that appeared on the street. Subjects performed the task in a fixed condition with no head movement and in a free condition where head movements were allowed. Electroencephalography (EEG), gaze, and pupil data were collected. To investigate how neural and physiological reorienting signals are distributed across different gaze events, we used hierarchical discriminant component analysis (HDCA) to identify EEG- and pupil-based discriminating components. Mixed-effects general linear models (GLM) were used to determine the correlation between these discriminating components and the timing of the different gaze events. HDCA was also used to combine EEG, pupil, and dwell time signals to classify reorienting events.

Main results. In both EEG and pupil, dwell time contributes most significantly to the reorienting signals. However, when dwell times were orthogonalized against other gaze events, the distributions of the …
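HDCA is described in the literature as a two-level scheme: linear discriminators learn spatial weights within short time windows, and a second-level classifier combines the per-window scores. A simplified sketch of that structure using scikit-learn; the window count, channel-mean features, and classifier choices are our own simplifications.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def hdca_fit(X, y, n_windows=10):
    """Two-level HDCA-style classifier (simplified sketch).

    X : (n_trials, n_channels, n_samples) EEG epochs;
    y : binary labels (e.g., target vs. non-target gaze events).
    Level 1: one linear discriminator per time window, learning
    spatial weights across channels. Level 2: logistic regression
    over the per-window discriminant scores.
    """
    n_trials, n_ch, n_samp = X.shape
    edges = np.linspace(0, n_samp, n_windows + 1, dtype=int)
    # Mean activity per channel within each window:
    # feats has shape (n_trials, n_windows, n_channels)
    feats = np.stack([X[:, :, a:b].mean(axis=2)
                      for a, b in zip(edges[:-1], edges[1:])], axis=1)
    level1 = [LogisticRegression().fit(feats[:, w], y)
              for w in range(n_windows)]
    scores = np.column_stack([m.decision_function(feats[:, w])
                              for w, m in enumerate(level1)])
    level2 = LogisticRegression().fit(scores, y)
    return level1, level2

# Toy usage: 100 epochs, 32 channels, 250 samples
rng = np.random.default_rng(2)
l1, l2 = hdca_fit(rng.standard_normal((100, 32, 250)),
                  rng.integers(0, 2, 100))
```

In a real analysis the second level would be trained on cross-validated first-level scores to avoid the overfitting this compact sketch incurs.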