Title: Synchronized affect in shared experiences strengthens social connection

Abstract: People structure their days to experience events with others. We gather to eat meals, watch TV, and attend concerts together. What constitutes a shared experience, and how does it manifest in dyadic behavior? The present study investigates how shared experiences, measured through emotional, motoric, physiological, and cognitive alignment, promote social bonding. We recorded the facial expressions and electrodermal activity (EDA) of participants as they watched four episodes of a TV show, for a total of 4 hours, with another participant. Participants displayed temporally synchronized and spatially aligned emotional facial expressions, and the degree of synchronization predicted the self-reported social connection between viewing partners. We observed a similar pattern of results for dyadic physiological synchrony measured via EDA and for participants' cognitive impressions of the characters. All four of these factors (temporal synchrony of positive facial expressions, spatial alignment of expressions, EDA synchrony, and character impression similarity) contributed to a latent factor of a shared experience that predicted social connection. Our findings suggest that interpersonal affiliation in shared experiences emerges from shared affective experiences comprising synchronous processes, and demonstrate that these complex interpersonal processes can be studied in a holistic, multi-modal framework leveraging naturalistic experimental designs.
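Dyadic temporal synchrony of the kind described above (between two viewers' facial-expression or EDA time series) is commonly quantified as an average windowed correlation. A minimal sketch, assuming evenly sampled, time-aligned signals and illustrative window and step sizes (not the paper's actual parameters):

```python
import numpy as np

def windowed_synchrony(x, y, window=50, step=25):
    """Mean windowed Pearson correlation between two time series.

    One common way to quantify dyadic temporal synchrony (e.g. of
    smile intensity or EDA traces): correlate the two signals in
    sliding windows and average the window-wise correlations.
    """
    rs = []
    for start in range(0, len(x) - window + 1, step):
        xs = x[start:start + window]
        ys = y[start:start + window]
        if xs.std() > 0 and ys.std() > 0:  # skip flat windows
            rs.append(np.corrcoef(xs, ys)[0, 1])
    return float(np.mean(rs)) if rs else float("nan")

# Two noisy recordings of the same underlying response: synchrony
# is well above zero; an unrelated noise trace scores near zero.
t = np.linspace(0, 10, 500)
rng = np.random.default_rng(0)
a = np.sin(t) + 0.1 * rng.standard_normal(500)
b = np.sin(t) + 0.1 * rng.standard_normal(500)
print(windowed_synchrony(a, b))
```

Windowing matters here because synchrony can wax and wane over a 4-hour session; a single whole-recording correlation would blur those dynamics together.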
Award ID(s): 1848370
PAR ID: 10471317
Publisher / Repository: Nature Publishing Group
Date Published:
Journal Name: Communications Biology
Volume: 6
Issue: 1
ISSN: 2399-3642
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract: Neural, physiological, and behavioral signals synchronize between human subjects in a variety of settings. Multiple hypotheses have been proposed to explain this interpersonal synchrony, but it remains unclear under which conditions it arises, for which signals it holds, or whether there is a common underlying mechanism. We hypothesized that cognitive processing of a shared stimulus is the source of synchrony between subjects, measured here as intersubject correlation (ISC). To test this, we presented informative videos to participants in attentive and distracted conditions and subsequently measured information recall. ISC was observed for electroencephalography, gaze position, pupil size, and heart rate, but not respiration or head movements. The strength of correlation was co-modulated across the different signals, changed with attentional state, and predicted subsequent recall of information presented in the videos. There was robust within-subject coupling between brain, heart, and eyes, but not respiration or head movements. The results suggest that ISC is the result of effective cognitive processing and thus emerges only for those signals that exhibit a robust brain-body connection. While physiological and behavioral fluctuations may be driven by multiple features of the stimulus, correlation with other individuals is co-modulated by the level of attentional engagement with the stimulus.
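The intersubject correlation (ISC) measure named in this abstract is often computed in a leave-one-out form: each subject's signal is correlated with the average of everyone else's. A sketch of that standard formulation, assuming aligned, evenly sampled recordings (not necessarily this study's exact pipeline):

```python
import numpy as np

def intersubject_correlation(signals):
    """Leave-one-out intersubject correlation (ISC).

    `signals` is a (n_subjects, n_timepoints) array of one measure
    (e.g. pupil size) recorded while all subjects view the same
    stimulus. Each subject's trace is correlated with the mean of
    the remaining subjects' traces; the ISC is the mean of these
    leave-one-out correlations.
    """
    signals = np.asarray(signals, dtype=float)
    n = signals.shape[0]
    rs = []
    for i in range(n):
        others = np.delete(signals, i, axis=0).mean(axis=0)
        rs.append(np.corrcoef(signals[i], others)[0, 1])
    return float(np.mean(rs))
```

A shared stimulus-driven component pushes ISC toward 1, while independent noise across subjects yields an ISC near 0, which is what makes the measure a useful index of common processing.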
  2. Academic Abstract Interpersonal synchrony, the alignment of behavior and/or physiology during interactions, is a pervasive phenomenon observed in diverse social contexts. Here we synthesize across contexts and behaviors to classify the different forms and functions of synchrony. We provide a concise framework for classifying the manifold forms of synchrony along six dimensions: periodicity, discreteness, spatial similarity, directionality, leader–follower dynamics, and observability. We also distill the various proposed functions of interpersonal synchrony into four interconnected functions: reducing complexity and improving understanding, accomplishing joint tasks, strengthening social connection, and influencing partners’ behavior. These functions derive from first principles, emerge from each other, and are accomplished by some forms of synchrony more than others. Effective synchrony flexibly adapts to social goals and more synchrony is not always better. Our synthesis offers a shared framework and language for the field, allowing for better cross-context and cross-behavior comparisons, generating new hypotheses, and highlighting future research directions. 
  3. Programming can be an emotional experience, particularly for undergraduate students who are new to computer science. While researchers have interviewed novice programmers about their emotional experiences, it can be difficult to pinpoint the specific emotions that occur during a programming session. In this paper, we argue that electrodermal activity (EDA) sensors, which measure the physiological changes that are indicative of an emotional reaction, can provide a valuable new data source to help study student experiences. We conducted a study with 14 undergraduate students in which we collected EDA data while they worked on a programming problem. This data was then used to cue the participants’ recollections of their emotions during a retrospective interview about the programming experience. Using this methodology, we identified 21 distinct events that triggered student emotions, such as feeling anxiety due to a lack of perceived progress on the problem. We also identified common patterns in EDA data across multiple participants, such as a drop in their physiological reaction after developing a plan, corresponding with a calmer emotional state. These findings provide new information about how students experience programming that can inform research and practice, and also contribute initial evidence of the value of EDA data in supporting studies of emotions while programming. 
  4. Abstract: This study focuses on the individual and joint contributions of two nonverbal channels (i.e., face and upper body) in avatar-mediated virtual environments. A total of 140 dyads were randomly assigned to communicate with each other via platforms that differentially activated or deactivated facial and bodily nonverbal cues. The availability of facial expressions had a positive effect on interpersonal outcomes. More specifically, dyads that were able to see their partner's facial movements mapped onto their avatars liked each other more, formed more accurate impressions of their partners, and described their interaction experiences more positively than those unable to see facial movements. However, the latter was only true when their partner's bodily gestures were also available, not when facial movements alone were available. Dyads showed greater nonverbal synchrony when they could see their partner's bodily and facial movements. This study also employed machine learning to explore whether nonverbal cues could predict interpersonal attraction. These classifiers predicted high and low interpersonal attraction at an accuracy rate of 65%. These findings highlight the relative significance of facial cues compared to bodily cues for interpersonal outcomes in virtual environments and lend insight into the potential of automatically tracked nonverbal cues to predict interpersonal attitudes.
  5. Emotion regulation can be characterized by different activities that attempt to alter an emotional response, whether behavioral, physiological, or neurological. This study explores the two most widely adopted strategies, cognitive reappraisal and expressive suppression, in the context of disgust. Study participants (N = 21) experienced disgust via video exposure and were instructed to either regulate their emotions or express them freely. If regulating, they were required to either cognitively reappraise or suppress their emotional experiences while viewing the videos. Video recordings of the participants' faces were taken during the experiment, and electrocardiogram (ECG), electromyography (EMG), and galvanic skin response (GSR) readings were also collected for further analysis. We compared the participants' behavioral (facial musculature movements) and physiological (GSR and heart rate) responses as they aimed to alter their emotional reactions. Computational comparison showed that, in response to disgust stimuli, the signals recorded during suppression and free expression were very similar, whereas those recorded during cognitive reappraisal were significantly different. Thus, in the context of this study and from a signal analysis perspective, we conclude that emotion regulation via cognitive reappraisal significantly alters participants' physiological responses to disgust, unlike regulation via suppression.
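The condition-level signal comparison described above (suppression vs. free expression vs. reappraisal) can be sketched as a correlation between condition-averaged, z-scored traces. This is an illustrative stand-in under the assumption of equal-length trials, not the authors' actual analysis:

```python
import numpy as np

def condition_similarity(cond_a, cond_b):
    """Similarity of two conditions' physiological responses.

    Each argument is a (n_trials, n_timepoints) array, e.g. GSR
    traces recorded under one regulation instruction. Each trial is
    z-scored to remove baseline and amplitude differences, trials
    are averaged within condition, and the two condition means are
    correlated: values near 1 mean the conditions evoked similar
    response profiles.
    """
    def mean_trace(trials):
        trials = np.asarray(trials, dtype=float)
        z = (trials - trials.mean(axis=1, keepdims=True)) \
            / trials.std(axis=1, keepdims=True)
        return z.mean(axis=0)

    return float(np.corrcoef(mean_trace(cond_a), mean_trace(cond_b))[0, 1])
```

Under the pattern the abstract reports, suppression and free-expression traces would score high on this kind of similarity measure, while reappraisal traces would score markedly lower against either.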