

Title: Facial expressions contribute more than body movements to conversational outcomes in avatar-mediated virtual environments
Abstract

This study examines the individual and joint contributions of two nonverbal channels (i.e., the face and the upper body) to interpersonal outcomes in avatar-mediated virtual environments. A total of 140 dyads were randomly assigned to communicate with each other via platforms that differentially activated or deactivated facial and bodily nonverbal cues. The availability of facial expressions had a positive effect on interpersonal outcomes: dyads that could see their partner’s facial movements mapped onto the avatars liked each other more, formed more accurate impressions of their partners, and described their interaction experiences more positively than dyads that could not. The effect on interaction experience, however, emerged only when the partner’s bodily gestures were also visible, not when facial movements were available alone. Dyads also showed greater nonverbal synchrony when they could see both their partner’s bodily and facial movements. Finally, the study used machine learning to explore whether nonverbal cues could predict interpersonal attraction; the resulting classifiers distinguished high from low interpersonal attraction with 65% accuracy. These findings highlight the greater weight of facial cues relative to bodily cues for interpersonal outcomes in virtual environments and lend insight into the potential of automatically tracked nonverbal cues to predict interpersonal attitudes.
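The abstract does not detail the feature set or classifier behind the 65% accuracy figure, but the general pipeline it describes, training a classifier on automatically tracked nonverbal features to distinguish high from low interpersonal attraction, can be illustrated with a minimal sketch. The feature names, classifier choice, and placeholder data below are assumptions for illustration, not the authors’ actual method.

```python
# Minimal sketch of the kind of classification pipeline described in the abstract:
# predicting high vs. low interpersonal attraction from automatically tracked
# nonverbal features. Feature names, classifier, and data are placeholders.
import numpy as np
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical per-dyad features, e.g. mean smile intensity, head-movement energy,
# gesture velocity, and a face/body synchrony score (random data stands in here).
X = rng.normal(size=(140, 4))
# Hypothetical binary labels: 1 = high attraction, 0 = low attraction (median split).
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=1.0, size=140) > 0).astype(int)

# Standardize features, fit an SVM, and evaluate with stratified cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(
    clf, X, y, cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```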

 
Award ID(s):
1800922
NSF-PAR ID:
10203191
Author(s) / Creator(s):
; ; ;
Publisher / Repository:
Nature Publishing Group
Date Published:
Journal Name:
Scientific Reports
Volume:
10
Issue:
1
ISSN:
2045-2322
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Positive interpersonal relationships require shared understanding along with a sense of rapport. A key facet of rapport is mirroring and convergence of facial expression and body language, known as nonverbal synchrony. We examined nonverbal synchrony in a study of 29 heterosexual romantic couples, in which audio, video, and bracelet accelerometer data were recorded during three conversations. We extracted facial expression, body movement, and acoustic-prosodic features to train neural network models that predicted the nonverbal behaviors of one partner from those of the other. Recurrent models (LSTMs) outperformed feed-forward neural networks and chance baselines. The models learned behaviors encompassing facial responses, speech-related facial movements, and head movement, but they did not capture fleeting or periodic behaviors such as nodding, head turning, and hand gestures. Notably, a preliminary analysis showed that clinical measures were more strongly associated with our model outputs than with correlations of the raw signals. We discuss potential uses of these generative models as a research tool to complement current analytical methods, along with real-world applications (e.g., as a tool in therapy). A minimal modeling sketch of this partner-prediction approach appears after this list.
  2. We show that the human voice has complex acoustic qualities that are directly coupled to peripheral musculoskeletal tensioning of the body, such as subtle wrist movements. In this study, human vocalizers produced a steady-state vocalization while rhythmically moving the wrist or the arm at different tempos. Although listeners could only hear and not see the vocalizer, they were able to completely synchronize their own rhythmic wrist or arm movement with the movement of the vocalizer, which they perceived in the voice acoustics. This study corroborates recent evidence suggesting that the human voice is constrained by bodily tensioning affecting the respiratory–vocal system. The current results show that the human voice contains a bodily imprint that is directly informative for the interpersonal perception of another’s dynamic physical states.
  3. Extended reality (XR) technologies, such as virtual reality (VR) and augmented reality (AR), provide users, their avatars, and embodied agents a shared platform to collaborate in a spatial context. Traditional face-to-face communication is limited by users’ proximity: another human’s non-verbal embodied cues become harder to perceive the farther away that person is. Researchers and practitioners have therefore started to look into ways to accentuate or amplify such embodied cues and signals with XR technologies to counteract the effects of distance. In this article, we describe and evaluate the Big Head technique, in which a human’s head in VR/AR is scaled up relative to their distance from the observer as a mechanism for enhancing the visibility of non-verbal facial cues, such as facial expressions or eye gaze. To better understand and explore this technique, we present two complementary human-subject experiments. In the first, a VR study with a head-mounted display, we examined the impact of increased or decreased head scales on participants’ ability to perceive facial expressions, as well as their sense of comfort and feeling of “uncanniness,” over distances of up to 10 m; we explored two different scaling methods and compared perceptual thresholds and user preferences. The second experiment was performed in an outdoor AR environment with an optical see-through head-mounted display; participants were asked to estimate facial expressions and eye gaze, and to identify a virtual human, over large distances of 30, 60, and 90 m. In both experiments, our results show significant differences in minimum, maximum, and ideal head scales for different distances and tasks related to perceiving faces, facial expressions, and eye gaze, and we also found that participants were more comfortable with slightly bigger heads at larger distances. We discuss our findings with respect to the technologies used, along with implications and guidelines for practical applications that aim to leverage XR-enhanced facial cues. A hypothetical sketch of distance-dependent head scaling appears after this list.
  4. Telehealth technologies play a vital role in delivering quality healthcare to patients regardless of geographic location and health status. Telehealth peripherals give providers a more accurate method of collecting health assessment data from the patient and delivering a more confident and accurate diagnosis, not only saving time and money but also creating positive patient outcomes. Advanced Practice Nursing (APN) students should be confident in their ability to diagnose and treat patients through a virtual environment. This pilot simulation, funded by the National Science Foundation’s Future of Work at the Human-Technology Frontier (FW-HTF) program, examined how APN students interacted in a simulation-based education (SBE) experience with and without peripherals. The SBE experience was created and deployed using the INACSL Healthcare Simulation Standards of Best Practices™ and vetted by a simulation expert. APN students (N = 24), in their first assessment course, were randomly selected to be either a patient (n = 12) or a provider (n = 12) in a telehealth simulation. Student dyads (patient/provider) were randomly assigned to complete a scenario with (n = 6 dyads) or without (n = 6 dyads) the use of a peripheral. Students (providers and patients) who completed the SBE experience reported increased confidence both with and without the use of peripherals. Students evaluated the simulation via the Simulation Effectiveness Tool-Modified (SET-M), scoring their perceptions of the simulation on a 1-to-5-point Likert scale. The highest-scoring areas were perceived support of learning by the faculty (M = 4.6), feeling challenged in decision-making skills (M = 4.4), and a better understanding of didactic material (M = 4.3); the lowest-scoring area was feeling more confident in decision making (M = 3.9). We also recorded students’ facial expressions during the task to determine a probability score (0-100) for expressed basic emotions: students had the highest scores for joy (M = 8.47) and surprise (M = 4.34), followed by disgust (M = 1.43), fear (M = .76), and contempt (M = .64), and the lowest scores for anger (M = .44) and sadness (M = .36). Students were also asked to complete a reflection assignment as part of the SBE experience; they reported feeling nervous at the beginning of the SBE experience but acknowledged feeling better as it unfolded. Findings from this pilot study point toward the effectiveness of simulations for increasing nurse practitioner students’ confidence in performing telehealth visits and engaging in decision making. For the students, the main takeaways were understanding that patients may be just as nervous during telehealth visits, remembering to reassure the patient, and knowing how to ask the patient to operate the telehealth equipment. Providing students opportunities to practice these skills will therefore help increase their confidence, strengthen their self-regulation and emotion regulation, and improve their decision-making skills in telehealth scenarios.
  5. “Sickness behavior” is an orchestrated suite of symptoms that commonly occurs in the context of inflammation and is characterized by changes in affect, social experience, and behavior. However, recent evidence suggests that inflammation may not always produce the same set of sickness behaviors (e.g., fatigue, anhedonia, and social withdrawal). Rather, inflammation may be linked with different behaviors across contexts and/or across individuals, though research in this area is underdeveloped to date. In the present study (n = 30), we evaluated the influence of affective context and individual differences in difficulty detecting bodily sensations (i.e., interoceptive difficulty) on social perception following an inflammatory challenge. Inflammation was induced using the influenza vaccine, and inflammatory reactivity was operationalized as the change in circulating levels of interleukin-6 (IL-6) from before the vaccine to approximately 24 h later. Twenty-four hours after administration of the vaccine, we manipulated affective context using a well-validated affect misattribution task in which participants made trustworthiness judgments of individuals with neutral facial expressions following the rapid presentation of “prime” images that were positive or negative in affective content. Interoceptive difficulty was measured at baseline using a validated self-report measure. Results revealed significant interactions between inflammatory reactivity to the influenza vaccine and affective context on social perception: individuals with greater inflammatory reactivity were more biased by affective context when judging the trustworthiness of neutral faces. In addition, interoceptive difficulty and affective context interacted to predict social perception, such that individuals with greater interoceptive difficulty were more biased by affective context in these judgments. In sum, we provide some of the first evidence that inflammation may amplify the salience of affective cues during social decision making. Our findings also replicate prior work linking interoceptive ability to the use of affect-as-information during social perception, but in the novel context of inflammation.
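As referenced in item 1 above, the couples study trained recurrent models (LSTMs) to predict one partner’s nonverbal behaviors from the other’s features. The abstract does not report architecture details, so the following is a minimal PyTorch sketch under assumed feature dimensions and hyperparameters; random tensors stand in for the real facial, body, and prosodic feature streams.

```python
# Minimal sketch of a partner-prediction LSTM in the spirit of item 1 above.
# Feature dimensions, sequence length, and hyperparameters are assumptions;
# random tensors stand in for the recorded facial/body/prosodic features.
import torch
import torch.nn as nn

class PartnerLSTM(nn.Module):
    """Predicts partner B's nonverbal features at each time step from partner A's."""
    def __init__(self, in_dim=20, hidden_dim=64, out_dim=20):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, out_dim)

    def forward(self, x):              # x: (batch, time, in_dim)
        h, _ = self.lstm(x)            # h: (batch, time, hidden_dim)
        return self.head(h)            # (batch, time, out_dim)

# Placeholder data: 8 conversation segments, 200 time steps, 20 features per partner.
partner_a = torch.randn(8, 200, 20)
partner_b = torch.randn(8, 200, 20)

model = PartnerLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):                 # brief training loop for illustration
    optimizer.zero_grad()
    pred_b = model(partner_a)          # predict B's behavior from A's
    loss = loss_fn(pred_b, partner_b)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```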
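Item 3’s Big Head technique scales a virtual human’s head relative to viewing distance so that facial cues remain legible. The article compares two scaling methods and reports distance-dependent minimum, maximum, and ideal scales, but no formula is given in the abstract; the function below is a hypothetical illustration of one plausible variant (linear growth with distance, clamped between bounds), with all numeric values assumed for the example.

```python
# Hypothetical illustration of distance-dependent head scaling in the spirit of the
# Big Head technique from item 3. The growth rate and clamping bounds are assumed
# placeholders, not values reported in the article.
def big_head_scale(distance_m: float,
                   base_distance_m: float = 2.0,
                   growth_per_meter: float = 0.15,
                   min_scale: float = 1.0,
                   max_scale: float = 4.0) -> float:
    """Return a head scale factor that grows with distance beyond a base distance."""
    extra = max(0.0, distance_m - base_distance_m)
    scale = 1.0 + growth_per_meter * extra
    return max(min_scale, min(max_scale, scale))

# Example: scales at conversational, room-scale, and outdoor AR distances.
for d in (1.0, 10.0, 30.0, 60.0, 90.0):
    print(f"{d:>5.1f} m -> head scale {big_head_scale(d):.2f}x")
```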