

Title: Friendship Maintenance Mediates the Relationship Between Compassion for Others and Happiness
Displaying compassion for others (CFO) and utilizing friendship maintenance (FM) behaviors are positively associated with happiness. Two studies investigated FM as a mediator of the relationship between CFO and happiness (Study 1: N = 273; Study 2: N = 368). FM mediated the CFO-happiness relationship in both studies regardless of the way happiness was measured. Although women had higher scores on both CFO and FM, the model was supported for both genders. The implications of the findings are discussed and suggestions for future research are provided.
Award ID(s):
1659888
NSF-PAR ID:
10055620
Author(s) / Creator(s):
Date Published:
Journal Name:
Current Psychology
ISSN:
1046-1310
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    The present study compares how individuals perceive gradient acoustic realizations of emotion produced by a human voice versus an Amazon Alexa text-to-speech (TTS) voice. We manipulated semantically neutral sentences spoken by both talkers with identical emotional synthesis methods, using three levels of increasing 'happiness' (0%, 33%, 66% 'happier'). On each trial, listeners (native speakers of American English, n = 99) rated a given sentence on two scales to assess dimensions of emotion: valence (negative-positive) and arousal (calm-excited). Participants also rated the Alexa voice on several parameters to assess anthropomorphism (e.g., naturalness, human-likeness, etc.). Results showed that the emotion manipulations led to increases in perceived positive valence and excitement. Yet, the effect differed by interlocutor: increasing 'happiness' manipulations led to larger changes for the human voice than the Alexa voice. Additionally, we observed individual differences in perceived valence/arousal based on participants' anthropomorphism scores. Overall, this line of research can speak to theories of computer personification and elucidate our changing relationship with voice-AI technology.
  2. Abstract

    Humans experience emotional benefits from engaging in prosocial behavior. The current work investigates factors that influence the experience of happiness from giving to others in early childhood. In three studies with 5‐year‐olds (N = 144), we find that young children are happier from giving resources to others than from receiving resources for themselves (Study 1) and investigate when children are most happy from giving. In Study 2, children were happier when they could see the beneficiary's positive reaction, suggesting that empathizing with the beneficiary's positive emotion contributes to happiness (consistent with the concept of vicarious‐joy). In Study 3, children were happier after they gave resources than when they watched someone else give resources, indicating that being responsible for prosocial action contributes to children's happiness (consistent with the concept of warm‐glow). These results provide a critical step toward understanding when children experience happiness from giving and a foundation for investigating happiness as a mechanism supporting early prosociality.

     
  3. Abstract The enhancement hypothesis suggests that deaf individuals are more vigilant to visual emotional cues than hearing individuals. The present eye-tracking study examined ambient–focal visual attention when encoding affect from dynamically changing emotional facial expressions. Deaf (n = 17) and hearing (n = 17) individuals watched emotional facial expressions that in 10-s animations morphed from a neutral expression to one of happiness, sadness, or anger. The task was to recognize emotion as quickly as possible. Deaf participants tended to be faster than hearing participants in affect recognition, but the groups did not differ in accuracy. In general, happy faces were more accurately and more quickly recognized than faces expressing anger or sadness. Both groups demonstrated longer average fixation duration when recognizing happiness in comparison to anger and sadness. Deaf individuals directed their first fixations less often to the mouth region than the hearing group. During the last stages of emotion recognition, deaf participants exhibited more focal viewing of happy faces than negative faces. This pattern was not observed among hearing individuals. The analysis of visual gaze dynamics, switching between ambient and focal attention, was useful in studying the depth of cognitive processing of emotional information among deaf and hearing individuals. 
  4. The expression of human emotion is integral to social interaction, and in virtual reality it is increasingly common to develop virtual avatars that attempt to convey emotions by mimicking these visual and aural cues, i.e., the facial and vocal expressions. However, errors in (or the absence of) facial tracking can result in the rendering of incorrect facial expressions on these virtual avatars. For example, a virtual avatar may speak with a happy or unhappy vocal inflection while its facial expression remains otherwise neutral. In circumstances where there is conflict between the avatar's facial and vocal expressions, users may incorrectly interpret the avatar's emotion, which can have unintended consequences in terms of social influence or the outcome of the interaction. In this paper, we present a human-subjects study (N = 22) aimed at understanding the impact of conflicting facial and vocal emotional expressions. Specifically, we explored three levels of emotional valence (unhappy, neutral, and happy) expressed in both visual (facial) and aural (vocal) forms. We also investigated three levels of head scale (down-scaled, accurate, and up-scaled) to evaluate whether head scale affects user interpretation of the conveyed emotion. We found significant effects of different multimodal expressions on happiness and trust perception, while no significant effect was observed for head scale. Evidence from our results suggests that facial expressions have a stronger impact than vocal expressions. Additionally, as the difference between the two expressions increases, the multimodal expression becomes less predictable. For example, for the happy-looking and happy-sounding multimodal expression, we expected and observed high happiness ratings and high trust; however, if one of the two expressions changes, this mismatch makes the expression less predictable.
We discuss the relationships, implications, and guidelines for social applications that aim to leverage multimodal social cues. 
  5. Abstract Introduction

    Building on prior evidence that prosocial behavior is related to the regulation of personal distress in difficult situations, and given that physiological regulation is a central contributor to effective emotion regulation, this investigation evaluated whether and how children's autonomic nervous system (ANS) reactivity during emotion challenges influenced later expressions of prosocial behavior.

    Methods

    The current study utilized a diverse sample of school‐aged children (N = 169; 47.9% female; 47.3% Latinx) to evaluate relations between children's parasympathetic (i.e., respiratory sinus arrhythmia; RSA) and sympathetic (i.e., pre‐ejection period; PEP) reactivity in response to each of three film‐elicited emotion challenges (i.e., sadness, happiness, and fear) at age 7 and both observed and parent‐reported prosocial behavior one year later.

    Results

    Children's parasympathetic reactivity to a film eliciting sadness evidenced a nonlinear relation with later prosocial sharing such that children who evidenced either RSA withdrawal or augmentation in response to the sad emotion challenge engaged in higher levels of prosocial behavior than children who evidenced relatively low or absent reactivity. Parasympathetic reactivity to films eliciting happiness or fear was not significantly related to later prosocial behavior. Likewise, children's sympathetic reactivity in response to the emotion challenges did not significantly predict later prosocial behavior.

    Conclusions

    These findings provide preliminary support for a nonlinear association between children's parasympathetic emotion reactivity and later prosocial behavior, and suggest that children's ANS regulation in sad emotion contexts may be particularly important for understanding prosocial development.

     