Abstract People may experience emotions before interacting with automated agents when seeking information and support. However, the existing literature has not thoroughly examined how human emotional states affect the interaction experience with such agents, or how automated agents should react to users' emotions. This study tests how participants perceive an empathetic agent (chatbot) versus a non-empathetic one under different emotional states (i.e., positive, neutral, negative) when the chatbot mediates the initial screening process for student advising. Participants are prompted to recall a previous emotional experience and then hold text-based conversations with the chatbot. The study confirms the importance of presenting empathetic cues in the design of automated agents to support human-agent collaboration. Participants who recall a positive experience are more sensitive to the chatbot's empathetic behavior. The chatbot's empathetic behavior improves participants' satisfaction and makes those who recall a neutral experience feel more positive during the interaction. The results reveal that participants' emotional states are likely to influence their tendency to self-disclose, their interaction experience, and their perception of the chatbot's empathetic behavior. The study also highlights the need to acknowledge the emotions of people experiencing positive affect, suggesting that design efforts should be tailored to people's dynamic emotional states.
Modeling EEG Dynamics of Self-Imagery Emotions: A Pilot Study
Electroencephalography (EEG)-based emotion classification has drawn increasing attention, yet the EEG signatures associated with emotional responses remain elusive. This study applies multi-model adaptive mixture independent component analysis (AMICA) as an unsupervised approach to identify and characterize emotional states. Empirical results showed that AMICA learned distinct models accounting for four self-imagery emotions. While larger-scale analyses and more careful examination are needed, this pilot study offers evidence for AMICA as a promising, data-driven approach for modeling the EEG dynamics of self-imagery emotions.
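AMICA's reference implementation lives in MATLAB/EEGLAB; as a rough illustration of its multi-model idea, the sketch below fits one ordinary ICA per emotion condition and assigns unseen EEG segments to the model whose unmixed sources look most non-Gaussian. The `FastICA` stand-in, the kurtosis-based scoring, and all dimensions are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

def fit_per_condition_ica(segments_by_emotion, n_components=8, seed=0):
    """segments_by_emotion: dict of emotion label -> (n_channels, n_samples) EEG array."""
    models = {}
    for label, X in segments_by_emotion.items():
        ica = FastICA(n_components=n_components, whiten="unit-variance",
                      random_state=seed)
        ica.fit(X.T)  # sklearn expects (n_samples, n_features)
        models[label] = ica
    return models

def assign_segment(models, X):
    """Pick the condition whose unmixing yields the most non-Gaussian sources.
    AMICA proper optimizes per-model likelihoods with adaptive source densities;
    mean excess kurtosis is only a crude proxy for that model-selection step."""
    scores = {}
    for label, ica in models.items():
        sources = ica.transform(X.T)                      # (n_samples, n_components)
        scores[label] = kurtosis(sources, axis=0).mean()  # crude likelihood proxy
    return max(scores, key=scores.get), scores
```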
- Award ID(s): 1719130
- PAR ID: 10107955
- Date Published:
- Journal Name: International IEEE/EMBS Conference on Neural Engineering
- ISSN: 1948-3554
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Abstract Most research in the field of affective computing has focused on detecting and classifying human emotions through electroencephalography (EEG) or facial expressions, while designing multimedia content to evoke certain emotions has largely relied on manual ratings provided by users. Here we present insights from correlating affective features across three modalities: affective multimedia content, EEG, and facial expressions. Interestingly, low-level audio-visual features, such as the contrast and homogeneity of the video and the tone of the audio in the movie clips, are the most correlated with changes in facial expressions and EEG. We also identify the regions of the human face and the brain (in addition to the EEG frequency bands) that are most representative of affective responses. Computational modeling across the three modalities showed a high correlation between features from these regions and user-reported affective labels. Finally, correlating different layers of convolutional neural networks, given EEG and face images as input, provides further insight into human affect. Together, these findings will assist in (1) designing more effective multimedia content to engage or influence viewers, (2) understanding the brain/body biomarkers of affect, and (3) developing new brain-computer interfaces as well as facial-expression-based algorithms to read viewers' emotional responses.
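As an illustration of this kind of cross-modal analysis, the sketch below computes two of the low-level video features the abstract names (global contrast and GLCM homogeneity) per frame and Pearson-correlates them with an aligned affective response trace, e.g., an EEG band-power or facial-expression intensity series. Frame extraction, response alignment, and all function names here are assumptions, not the authors' pipeline.

```python
import numpy as np
from scipy.stats import pearsonr
from skimage.feature import graycomatrix, graycoprops

def frame_features(gray_frame):
    """gray_frame: 2-D uint8 array. Returns (contrast, homogeneity)."""
    contrast = gray_frame.std()  # simple global-contrast proxy
    glcm = graycomatrix(gray_frame, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    return contrast, graycoprops(glcm, "homogeneity")[0, 0]

def correlate_with_response(gray_frames, response):
    """Correlate each per-frame feature series with an affective trace
    of the same length (one response value per frame)."""
    feats = np.array([frame_features(f) for f in gray_frames])
    return {name: pearsonr(feats[:, i], response)
            for i, name in enumerate(["contrast", "homogeneity"])}
```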
Research on predicting emotional attributes from speech has often focused on predicting a single value for a sentence or short speaking turn. Such methods ignore the fact that natural emotions are both dynamic and dependent on context. To model the dynamic nature of emotions, the prediction of emotion from speech can be treated as a time-series problem; we refer to the problem of predicting these emotional traces as dynamic speech emotion recognition. Previous studies in this area have used models that treat all emotional traces as coming from the same underlying distribution. Since emotions depend on contextual information, such methods can obscure the context of an emotional interaction. This paper uses a neural process model together with a segment-level speech emotion recognition (SER) model for this problem. This type of model leverages information from the time series and predictions from the SER model to learn a prior that defines a distribution over emotional traces. Our proposed model performs 21% better than a bidirectional long short-term memory (BiLSTM) baseline when predicting emotional traces for valence.
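For concreteness, here is a minimal PyTorch sketch of the kind of BiLSTM baseline mentioned above: it maps a sequence of segment-level acoustic features to a per-step valence trace. The feature dimension, hidden size, and class name are illustrative assumptions; the neural process model itself is not reproduced here.

```python
import torch
import torch.nn as nn

class BiLSTMTracePredictor(nn.Module):
    """Baseline: sequence of acoustic feature vectors -> emotional trace."""
    def __init__(self, feat_dim=40, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)  # one valence value per time step

    def forward(self, x):                 # x: (batch, time, feat_dim)
        out, _ = self.lstm(x)             # (batch, time, 2 * hidden)
        return self.head(out).squeeze(-1) # (batch, time) emotional trace

# Usage: predict valence traces for a batch of 4 ten-step feature sequences.
model = BiLSTMTracePredictor()
trace = model(torch.randn(4, 10, 40))     # -> shape (4, 10)
```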
Accessibility efforts for d/Deaf and hard of hearing (DHH) learners in video-based learning have mainly focused on captions and interpreters, with limited attention to learners' emotional awareness, an important yet challenging skill for effective learning. Current emotion technologies are designed to support learners' emotional awareness and social needs; however, little is known about whether and how DHH learners could benefit from them. Our study explores how DHH learners perceive and use emotion data from two collection approaches, self-report and automatic emotion recognition (AER), in video-based learning. By comparing the use of these technologies between DHH (N=20) and hearing learners (N=20), we identified key differences in usage and perception: 1) DHH learners enhanced their emotional awareness by rewatching the video to self-report their emotions and called for alternative self-reporting methods, such as using sign language or expressive emoji designs; and 2) while AER technology could be useful for detecting emotional patterns in learning experiences, DHH learners expressed more concern about the accuracy and intrusiveness of the AER data. Our findings provide novel design implications for making emotion technologies more inclusive of DHH learners, such as leveraging DHH peer learners' emotions to elicit reflection.
Purpose – This study aims to compare the use of disgust and sadness – two negative emotions associated with different appraisals and information processing styles – in charity social marketing appeals.
Design/methodology/approach – An experiment (n = 247) examined effects when disgust or sad imagery was used alone versus when images were accompanied by information about the cause. OLS regression results show that including information reduced empathy when participants were exposed to sad images, replicating prior research on sadness in charity marketing. No similar effect was observed for disgust-evoking images. Although disgust images alone reduced empathy compared with sad images alone, disgust images paired with information were just as effective as sad images alone and sad images accompanied by information. Empathy mediated the relationship between exposure to each type of appeal and donations; this relationship was negative for sad images but not for disgust images accompanied by information.
Research limitations/implications – These findings suggest the use of disgust may help mitigate the loss of empathy that occurs when individuals engage in deliberative tasks, such as reading information about a cause. They also illustrate how the distinct properties of discrete emotions can be used strategically to influence social marketing outcomes.
Originality/value – Existing research has compared disgust-evoking images to appeals using neutral, mildly disgusting, or positive emotional imagery. This study compares disgust to sadness, a negative emotion commonly used in charity marketing, and considers interaction effects with informational elements of the appeal.
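To make the analysis pattern concrete, here is a hypothetical `statsmodels` sketch: an OLS model with an image-type-by-information interaction predicting empathy, followed by the simple two-step mediation logic (appeal, then empathy, then donation). The synthetic data, column names, and effect sizes are illustrative assumptions, not the authors' data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: disgust = 1 for disgust image (0 = sad image),
# info = 1 when cause information is shown.
rng = np.random.default_rng(0)
n = 247  # mirrors the reported sample size, but these values are fabricated
df = pd.DataFrame({
    "disgust": rng.integers(0, 2, n),
    "info": rng.integers(0, 2, n),
})
# Assumed pattern for illustration: information dampens empathy only for sad images.
df["empathy"] = 5 - df["info"] * (1 - df["disgust"]) + rng.normal(0, 1, n)
df["donation"] = 2 + 0.5 * df["empathy"] + rng.normal(0, 1, n)

# Step 1: does the appeal condition (and its interaction) predict empathy?
m1 = smf.ols("empathy ~ disgust * info", data=df).fit()
# Step 2: does empathy carry the effect through to donations?
m2 = smf.ols("donation ~ empathy + disgust * info", data=df).fit()
print(m1.params, m2.params, sep="\n")
```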