
Title: A Computational View of the Emotional Regulation of Disgust using Multimodal Sensors
Emotion regulation can be characterized by different activities that attempt to alter an emotional response, whether behavioral, physiological, or neurological. The two most widely adopted strategies, cognitive reappraisal and expressive suppression, are explored in this study, specifically in the context of disgust. Study participants (N = 21) experienced disgust via video exposure and were instructed either to regulate their emotions or to express them freely. If regulating, they were required to either cognitively reappraise or suppress their emotional experiences while viewing the videos. Video recordings of the participants' faces were taken during the experiment, and electrocardiogram (ECG), electromyography (EMG), and galvanic skin response (GSR) readings were also collected for further analysis. We compared the participants' behavioral (facial musculature movements) and physiological (GSR and heart rate) responses as they aimed to alter their emotional responses, and determined computationally that, when responding to disgust stimuli, the signals recorded during suppression and free expression were very similar, whereas those recorded during cognitive reappraisal were significantly different. Thus, in the context of this study and from a signal-analysis perspective, we conclude that emotion regulation via cognitive reappraisal significantly alters participants' physiological responses to disgust, unlike regulation via suppression.
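A minimal sketch of the kind of condition comparison described above, assuming per-participant summary features (e.g., mean GSR and mean heart rate per condition) have already been extracted into a CSV; the file name, column names, and choice of a Wilcoxon signed-rank test are illustrative assumptions, not the paper's exact pipeline.

```python
import pandas as pd
from scipy.stats import wilcoxon

# Hypothetical per-participant features: one row per (participant, condition).
df = pd.read_csv("features.csv")  # columns: participant, condition, mean_gsr, mean_hr

def compare(feature: str, cond_a: str, cond_b: str) -> None:
    # Pair each participant's values across the two conditions.
    pivot = df.pivot(index="participant", columns="condition", values=feature)
    stat, p = wilcoxon(pivot[cond_a], pivot[cond_b])
    print(f"{feature}: {cond_a} vs {cond_b}  W={stat:.1f}  p={p:.4f}")

for feature in ("mean_gsr", "mean_hr"):
    compare(feature, "suppression", "free_expression")  # expected: similar (high p)
    compare(feature, "reappraisal", "free_expression")  # expected: different (low p)
```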
Authors:
Award ID(s):
1846076
Publication Date:
NSF-PAR ID:
10214020
Journal Name:
15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020)
Page Range or eLocation-ID:
567 to 571
Sponsoring Org:
National Science Foundation
More Like this
  1. Many people, including those with visual impairment and blindness, take advantage of video conferencing tools to meet others. Video conferencing tools enable them to share facial expressions, which are considered one of the most important aspects of human communication. This study aims to advance knowledge of how those with visual impairment and blindness share their facial expressions of emotions virtually. The study invited a convenience sample of 28 adults with visual impairment and blindness to Zoom video conferencing sessions. Participants were instructed to pose facial expressions of basic human emotions (anger, fear, disgust, happiness, surprise, neutrality, calmness, and sadness), which were video recorded. The facial expressions were analyzed using the Facial Action Coding System (FACS), which encodes the movements of specific facial muscles as Action Units (AUs). The study found that a particular set of AUs was significantly engaged in expressing each emotion, except for sadness. Individual differences in AU engagement were also found, influenced by the participants' visual acuity levels and emotional characteristics such as valence and arousal levels. The findings are anticipated to serve as foundational knowledge for developing emotion-sensing technologies for those with visual impairment and blindness.
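A hedged sketch of the per-emotion AU analysis this abstract describes: given per-participant AU intensities for each posed emotion, test whether each Action Unit is engaged significantly more than in the neutral pose. The data layout, column names, and choice of a paired t-test are assumptions for illustration only.

```python
import pandas as pd
from scipy.stats import ttest_rel

# Hypothetical codings: one row per (participant, emotion), one column per AU.
aus = pd.read_csv("au_intensities.csv")  # columns: participant, emotion, AU04, AU09, AU12, ...
au_cols = [c for c in aus.columns if c.startswith("AU")]

baseline = aus[aus.emotion == "neutrality"].set_index("participant").sort_index()
for emotion in ("anger", "fear", "disgust", "happiness", "surprise", "sadness"):
    posed = aus[aus.emotion == emotion].set_index("participant").sort_index()
    for au in au_cols:
        t, p = ttest_rel(posed[au], baseline[au])
        if p < 0.05 and t > 0:  # AU significantly engaged for this emotion
            print(f"{emotion}: {au}  t={t:.2f}  p={p:.4f}")
```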

  2. The overall goal of our research is to develop a system of intelligent multimodal affective pedagogical agents that are effective for different types of learners (Adamo et al., 2021). While most of the research on pedagogical agents tends to focus on the cognitive aspects of online learning and instruction, this project explores the less-studied role of affective (or emotional) factors. We aim to design believable animated agents that can convey realistic, natural emotions through speech, facial expressions, and body gestures, and that can react to the students' detected emotional states with emotional intelligence. Within the context of this goal, the specific objective of the work reported in the paper was to examine the extent to which the agents' facial micro-expressions affect students' perception of the agents' emotions and their naturalness. Micro-expressions are very brief facial expressions that occur when a person either deliberately or unconsciously conceals an emotion being felt (Ekman & Friesen, 1969). Our assumption is that if the animated agents display facial micro-expressions in addition to macro-expressions, they will convey higher expressive richness and naturalness to the viewer, as "the agents can possess two emotional streams, one based on interaction with the viewer and the other based on their own internal state, or situation" (Queiroz et al., 2014, p. 2). The work reported in the paper involved two studies with human subjects. The objectives of the first study were to examine whether people can recognize micro-expressions (in isolation) in animated agents, and whether there are differences in recognition based on the agent's visual style (e.g., stylized versus realistic). The objectives of the second study were to investigate whether people can recognize the animated agents' micro-expressions when integrated with macro-expressions; the extent to which the presence of micro- plus macro-expressions affects the perceived expressivity and naturalness of the animated agents; the extent to which exaggerating the micro-expressions (e.g., increasing the amplitude of the animated facial displacements) affects emotion recognition and perceived agent naturalness and emotional expressivity; and whether there are differences based on the agent's design characteristics. In the first study, 15 participants watched eight micro-expression animations representing four different emotions (happy, sad, fear, surprised). Four animations featured a stylized agent and four a realistic agent. For each animation, subjects were asked to identify the agent's emotion conveyed by the micro-expression. In the second study, 234 participants watched three sets of eight animation clips (24 clips in total, 12 clips per agent). Four animations for each agent featured the character performing macro-expressions only, four featured macro- plus micro-expressions without exaggeration, and four featured macro- plus micro-expressions with exaggeration. Participants were asked to recognize the true emotion of the agent and to rate the emotional expressivity and naturalness of the agent in each clip using a 5-point Likert scale. We have collected all the data and completed the statistical analysis. Findings and discussion, implications for research and practice, and suggestions for future work will be reported in the full paper.
     References:
     Adamo, N., Benes, B., Mayer, R., Lei, X., Meyer, Z., & Lawson, A. (2021). Multimodal Affective Pedagogical Agents for Different Types of Learners. In: Russo, D., Ahram, T., Karwowski, W., Di Bucchianico, G., & Taiar, R. (Eds.), Intelligent Human Systems Integration 2021 (IHSI 2021). Advances in Intelligent Systems and Computing, 1322. Springer, Cham. https://doi.org/10.1007/978-3-030-68017-6_33
     Ekman, P., & Friesen, W. V. (1969, February). Nonverbal leakage and clues to deception. Psychiatry, 32(1), 88–106. https://doi.org/10.1080/00332747.1969.11023575
     Queiroz, R. B., Musse, S. R., & Badler, N. I. (2014). Investigating Macroexpressions and Microexpressions in Computer Graphics Animated Faces. Presence, 23(2), 191–208. http://dx.doi.org/10.1162/
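The abstract does not name the statistical model used, so the following is only a plausible sketch of the second study's comparison: recognition accuracy and Likert expressivity ratings contrasted across the three animation conditions (macro only, macro + micro, macro + micro exaggerated), here with a simple one-way ANOVA; the file and column names are hypothetical.

```python
import pandas as pd
from scipy.stats import f_oneway

# Hypothetical per-clip responses: one row per (participant, clip).
ratings = pd.read_csv("study2_ratings.csv")  # columns: participant, condition, correct, expressivity

# Expressivity ratings across the three expression conditions.
groups = [g["expressivity"].values for _, g in ratings.groupby("condition")]
F, p = f_oneway(*groups)
print(f"Expressivity across conditions: F={F:.2f}, p={p:.4f}")

# Emotion-recognition accuracy per condition.
print(ratings.groupby("condition")["correct"].mean())
```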

  3. Telehealth technologies play a vital role in delivering quality healthcare to patients regardless of geographic location and health status. Telehealth peripherals allow providers a more accurate method of collecting health assessment data from the patient and delivering a more confident and accurate diagnosis, saving time and money while creating positive patient outcomes. Advanced Practice Nursing (APN) students should be confident in their ability to diagnose and treat patients through a virtual environment. This pilot simulation, funded by the National Science Foundation's Future of Work at the Human-Technology Frontier (FW-HTF) program, was completed to examine how APN students interacted in a simulation-based education (SBE) experience with and without peripherals. The SBE experience was created and deployed using the INACSL Healthcare Simulation Standards of Best Practices™ and vetted by a simulation expert. APN students (N = 24), in their first assessment course, were randomly selected to be either a patient (n = 12) or provider (n = 12) in a telehealth simulation. Student dyads (patient/provider) were randomly assigned to complete a scenario with (n = 6 dyads) or without (n = 6 dyads) the use of a peripheral. Students (providers and patients) who completed the SBE experience had an increased confidence level both with and without the use of peripherals. Students evaluated the simulation via the Simulation Effectiveness Tool-Modified (SET-M), scoring their perception of the simulation on a 5-point Likert scale. The highest-scoring areas were perceived support of learning by the faculty (M = 4.6), feeling challenged in decision-making skills (M = 4.4), and a better understanding of didactic material (M = 4.3). The lowest-scoring area was feeling more confident in decision making (M = 3.9). We also recorded students' facial expressions during the task to determine a probability score (0-100) for each expressed basic emotion; students had the highest scores for joy (M = 8.47) and surprise (M = 4.34), followed by disgust (M = 1.43), fear (M = 0.76), and contempt (M = 0.64), and the lowest scores for anger (M = 0.44) and sadness (M = 0.36). Students were also asked to complete a reflection assignment as part of the SBE experience. Students reported feeling nervous at the beginning of the SBE experience but acknowledged feeling better as it unfolded. Based on findings from this pilot study, implications point toward the effectiveness of including simulations for nurse practitioner students to increase their confidence in performing telehealth visits and engaging in decision making. For the students, understanding that patients may be just as nervous during telehealth visits was one of the main takeaways, as well as remembering to reassure the patient and knowing how to ask the patient to operate the telehealth equipment. Providing students opportunities to practice these skills will help increase their confidence, boost their self- and emotion regulation, and improve their decision-making skills in telehealth scenarios.
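A hedged illustration of how the emotion scores reported above could be derived: frame-level emotion probabilities (0-100, e.g., from an automated facial-coding tool) averaged over the task per student, then across students. The file layout and column names are assumptions for the sketch.

```python
import pandas as pd

# Hypothetical frame-level output: one row per video frame per student.
frames = pd.read_csv("facial_frames.csv")  # columns: student, frame, joy, surprise, disgust, fear, contempt, anger, sadness
emotions = ["joy", "surprise", "disgust", "fear", "contempt", "anger", "sadness"]

per_student = frames.groupby("student")[emotions].mean()  # mean probability per student
print(per_student.mean().sort_values(ascending=False))    # group means, as reported above
```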
  4. Higher reactivity to stress exposure is associated with an increased tendency to appraise ambiguous stimuli as negative. However, it remains unknown whether tendencies to use emotion regulation strategies, such as cognitive reappraisal (which involves altering the meaning or relevance of affective stimuli), can shape individual differences in how stress affects perceptions of ambiguity. Here, we examined whether increased reappraisal use is one factor that determines whether stress exposure induces an increased negativity bias. In Study 1, healthy participants (n = 43) rated the valence of emotionally ambiguous (surprised) faces before and after an acute stress or control manipulation and reported their reappraisal habits. Increases in negativity ratings were milder for stressed individuals who reported more habitual reappraisal use. In Study 2 (n = 97), we extended this investigation to real-world perceived stress before and during the COVID-19 pandemic and found that reappraisal tendency moderates the relationship between perceived stress and increased negativity bias. Collectively, these findings suggest that the propensity to reappraise determines negativity bias when evaluating ambiguity under stress.
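A minimal sketch of the moderation analysis this abstract describes: negativity bias regressed on perceived stress, habitual reappraisal, and their interaction, where the interaction term carries the moderation claim. The variable and file names are assumptions, not the study's materials.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical participant-level measures.
data = pd.read_csv("study2_measures.csv")  # columns: negativity_bias, stress, reappraisal

model = smf.ols("negativity_bias ~ stress * reappraisal", data=data).fit()
print(model.summary())  # a reliable stress:reappraisal term indicates moderation
```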

  5. Stephanidis, C., & Antona, M. (Eds.)
    The objective of this study is to develop and use a virtual reality game as a tool to assess the effects of realistic stress on the behavioral and physiological responses of participants. The game is based on a popular Steam game called Keep Talking and Nobody Explodes, in which players collaborate to defuse a bomb. Varying levels of difficulty in solving a puzzle, along with time pressure, produce different stress levels that can be measured in terms of errors, response times, and other physiological measurements. The game was developed using 3D programming tools including Blender and a virtual reality development kit (VRTK). To measure response times accurately, we added Lab Streaming Layer (LSL) markers to collect and synchronize physiological signals, behavioral data, and the timing of game events. We recorded electrocardiogram (ECG) data during gameplay to assess heart rate and heart-rate variability (HRV), which have been shown to be reliable indicators of stress. Our empirical results showed that heart rate increased significantly while HRV decreased significantly when participants were under high stress, consistent with prior stress research. This VR game framework is publicly available on GitHub and allows researchers to measure and synchronize other physiological signals such as electroencephalogram, electromyogram, and pupillometry.
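A hedged sketch of the LSL marker mechanism described above, plus a standard RMSSD estimate of HRV from ECG R-R intervals; the stream name, event labels, and R-R values are illustrative and not taken from the released GitHub code.

```python
import numpy as np
from pylsl import StreamInfo, StreamOutlet

# One-channel string marker stream; nominal_srate=0 marks an irregular (event-driven) rate.
info = StreamInfo(name="GameMarkers", type="Markers", channel_count=1,
                  nominal_srate=0, channel_format="string", source_id="vr_bomb_game")
outlet = StreamOutlet(info)
outlet.push_sample(["puzzle_start"])    # pushed at each game event (hypothetical labels)
outlet.push_sample(["defuse_success"])

def rmssd(rr_ms: np.ndarray) -> float:
    """RMSSD: root mean square of successive R-R interval differences, in ms."""
    return float(np.sqrt(np.mean(np.diff(rr_ms) ** 2)))

rr = np.array([812.0, 795.0, 788.0, 802.0, 760.0, 741.0])  # illustrative R-R intervals (ms)
print(f"RMSSD = {rmssd(rr):.1f} ms")  # lower RMSSD (reduced HRV) under higher stress
```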