Facial expressions of emotions by people with visual impairment and blindness via video conferencing

Many people, including those with visual impairment and blindness, take advantage of video conferencing tools to meet with others. Video conferencing tools enable them to share facial expressions, which are considered one of the most important aspects of human communication. This study aims to advance knowledge of how people with visual impairment and blindness share their facial expressions of emotions virtually. The study invited a convenience sample of 28 adults with visual impairment and blindness to Zoom video conferencing sessions. The participants were instructed to pose facial expressions of basic human emotions (anger, fear, disgust, happiness, surprise, neutrality, calmness, and sadness), which were video recorded. The facial expressions were analyzed using the Facial Action Coding System (FACS), which encodes the movements of specific facial muscles as Action Units (AUs). The analysis found that a particular set of AUs was significantly engaged in expressing each emotion, except for sadness. Individual differences in AU engagement were also found, influenced by the participants' visual acuity levels and by emotional characteristics such as valence and arousal levels. The findings are anticipated to serve as a foundation of knowledge, contributing to the development of emotion-sensing technologies for people with visual impairment and blindness.
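The abstract describes the FACS analysis only at a high level. As an illustration of what an AU-level comparison against the neutral expression could look like, the following Python sketch flags AUs whose intensity under a posed emotion significantly exceeds neutral. The column names (e.g., `AU01_r`, following the CSV layout of the OpenFace 2.0 toolkit), the `frames.csv` file, and the choice of a Mann-Whitney U test with Bonferroni correction are all illustrative assumptions, not the authors' published pipeline.

```python
# Hypothetical sketch: flag Action Units (AUs) significantly engaged for a
# posed emotion, relative to the neutral expression. Column names such as
# "AU01_r" assume per-frame AU intensities in OpenFace 2.0's CSV format.
import pandas as pd
from scipy.stats import mannwhitneyu

AU_COLUMNS = ["AU01_r", "AU04_r", "AU06_r", "AU12_r", "AU15_r", "AU25_r"]

def significant_aus(df: pd.DataFrame, emotion: str, alpha: float = 0.05):
    """Return AUs whose intensity under `emotion` exceeds neutral.

    `df` is assumed to hold one row per video frame, with an `emotion`
    label column plus one intensity column per AU.
    """
    posed = df[df["emotion"] == emotion]
    neutral = df[df["emotion"] == "neutrality"]
    flagged = []
    for au in AU_COLUMNS:
        # One-sided Mann-Whitney U test: is this AU stronger when posing
        # the target emotion than when posing a neutral face?
        _, p = mannwhitneyu(posed[au], neutral[au], alternative="greater")
        if p < alpha / len(AU_COLUMNS):  # Bonferroni correction across AUs
            flagged.append(au)
    return flagged

# Example usage, assuming frames.csv was exported from an AU extractor:
# frames = pd.read_csv("frames.csv")
# for emo in ["anger", "fear", "disgust", "happiness", "surprise", "calmness", "sadness"]:
#     print(emo, significant_aus(frames, emo))
```

A nonparametric test is used here because per-frame AU intensities are typically skewed; a per-participant aggregation step would be needed before drawing the kind of individual-differences conclusions the abstract reports.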
- Award ID(s): 1846076
- Publication Date:
- NSF-PAR ID: 10214020
- Journal Name: 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020)
- Page Range or eLocation-ID: 567 to 571
- Sponsoring Org: National Science Foundation