The overall goal of our research is to develop a system of intelligent multimodal affective pedagogical agents that are effective for different types of learners (Adamo et al., 2021). While most research on pedagogical agents focuses on the cognitive aspects of online learning and instruction, this project explores the less-studied role of affective (or emotional) factors. We aim to design believable animated agents that can convey realistic, natural emotions through speech, facial expressions, and body gestures, and that can react to students’ detected emotional states with emotional intelligence. Within the context of this goal, the specific objective of the work reported in this paper was to examine the extent to which the agents’ facial micro-expressions affect students’ perception of the agents’ emotions and their naturalness. Micro-expressions are very brief facial expressions that occur when a person either deliberately or unconsciously conceals an emotion being felt (Ekman & Friesen, 1969). Our assumption is that if the animated agents display facial micro-expressions in addition to macro-expressions, they will convey greater expressive richness and naturalness to the viewer, as “the agents can possess two emotional streams, one based on interaction with the viewer and the other based […]
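As a minimal illustration of the duration distinction between micro- and macro-expressions discussed above, the sketch below labels a detected expression event by how long it lasts. The 0.5 s cutoff is a commonly cited rule of thumb for micro-expressions, not a threshold taken from this project, and the function itself is hypothetical.

```python
# Hypothetical sketch: labeling a detected facial-expression event as a
# micro- or macro-expression by its duration. The 0.5 s cutoff is an
# assumed rule of thumb, not a value from the paper.

MICRO_MAX_SECONDS = 0.5  # assumed upper bound for a micro-expression

def expression_type(onset_s, offset_s):
    """Classify an expression event by its onset/offset timestamps (seconds)."""
    duration = offset_s - onset_s
    if duration < 0:
        raise ValueError("offset must not precede onset")
    return "micro" if duration <= MICRO_MAX_SECONDS else "macro"
```

An animation system could use such a check to schedule a brief micro-expression layer on top of a sustained macro-expression.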
Emotional valence and arousal induced by auditory stimuli among individuals with visual impairment
Despite significant vision loss, humans can still recognize various emotional stimuli through hearing and express diverse emotional responses, which can be sorted into two dimensions, arousal and valence. Yet most research has focused on sighted people, leaving a lack of knowledge about the emotion perception mechanisms of people with visual impairment. This study aims to advance knowledge of the degree to which people with visual impairment perceive various emotions – high/low arousal and positive/negative valence. A total of 30 individuals with visual impairment participated in interviews in which they listened to stories of people who became visually impaired and encountered and overcame various challenges; participants were then asked to share their emotions. Participants perceived different kinds and intensities of emotions depending on demographic variables such as living alone, loneliness, onset of visual impairment, visual acuity, race/ethnicity, and employment status. This advanced knowledge of emotion perception in people with visual impairment is anticipated to contribute toward designing social supports that better accommodate those with visual impairment.
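The two-dimensional arousal/valence model mentioned above can be sketched as a mapping from discrete emotion labels to coordinates in a valence-arousal plane. The emotion-to-coordinate values below are illustrative placeholders, not data from the study; only the quadrant structure (positive/negative valence, high/low arousal) reflects the model the abstract describes.

```python
# Hypothetical sketch: sorting emotion labels into the two dimensions
# (valence and arousal) described in the abstract. Coordinate values are
# invented for illustration, not taken from the study.

EMOTION_SPACE = {
    # emotion: (valence, arousal), each assumed to lie in [-1, 1]
    "joy":     (0.8,  0.6),
    "anger":   (-0.7, 0.8),
    "sadness": (-0.6, -0.4),
    "calm":    (0.5,  -0.5),
}

def quadrant(emotion):
    """Return the valence/arousal quadrant label for a known emotion."""
    v, a = EMOTION_SPACE[emotion]
    valence = "positive" if v >= 0 else "negative"
    arousal = "high" if a >= 0 else "low"
    return f"{valence} valence / {arousal} arousal"
```

Interview responses coded with such labels could then be tallied per quadrant to compare emotion intensities across demographic groups.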
- Journal Name: British Journal of Visual Impairment
- Sponsoring Org: National Science Foundation
More Like this
Individual differences in spontaneous facial expressions in people with visual impairment and blindness
People can express their spontaneous and voluntary emotions via facial expressions, which play a critical role in social interactions. However, less is known about the mechanisms of spontaneous emotion expression, especially in adults with visual impairment and blindness. Nineteen adults with visual impairment and blindness participated in interviews in which their spontaneous facial expressions were observed and analyzed via the Facial Action Coding System (FACS). We found a set of Action Units primarily engaged in expressing spontaneous emotions, whose use appeared to vary with participants’ different characteristics. The results of this study could serve as evidence that adults with visual impairment and blindness show individual differences in spontaneous facial expressions of emotions.
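A FACS-based analysis like the one described reduces, at its simplest, to tallying which Action Units (AUs) occur most often across coded interview segments. The sketch below shows that tally; the AU codes follow standard FACS numbering (e.g. AU12 is the lip corner puller), but the observation data and helper function are invented examples, not the study's coding.

```python
# Hypothetical sketch: tallying FACS Action Units observed across coded
# interview segments to find the AUs most engaged in spontaneous
# expressions. The observation data are invented examples.

from collections import Counter

def top_action_units(observations, n=2):
    """observations: list of per-segment AU-code lists.
    Returns the n most frequently observed AUs with their counts."""
    counts = Counter(au for segment in observations for au in segment)
    return counts.most_common(n)
```

Per-participant tallies like this could then be compared to surface the individual differences the abstract reports.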
Humans possess an inborn ability to judge visual aesthetics, feel emotions from the environment, and comprehend others’ emotional expressions. Many exciting applications become possible if robots or computers can be empowered with similar capabilities. Modeling aesthetics, evoked emotions, and emotional expressions automatically in unconstrained situations, however, is daunting due to the lack of a full understanding of the relationship between low-level visual content and high-level aesthetics or emotional expressions. With the growing availability of data, it is possible to tackle these problems using machine learning and statistical modeling approaches. In this talk, I provide an overview of our research over the last two decades on data-driven analyses of visual artworks and digital visual content for modeling aesthetics and emotions. First, I discuss our analyses of styles in visual artworks. Art historians have long observed the highly characteristic brushstroke styles of Vincent van Gogh and have relied on discerning these styles to authenticate and date his works. In our work, we compared van Gogh with his contemporaries by statistically analyzing a massive set of automatically extracted brushstrokes. A novel extraction method was developed by integrating edge detection with clustering-based segmentation. Evidence substantiates that van Gogh’s […]
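The two stages the abstract names for brushstroke extraction, edge detection followed by clustering-based segmentation, can be sketched in miniature as below. This is a toy version under strong assumptions: a tiny grayscale "image" as nested lists, a plain horizontal-gradient threshold for edges, and a flood-fill grouping standing in for clustering. The paper's actual method is far more sophisticated.

```python
# Hypothetical sketch of brushstroke extraction as two stages:
# (1) edge detection via a simple horizontal-gradient threshold, and
# (2) grouping edge pixels into connected clusters (candidate strokes).
# The image data and threshold are invented for illustration.

def detect_edges(img, thresh=50):
    """Return pixel coordinates whose horizontal gradient exceeds thresh."""
    h, w = len(img), len(img[0])
    return {(r, c) for r in range(h) for c in range(w - 1)
            if abs(img[r][c + 1] - img[r][c]) > thresh}

def cluster_edges(edges):
    """Group edge pixels into 4-connected clusters (candidate strokes)."""
    clusters, seen = [], set()
    for start in edges:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:
            r, c = stack.pop()
            if (r, c) in seen or (r, c) not in edges:
                continue
            seen.add((r, c))
            comp.add((r, c))
            stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
        clusters.append(comp)
    return clusters
```

Statistics over the resulting clusters (length, curvature, width) are the kind of per-stroke features that could then be compared across painters.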
Individual differences in emotional intelligence skills of people with visual impairment and loneliness amid the COVID-19 pandemic
In response to the novel coronavirus (COVID-19) pandemic, public health interventions such as social distancing and stay-at-home orders have been widely implemented and are anticipated to contribute to reducing the spread of COVID-19. At the same time, there is a concern that these public health interventions may increase the level of loneliness. Loneliness and social isolation are public health risks, closely associated with serious medical conditions. Because COVID-19 is so new, little is known about emotional well-being among people with visual impairment during the pandemic. To address this knowledge gap, this study conducted phone interviews with a convenience sample of 31 people with visual impairment. The interviews incorporated the University of California, Los Angeles (UCLA) Loneliness Scale (version 3) and the Trait Meta-Mood Scale (TMMS) to measure loneliness and emotional intelligence skills, respectively. This study found that people with visual impairment were vulnerable to feelings of loneliness during the COVID-19 pandemic and showed individual differences in emotional intelligence skills across different degrees of loneliness. Researchers and health professionals should consider offering adequate coping strategies to people with visual impairment amid the COVID-19 pandemic.
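Scoring a self-report instrument like the UCLA Loneliness Scale amounts to summing item ratings after flipping the reverse-keyed items. The sketch below assumes the standard 20-item, 4-point format with reverse scoring computed as 5 minus the rating; the particular reverse-keyed item set shown is illustrative and should be checked against the published instrument.

```python
# Hypothetical scoring sketch for a UCLA-style loneliness questionnaire:
# 20 items rated 1-4, with reverse-keyed items scored as (5 - rating).
# The REVERSED set is illustrative; consult the published scale for the
# authoritative item numbers.

REVERSED = {1, 5, 6, 9, 10, 15, 16, 19, 20}  # assumed reverse-keyed items

def loneliness_score(ratings):
    """ratings: dict mapping item number (1-20) -> rating (1-4).
    Higher totals indicate greater loneliness."""
    if set(ratings) != set(range(1, 21)):
        raise ValueError("expected ratings for items 1-20")
    return sum((5 - r) if item in REVERSED else r
               for item, r in ratings.items())
```

Totals computed this way are what would then be banded into low, moderate, and high loneliness groups for the comparison the abstract describes.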
Unlike the younger population, which uses wearables such as smartwatches for monitoring health on a daily basis, elderly people need assistance in using technology and interpreting the data obtained through these smart connected frameworks. Current monitoring systems are primarily designed to monitor physiological signals on a daily basis. The aim of this proposed research, Easy-Assist, is to help older people maintain their emotional well-being. This research focuses on developing a wearable affective framework that can detect the user's emotions in addition to monitoring their physiological signals. The proposed framework can be used in an automated assisted living environment, where the user's emotional state can be balanced using a haptic-based emotional elicitation system after the user's emotion is recognized and interpreted in real time. The framework is validated using a fall detection algorithm deployed in a custom-built wearable watch, built from off-the-shelf components, and an emotion detection framework built on a single-board computer. A dataset of 21,700 samples acquired using the proposed framework yielded maximum accuracies of 97.25%, 96%, and 94% in classifying states into the Alert, Active, and Normal classes, respectively, using […]
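As a hedged illustration of the kind of state classification such a wearable performs, the sketch below maps two raw sensor readings to the Alert, Active, and Normal classes the abstract names. The feature names and thresholds are invented for illustration; the paper's actual model and its reported 94-97% accuracies are not reproduced here.

```python
# Hypothetical rule-based sketch of wearable state classification into
# the Alert/Active/Normal classes named in the abstract. Feature names
# and thresholds are invented; the actual system uses a learned model.

def classify_state(heart_rate_bpm, motion_g):
    """Map raw sensor readings to one of the three state classes."""
    if motion_g > 2.5 or heart_rate_bpm > 120:
        return "Alert"   # e.g. possible fall or acute stress
    if motion_g > 0.5:
        return "Active"  # ordinary movement
    return "Normal"      # resting baseline
```

In the described framework, an "Alert" decision would be what triggers the fall-detection pathway, while the emotion-detection module runs alongside on the single-board computer.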