Title: Exploring an Affective and Responsive Virtual Environment to Improve Remote Learning
Online classes are typically conducted using video conferencing software such as Zoom, Microsoft Teams, and Google Meet. Research has identified drawbacks of online learning, such as "Zoom fatigue", characterized by distraction and lack of engagement. This study presents the CUNY Affective and Responsive Virtual Environment (CARVE) Hub, a novel virtual reality hub that uses a facial emotion classification model to generate emojis for affective and informal responsive interaction in a 3D virtual classroom setting. A web-based machine learning model is employed for facial emotion classification, enabling students to communicate four basic emotions live through automated web camera capture in a virtual classroom without activating their cameras. The experiment is conducted in undergraduate classes on both Zoom and CARVE, and the results of a survey indicate that students have a more positive perception of interactions in the proposed virtual classroom than in Zoom. Correlations between automated emojis and interactions are also observed. This study discusses potential explanations for the improved interactions, including a decrease in pressure on students when they are not showing their faces. In addition, video panels in traditional remote classrooms may be useful for communication but not for interaction. Students favor features in virtual reality, such as spatial audio and the ability to move around, with collaboration identified as the most helpful feature.
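The core pipeline the abstract describes — a classifier's per-emotion probabilities mapped to an emoji shown in the virtual classroom — can be sketched as follows. This is a minimal illustration, not CARVE's actual implementation: the emotion labels, emoji choices, and confidence threshold are all assumptions.

```python
# Hypothetical sketch: turn facial-emotion probabilities into an emoji.
# The four emotion labels and the 0.5 threshold are illustrative assumptions.
EMOJI_MAP = {"happy": "😊", "sad": "😢", "angry": "😠", "surprised": "😮"}

def emoji_for(probs: dict[str, float], threshold: float = 0.5):
    """Return an emoji for the most probable emotion, or None when the
    classifier is not confident enough to display anything."""
    emotion, p = max(probs.items(), key=lambda kv: kv[1])
    return EMOJI_MAP[emotion] if p >= threshold else None

# Example: a confident "happy" prediction yields an emoji; an ambiguous
# distribution suppresses output, so no expression is shown in class.
print(emoji_for({"happy": 0.8, "sad": 0.1, "angry": 0.05, "surprised": 0.05}))
```

A thresholded display like this lets the avatar stay neutral when the model is uncertain, which matches the abstract's framing of emojis as informal, low-pressure signals rather than a continuous video feed.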
Award ID(s):
1827505 2131186 1737533
PAR ID:
10440665
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
Virtual Worlds
Volume:
2
Issue:
1
ISSN:
2813-2084
Page Range / eLocation ID:
53 to 74
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. BACKGROUND: Facial expressions are critical for conveying emotions and facilitating social interaction. Yet, little is known about how accurately sighted individuals recognize emotions facially expressed by people with visual impairments in online communication settings. OBJECTIVE: This study aimed to investigate sighted individuals' ability to understand facial expressions of six basic emotions in people with visual impairments during Zoom calls. It also aimed to examine whether education on facial expressions specific to people with visual impairments would improve emotion recognition accuracy. METHODS: Sighted participants viewed video clips of individuals with visual impairments displaying facial expressions. They then identified the emotions displayed. Next, they received an educational session on facial expressions specific to people with visual impairments, addressing unique characteristics and potential misinterpretations. After education, participants viewed another set of video clips and again identified the emotions displayed. RESULTS: Before education, participants frequently misidentified emotions. After education, their accuracy in recognizing emotions improved significantly. CONCLUSIONS: This study provides evidence that education on facial expressions of people with visual impairments can significantly enhance sighted individuals' ability to accurately recognize emotions in online settings. This improved accuracy has the potential to foster more inclusive and effective online interactions between people with and without visual disabilities.
  2. The COVID-19 pandemic led the majority of educational institutions to rapidly shift to primarily conducting courses through online, remote delivery. Across different institutions, the tools used for synchronous online course delivery varied. They included traditional video conferencing tools like Zoom, Google Meet, and WebEx as well as non-traditional tools like Gather.Town, Gatherly, and YoTribe. The main distinguishing characteristic of these non-traditional tools is their utilization of 2-D maps to create virtual meeting spaces that mimic real-world spaces. In this work, we aim to explore how such tools are perceived by students in the context of learning. Our intuition is that utilizing a tool featuring a 2-D virtual space that resembles a real-world classroom has underlying benefits compared to the more traditional video conferencing tools. The results of our study indicate that students perceived the 2-D virtual classroom as improving their interaction, collaboration, and overall satisfaction with the online learning experience.
  3. Agents must continuously monitor their partners' affective states to understand and engage in social interactions. However, methods for evaluating affect recognition do not account for changes in classification performance that may occur during occlusions or transitions between affective states. This paper addresses temporal patterns in affect classification performance in the context of an infant-robot interaction, where infants' affective states contribute to their ability to participate in a therapeutic leg movement activity. To support robustness to facial occlusions in video recordings, we trained infant affect recognition classifiers using both facial and body features. Next, we conducted an in-depth analysis of our best-performing models to evaluate how performance changed over time as the models encountered missing data and changing infant affect. During time windows when features were extracted with high confidence, a unimodal model trained on facial features achieved the same optimal performance as multimodal models trained on both facial and body features. However, multimodal models outperformed unimodal models when evaluated on the entire dataset. Additionally, model performance was weakest when predicting an affective state transition and improved after multiple predictions of the same affective state. These findings emphasize the benefits of incorporating body features in continuous affect recognition for infants. Our work highlights the importance of evaluating variability in model performance both over time and in the presence of missing data when applying affect recognition to social interactions.
  4. Endowing automated agents with the ability to provide support, entertainment, and interaction with human beings requires sensing of the users' affective state. These affective states are impacted by a combination of emotion inducers, current psychological state, and various conversational factors. Although emotion classification in both singular and dyadic settings is an established area, the effects of these additional factors on the production and perception of emotion are understudied. This paper presents a new dataset, Multimodal Stressed Emotion (MuSE), to study the multimodal interplay between the presence of stress and expressions of affect. We describe the data collection protocol, the possible areas of use, and the annotations for the emotional content of the recordings. The paper also presents several baselines to measure the performance of multimodal features for emotion and stress classification.
  5. Virtual reality offers vast possibilities to enhance the conventional approach to delivering engineering education. The introduction of virtual reality technology into teaching can improve the undergraduate mechanical engineering curriculum by supplementing the traditional learning experience with outside-the-classroom materials. The Center for Aviation and Automotive Technological Education using Virtual E-Schools (CA2VES), in collaboration with the Clemson University Center for Workforce Development (CUCWD), has developed a comprehensive virtual reality-based learning system. The available e-learning materials include eBooks, mini-video lectures, three-dimensional virtual reality technologies, and online assessments. Select VR-based materials were introduced to students in a sophomore-level mechanical engineering laboratory course via fourteen online course modules during a four-semester period. To evaluate the material, a comparison of student performance with and without the material, along with instructor feedback, was completed. Feedback from the instructor and the teaching assistant revealed that the material was effective in improving laboratory safety and boosted students' confidence in handling engineering tools.