Title: Evaluating Audience Engagement of an Immersive Performance on a Virtual Stage
In this paper, we describe a methodology for determining audience engagement designed specifically for stage performances in a virtual space. We use a combination of galvanic skin response (GSR) data, self-reported emotional feedback using the Positive and Negative Affect Schedule (PANAS), and a think-aloud protocol to assess user reaction to the virtual reality experience. We describe a case study that uses the process to explore the role of immersive viewing by comparing users' engagement while watching a virtual dance performance on a monitor vs. using an immersive head-mounted display (HMD). Results from the study indicate significant differences between the two viewing experiences. The process can serve as a potential tool in the development of VR storytelling experiences.
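The PANAS instrument used above scores self-reported affect as two independent sums. A minimal scoring sketch, assuming the standard 20-item PANAS form with 1–5 ratings (the item groupings below are the published subscales, not anything specific to this study):

```python
# Scoring sketch for the 20-item PANAS (Watson, Clark & Tellegen).
# Each item is rated 1-5; positive and negative affect are scored
# as separate sums, each ranging from 10 to 50.

POSITIVE_ITEMS = ["interested", "excited", "strong", "enthusiastic",
                  "proud", "alert", "inspired", "determined",
                  "attentive", "active"]
NEGATIVE_ITEMS = ["distressed", "upset", "guilty", "scared", "hostile",
                  "irritable", "ashamed", "nervous", "jittery", "afraid"]

def score_panas(ratings):
    """ratings: dict mapping item name -> rating in 1..5.
    Returns (positive_affect, negative_affect) subscale sums."""
    pa = sum(ratings[item] for item in POSITIVE_ITEMS)
    na = sum(ratings[item] for item in NEGATIVE_ITEMS)
    return pa, na

# Example: a respondent rating every item 3 scores 30 on each scale.
ratings = {item: 3 for item in POSITIVE_ITEMS + NEGATIVE_ITEMS}
print(score_panas(ratings))  # (30, 30)
```

Administering the instrument before and after each viewing condition yields per-condition affect deltas that can be compared across the monitor and HMD groups.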
Award ID(s):
1559889
NSF-PAR ID:
10184613
Author(s) / Creator(s):
Date Published:
Journal Name:
Frameless Labs Symposium
Page Range / eLocation ID:
1 - 15
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Dini, Petre (Ed.)
    The National Academy of Engineering’s “Fourteen Grand Challenges for Engineering in the Twenty-First Century” identifies challenges in science and technology that are both feasible and sustainable to help people and the planet prosper. Four of these challenges are: advance personalized learning, enhance virtual reality, make solar energy affordable, and provide access to clean water. In this work, the authors discuss the development of applications using immersive technologies, such as Virtual Reality (VR) and Augmented Reality (AR), and their significance in addressing these four challenges. The Drinking Water AR mobile application helps users easily locate drinking water sources on the Auburn University (AU) campus, thus providing easy access to clean water. The Sun Path mobile application helps users visualize the Sun’s path at any given time and location. Students study the Sun’s path in various fields but often have a hard time visualizing and conceptualizing it, so the application can help. Similarly, the application could assist users in efficient solar panel placement: architects often study the Sun’s path to evaluate solar panel placement at a particular location, and effective placement helps optimize the efficiency of solar energy use. The Solar System Oculus Quest VR application enables users to view all eight planets and the Sun in the solar system. Planets are simulated to mimic their position, scale, and rotation relative to the Sun. Using the Oculus Quest controllers, disguised as human hands in the scene, users can teleport within the world view and can get closer to each planet and the Sun for a better view of the objects and the text associated with them. As a result, tailored learning is aided and virtual reality is enhanced. In a camp held virtually, due to Covid-19, K-12 students were introduced to the concept and usability of the applications.
A Likert-scale metric was used to assess the efficacy of application usage. The data show that participants of this camp benefited from an immersive learning experience that allowed for simulation with the inclusion of VR and AR.
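The Sun-path visualization the abstract describes rests on a standard solar-position calculation. A minimal sketch of that calculation, not the app's actual code, using the simplified declination and hour-angle model:

```python
import math

def solar_elevation(latitude_deg, day_of_year, solar_hour):
    """Approximate solar elevation angle in degrees for a given
    latitude, day of year (1-365), and local solar time (0-24 h)."""
    # Solar declination: the Sun's tilt relative to the equator,
    # approximated by a cosine over the year.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: the Sun moves 15 degrees per hour from solar noon.
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, dec, ha = map(math.radians, (latitude_deg, decl, hour_angle))
    sin_elev = (math.sin(lat) * math.sin(dec)
                + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_elev))

# At the equator near the March equinox (day 80), the noon Sun is
# close to the zenith (elevation near 90 degrees).
print(solar_elevation(0.0, 80, 12.0))
```

Sweeping `solar_hour` from sunrise to sunset and pairing elevation with an azimuth formula yields the arc the application renders; a production app would use a higher-accuracy ephemeris model.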
  2. This study explores the impact of an immersive VR experience on middle school students’ interest in and engagement with science. Thirteen students completed a VR experience with two components: a virtual laboratory and a game. Afterwards, students were interviewed and asked to describe their experiences. Students consistently reported the VR experience to be enjoyable and engaging. Moreover, the VR experience seemed to trigger a situational interest in science among the students, with some evidence to suggest that this interest could be sustained and developed in the long term. Implications for research and practice are discussed.
  3. Spatial perspective taking is an essential cognitive ability that enables people to imagine how an object or scene would appear from a perspective different from their current physical viewpoint. This process is fundamental for successful navigation, especially when people utilize navigational aids (e.g., maps) and the information provided is shown from a different perspective. Research on spatial perspective taking is primarily conducted using paper-and-pencil tasks or computerized figural tasks. However, in daily life, navigation takes place in a three-dimensional (3D) space and involves movement of human bodies through space, and people need to map the perspective indicated by a 2D, top-down, external representation onto their current 3D surroundings to guide their movements to goal locations. In this study, we developed an immersive viewpoint transformation task (iVTT) using ambulatory virtual reality (VR) technology. In the iVTT, people physically walked to a goal location in a virtual environment, using a first-person perspective, after viewing a map of the same environment from a top-down perspective. Comparing this task with a computerized version of a popular paper-and-pencil perspective taking task (SOT: Spatial Orientation Task), the results indicated that the SOT is highly correlated with angle production error but not distance error in the iVTT. Overall angular error in the iVTT was higher than in the SOT. People utilized intrinsic body axes (front/back axis or left/right axis) similarly in the SOT and the iVTT, although there were some minor differences. These results suggest that the SOT and the iVTT capture common variance and cognitive processes, but are also subject to unique sources of error caused by different cognitive processes. The iVTT provides a new immersive VR paradigm to study perspective taking ability in a space encompassing human bodies, and advances our understanding of perspective taking in the real world.
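The iVTT's two error measures can be computed from the goal location and the participant's walked endpoint. A minimal sketch, with illustrative variable names that are not taken from the study:

```python
import math

def ivtt_errors(start, goal, endpoint):
    """Per-trial error measures for a viewpoint transformation task.
    start, goal, endpoint: (x, y) positions in the virtual environment.
    Angular production error is the unsigned angle between the correct
    heading (start -> goal) and the walked heading (start -> endpoint);
    distance error is how far the endpoint lies from the goal."""
    correct = math.atan2(goal[1] - start[1], goal[0] - start[0])
    walked = math.atan2(endpoint[1] - start[1], endpoint[0] - start[0])
    diff = math.degrees(walked - correct) % 360.0
    angular_error = min(diff, 360.0 - diff)  # wrap into [0, 180]
    distance_error = math.dist(goal, endpoint)
    return angular_error, distance_error

# A participant who walks the right distance but 90 degrees off course:
print(ivtt_errors((0, 0), (10, 0), (0, 10)))
```

Because the SOT correlated with the angular term but not the distance term, keeping the two measures separate per trial, as here, is what allows that dissociation to surface.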
  4. Engineering education aims to create a learning environment capable of developing vital engineering skill sets, preparing students to enter the workforce and succeed as future leaders. With rapid technological advancements, new engineering challenges continuously emerge, impeding the development of engineering skills. This insufficiency in developing the required skills has resulted in declining student GPAs and in industries reporting graduates’ unsatisfactory performance. From a pedagogical perspective, this problem is highly correlated with traditional learning methods, which are inadequate for engaging students and improving their learning experience when adopted alone. Accordingly, educators have incorporated new learning methodologies to address this problem and enhance the students’ learning experience. However, many currently adopted teaching methods still lack the potential to expose students to practical examples, and they are ineffective for engineering students, who tend to be active learners and prefer to use a variety of senses. To address this, our research team proposes integrating virtual reality (VR) technology into the laboratory work of engineering technology courses to improve the students’ learning experience and engagement. VR technology, an immersive high-tech medium, was adopted to develop an interactive teaching module on hydraulic gripper designs in a VR construction-like environment. The module aims to expose engineering technology students to real-life applications by providing a more visceral experience than screen-based media through the generation of fully computer-simulated environments in which everything is digitized. This work presents the development and implementation of the VR construction lab module and the corresponding gripper designs.
The virtual gripper models are developed using the Oculus Virtual Reality (OVR) Metrics Tool for Unity, a SteamVR overlay utility created to make visualizing the desktop in a VR setting simple and intuitive. The execution of the module comprises building the VR environment, designing and importing the gripper models, and creating a user-interface VR environment to visualize and interact with the model (gripper assembly/mechanism testing). Besides visualization, manipulation, and interaction, the developed VR system allows for additional features such as displaying technical information, guiding students through the assembly process, and other specialized options. Thus, the developed interactive VR module will serve as a mutable platform that can be readily adjusted to accommodate future add-ons addressing future educational opportunities.
  5. Background Sustained engagement is essential for the success of telerehabilitation programs. However, patients’ lack of motivation and adherence could undermine these goals. To overcome this challenge, physical exercises have often been gamified. Building on the advantages of serious games, we propose a citizen science–based approach in which patients perform scientific tasks by using interactive interfaces and help advance scientific causes of their choice. This approach capitalizes on human intellect and benevolence while promoting learning. To further enhance engagement, we propose performing citizen science activities in immersive media, such as virtual reality (VR). Objective This study aims to present a novel methodology to facilitate the remote identification and classification of human movements for the automatic assessment of motor performance in telerehabilitation. The data-driven approach is presented in the context of a citizen science software dedicated to bimanual training in VR. Specifically, users interact with the interface and make contributions to an environmental citizen science project while moving both arms in concert. Methods In all, 9 healthy individuals interacted with the citizen science software by using a commercial VR gaming device. The software included a calibration phase to evaluate the users’ range of motion along the 3 anatomical planes of motion and to adapt the sensitivity of the software’s response to their movements. During calibration, the time series of the users’ movements were recorded by the sensors embedded in the device. We performed principal component analysis to identify salient features of movements and then applied a bagged trees ensemble classifier to classify the movements. Results The classification achieved high performance, reaching 99.9% accuracy. 
Among the movements, elbow flexion was the most accurately classified movement (99.2%), and horizontal shoulder abduction to the right side of the body was the most misclassified movement (98.8%). Conclusions Coordinated bimanual movements in VR can be classified with high accuracy. Our findings lay the foundation for the development of motion analysis algorithms in VR-mediated telerehabilitation. 
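The classification pipeline described here, PCA-derived features followed by a bagged-trees ensemble, can be sketched with scikit-learn. The window length, feature flattening, and label layout below are stand-in assumptions, not the authors' implementation:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Stand-in data: 90 calibration windows, each flattened from a short
# run of 3-axis controller positions. A real pipeline would use the
# time series recorded by the headset/controller sensors during the
# calibration phase.
X = rng.normal(size=(90, 60))
# Six movement classes (e.g., elbow flexion, shoulder abduction, ...),
# balanced for illustration.
y = np.arange(90) % 6

# PCA extracts salient movement features; BaggingClassifier's default
# base estimator is a decision tree, giving a bagged-trees ensemble.
clf = make_pipeline(
    PCA(n_components=10),
    BaggingClassifier(n_estimators=50, random_state=0),
)
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

On the synthetic data above the accuracy hovers near chance; the study's near-perfect accuracy reflects how separable the real calibration movements are in feature space.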