Abstract: Embodied Code is a visual programming language for virtual reality (VR). It introduces novices to fundamental computing concepts and immersive game engines through hands-on creative coding. Unlike traditional creative coding toolkits, the system harnesses the visuospatial and kinesthetic affordances of VR to engage users in embodied computer science learning. Coders have considerable flexibility in placing, rearranging, and manipulating elements of code (nodes and connectors) and its output, so that space and movement can be leveraged as organizational and conceptual scaffolds. Assembling nodes and connectors is guided by two simple principles: input versus output and events versus data. These design principles were adopted to foster analogical mappings between the physical experience of working with code and output in an immersive virtual space and perception and action in the real world, and to support exploring different levels of coding abstraction in classroom use.
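As a rough illustration of the node-and-connector model described in the abstract, the sketch below shows one way such a graph could be represented around the two stated principles (input versus output, events versus data). The class and field names are assumptions made for this example and are not Embodied Code's actual implementation or API.

```python
# Hypothetical sketch of a node-and-connector code graph organized around the
# two principles named in the abstract: input vs. output and events vs. data.
# All names here are illustrative assumptions, not Embodied Code's API.
from dataclasses import dataclass, field
from enum import Enum


class PortKind(Enum):
    EVENT = "event"   # carries "something happened" signals
    DATA = "data"     # carries values (numbers, vectors, colors, ...)


@dataclass
class Port:
    name: str
    kind: PortKind
    is_input: bool


@dataclass
class Node:
    label: str
    position: tuple                      # (x, y, z) placement in the virtual space
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)


@dataclass
class Connector:
    source: Port   # an output port
    target: Port   # an input port of the same kind


def connect(source: Port, target: Port) -> Connector:
    """Create a connector, enforcing the input/output and event/data pairing."""
    if source.is_input or not target.is_input:
        raise ValueError("connectors run from an output port to an input port")
    if source.kind is not target.kind:
        raise ValueError("event ports pair with event ports, data ports with data ports")
    return Connector(source, target)
```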
Insights from Immersive Learning: Using Sentiment Analysis and Real-time Narration to Refine ASL Instruction in Virtual Reality
Immersive virtual reality presents a rich opportunity for learning signed languages, given the immersive environment's ability to represent three-dimensional information. We developed a proof-of-concept American Sign Language (ASL) learning game in immersive virtual reality (VR), named ASL Champ! Twelve hearing non- or novice signers played one full level of the game, during which they were asked to provide concurrent think-aloud (CTA) commentary, narrating their experience as they played in real time. We conducted a sentiment analysis of recordings of the CTA and subsequent open-ended questions and qualitatively assessed the narrations for salient themes. The analysis revealed specific aspects of the users' experiences that were most likely to lead to positive or negative expressions during the CTA and the question session. The factors with the most impact on user sentiment were the success of the sign recognition in the game and the extent to which users found the game intuitive or self-explanatory. We also found that users with more technology anxiety were more positive about the game. Qualitative examination of user comments further revealed their real-time game experiences. This work provides insights into which aspects of an ASL learning VR game matter most for user experience. We conclude with takeaway recommendations for future virtual or augmented reality sign language learning games.
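As a minimal sketch of what sentiment scoring of transcribed think-aloud utterances can look like: the abstract does not specify the tooling used in the study, so NLTK's VADER analyzer and the example utterances below are assumptions made purely for illustration.

```python
# Illustrative only: score transcribed think-aloud utterances with a
# general-purpose sentiment model. The study's actual pipeline is not
# specified here; NLTK's VADER analyzer is an assumed stand-in.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

utterances = [
    "Oh nice, it recognized my sign right away!",
    "I have no idea what I'm supposed to do next.",
    "The avatar's hands are really easy to follow.",
]

analyzer = SentimentIntensityAnalyzer()
for text in utterances:
    scores = analyzer.polarity_scores(text)  # neg/neu/pos plus compound in [-1, 1]
    label = ("positive" if scores["compound"] >= 0.05
             else "negative" if scores["compound"] <= -0.05
             else "neutral")
    print(f"{label:8s} {scores['compound']:+.2f}  {text}")
```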
- Award ID(s): 2118742
- PAR ID: 10660437
- Publisher / Repository: ACM
- Date Published:
- Page Range / eLocation ID: 1 to 4
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Efthimiou, E.; Fotinea, S-E.; Hanke, T.; McDonald, J.; Shterionov, D.; Wolfe, R. (Ed.)
With improved and more easily accessible technology, immersive virtual reality (VR) head-mounted devices have become more ubiquitous. As signing avatar technology improves, virtual reality presents a new and relatively unexplored application for signing avatars. This paper discusses two primary ways that signed language can be represented in immersive virtual spaces: 1) Third-person, in which the VR user sees a character who communicates in signed language; and 2) First-person, in which the VR user produces signed content themselves, tracked by the head-mounted device and visible to the user herself (and/or to other users) in the virtual environment. We will discuss the unique affordances granted by virtual reality and how signing avatars might bring accessibility and new opportunities to virtual spaces. We will then discuss the limitations of signed content in virtual reality concerning virtual signers shown from both third- and first-person perspectives.
-
This paper presents a holistic system to scale up the teaching and learning of vocabulary words of American Sign Language (ASL). The system leverages the most recent mixed-reality technology to allow the user to perceive her own hands in an immersive learning environment with first- and third-person views for motion demonstration and practice. Precise motion sensing is used to record and evaluate motion, providing real-time feedback tailored to the specific learner. As part of this evaluation, learner motions are matched to features derived from the Hamburg Notation System (HNS) developed by sign-language linguists. We develop a prototype to evaluate the efficacy of mixed-reality-based interactive motion teaching. Results with 60 participants show a statistically significant improvement in learning ASL signs when using our system, in comparison to traditional desktop-based, non-interactive learning. We expect this approach to ultimately allow teaching and guided practice of thousands of signs.
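The abstract above does not detail how learner motions are compared with HNS-derived features; as a rough, hedged illustration, such matching could resemble the sketch below, in which the feature names, weights, and scoring rule are all assumptions for this example rather than the system's actual method.

```python
# Hypothetical illustration of matching a learner's produced sign against a
# template of Hamburg Notation System-style features. Feature names, weights,
# and the scoring rule are assumptions, not the system described above.

# Target features for a sign, e.g. derived from the Hamburg Notation System.
template = {
    "handshape": "flat",
    "orientation": "palm_down",
    "location": "chin",
    "movement": "arc_forward",
}

# Features estimated from the learner's tracked motion.
observed = {
    "handshape": "flat",
    "orientation": "palm_down",
    "location": "chest",      # off-target component
    "movement": "arc_forward",
}

# Relative contribution of each feature to the overall score (assumed values).
weights = {"handshape": 0.3, "orientation": 0.2, "location": 0.25, "movement": 0.25}


def match_score(template: dict, observed: dict, weights: dict) -> float:
    """Return a weighted agreement score in [0, 1] across the feature set."""
    agreed = sum(w for f, w in weights.items() if observed.get(f) == template[f])
    return agreed / sum(weights.values())


score = match_score(template, observed, weights)
to_revisit = [f for f in template if observed.get(f) != template[f]]
print(f"match: {score:.2f}, revisit: {', '.join(to_revisit) or 'nothing'}")
```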
-
Guidelines on Successfully Porting Non-Immersive Games to Virtual Reality: A Case Study in Minecraft
Virtual reality games have grown rapidly in popularity since the first consumer VR head-mounted displays were released in 2016; however, comparatively little research has explored how this new medium impacts the experience of players. In this paper, we present a study exploring how user experience changes when playing Minecraft on the desktop and in immersive virtual reality. Fourteen players completed six 45-minute sessions: three played on the desktop and three in VR. The Gaming Experience Questionnaire, the i-Group presence questionnaire, and the Simulator Sickness Questionnaire were administered after each session, and players were interviewed at the end of the experiment. Participants strongly preferred playing Minecraft in VR, despite frustrations with using teleportation as a travel technique and feelings of simulator sickness. Players enjoyed using motion controls but still continued to use indirect input under certain circumstances. This did not appear to negatively impact feelings of presence. We conclude with four lessons for game developers interested in porting their games to virtual reality.
-
We present here a new system, in which signing avatars (computer-animated virtual humans built from motion capture recordings) teach introductory American Sign Language (ASL) in an immersive virtual environment. The system is called Signing Avatars & Immersive Learning (SAIL). The significant contributions of this work are 1) the use of signing avatars, built from state-of-the-art motion capture recordings of a native signer; 2) the integration with LEAP gesture tracking hardware, allowing the user to see his or her own movements within the virtual environment; 3) the development of appropriate introductory ASL vocabulary, delivered in semi-interactive lessons; and 4) the 3D environment in which a user accesses the system.