

Title: Use of Scaling to Improve Reach in Virtual Reality for People with Parkinson’s Disease
This research investigates the effect of scaling in virtual reality to improve the reach of users with Parkinson’s disease (PD). People with PD often have limited reach, frequently due to impaired postural stability. We investigated how virtual reality (VR) can improve reach during and after VR exposure. Participants played a VR game in which they smashed water balloons thrown at them as the balloons crossed their midsection. The distance from which the balloons were thrown increased or decreased based on success or failure. Participants’ perception of the distance and of their hand was scaled in three counterbalanced conditions: under-scaled (scale = 0.83), not-scaled (scale = 1), and over-scaled (scale = 1.2), where the scale value is the ratio between the virtual reach perceived in the virtual environment (VE) and the actual reach. Six measurements were taken in each study condition: (1) real-world reach (pre-exposure), (2) virtual reality baseline reach, (3) virtual reality not-scaled reach, (4) under-scaled reach, (5) over-scaled reach, and (6) real-world reach (post-exposure). Our results show that scaling a person’s movement in virtual reality can help improve reach. We therefore recommend including a scaling factor in VR games for people with Parkinson’s disease.
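The two mechanics the abstract describes — a fixed gain applied to perceived reach, and a throw distance that adapts to success or failure — can be sketched as follows. This is a minimal illustration, not the study's implementation; the shoulder-anchored scaling model, step size, and distance clamps are all assumptions.

```python
# Hypothetical sketch of the abstract's two mechanics:
# (1) scale the hand's offset from the shoulder so that
#     virtual reach = scale * actual reach, and
# (2) a simple staircase that moves the balloon target farther
#     after a hit and closer after a miss.
# All names, step sizes, and limits here are illustrative assumptions.

SCALES = {"under": 0.83, "none": 1.0, "over": 1.2}

def scaled_hand_position(shoulder, real_hand, scale):
    """Per-axis linear gain on the hand's offset from the shoulder."""
    return tuple(s + scale * (h - s) for s, h in zip(shoulder, real_hand))

def next_throw_distance(current, success, step=0.05, lo=0.3, hi=1.5):
    """Staircase adaptation, clamped to a plausible range (meters)."""
    d = current + step if success else current - step
    return max(lo, min(hi, d))
```

Under this model, the over-scaled condition shows the virtual hand 20% farther from the body than the real hand, so a target at the edge of the user's real reach appears attainable.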
Award ID(s):
2104819
NSF-PAR ID:
10358482
Author(s) / Creator(s):
Date Published:
Journal Name:
IEEE International Conference on Serious Games and Applications for Health
ISSN:
2330-5649
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. This poster presents the use of Augmented Reality (AR) and Virtual Reality (VR) to tackle 4 of the “14 Grand Challenges for Engineering in the 21st Century” identified by the National Academy of Engineering. AR and VR are technologies of the present and the future. AR creates a composite view by adding digital content to a real-world view, often using a smartphone camera, while VR creates an immersive view in which the user is cut off from the real world. The 14 challenges identify areas of science and technology that are achievable and sustainable to help people and the planet prosper. The 4 challenges tackled with AR/VR applications in this poster are: enhance virtual reality, advance personalized learning, provide access to clean water, and make solar energy affordable. The solar system VR application tackles two of the engineering challenges: (1) enhance virtual reality and (2) advance personalized learning. It helps the user visualize and understand our solar system through a VR headset, offering an immersive 360-degree view in which the user can use controllers to interact with information about celestial bodies and teleport to different points in space for a closer look at the planets and the Sun. The user has six degrees of freedom. The AR application for water tackles the engineering challenge “Provide access to clean water”. It shows information on drinking-water accessibility and the eco-friendly use of bottles over plastic cups in the department buildings at Auburn University. The user sees an augmented view of drinking-water information on a smartphone: whenever the user points the smartphone camera at a building, the application renders a composite view with the drinking-water information associated with that building.
The sun-path visualization AR application tackles the engineering challenge “Make solar energy affordable”. It helps the user visualize the sun path at a selected time and location: the sun path is augmented in the device’s camera view when the user points the camera at the sky. The application provides sun altitude and azimuth, as well as sunrise and sunset data for a selected day, information that can aid effective solar panel placement. Using AR and VR technology to tackle these challenges enhances the user experience: the information from these applications is well curated and easily visualized, and thus readily understandable by the end user. Usage of AR and VR technology to tackle these types of engineering challenges therefore looks promising.
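The altitude and azimuth such a sun-path app displays follow from standard solar-geometry formulas. A minimal sketch, using Cooper's approximation for declination and the usual hour-angle relations; a production app would use a full ephemeris algorithm (such as NOAA's) for better accuracy.

```python
import math

# Illustrative solar-position math, not the poster's implementation.
# Uses Cooper's declination approximation and idealized local solar
# time (12 = solar noon); these simplifications are assumptions.

def solar_position(lat_deg, day_of_year, solar_hour):
    """Return (altitude, azimuth) in degrees for a latitude,
    day of year, and local solar time in hours."""
    phi = math.radians(lat_deg)
    # Approximate solar declination (Cooper's formula).
    decl = math.radians(23.45) * math.sin(
        math.radians(360 * (284 + day_of_year) / 365))
    # Hour angle: 15 degrees per hour away from solar noon.
    H = math.radians(15 * (solar_hour - 12))
    alt = math.asin(math.sin(phi) * math.sin(decl)
                    + math.cos(phi) * math.cos(decl) * math.cos(H))
    # Azimuth measured clockwise from north.
    az = math.atan2(math.sin(H),
                    math.cos(H) * math.sin(phi)
                    - math.tan(decl) * math.cos(phi))
    return math.degrees(alt), (math.degrees(az) + 180) % 360
```

At the equator on an equinox day at solar noon, the altitude comes out at 90 degrees (sun directly overhead), which is a quick sanity check on the formulas.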
  2. Extended reality (XR) technologies, such as virtual reality (VR) and augmented reality (AR), provide users, their avatars, and embodied agents a shared platform to collaborate in a spatial context. Although traditional face-to-face communication is limited by users’ proximity, meaning that another human’s non-verbal embodied cues become more difficult to perceive the farther one is away from that person, researchers and practitioners have started to look into ways to accentuate or amplify such embodied cues and signals to counteract the effects of distance with XR technologies. In this article, we describe and evaluate the Big Head technique, in which a human’s head in VR/AR is scaled up relative to their distance from the observer as a mechanism for enhancing the visibility of non-verbal facial cues, such as facial expressions or eye gaze. To better understand and explore this technique, we present two complementary human-subject experiments in this article. In our first experiment, we conducted a VR study with a head-mounted display to understand the impact of increased or decreased head scales on participants’ ability to perceive facial expressions as well as their sense of comfort and feeling of “uncanniness” over distances of up to 10 m. We explored two different scaling methods and compared perceptual thresholds and user preferences. Our second experiment was performed in an outdoor AR environment with an optical see-through head-mounted display. Participants were asked to estimate facial expressions and eye gaze, and identify a virtual human over large distances of 30, 60, and 90 m. In both experiments, our results show significant differences in minimum, maximum, and ideal head scales for different distances and tasks related to perceiving faces, facial expressions, and eye gaze, and we also found that participants were more comfortable with slightly bigger heads at larger distances.
We discuss our findings with respect to the technologies used, and we discuss implications and guidelines for practical applications that aim to leverage XR-enhanced facial cues.
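The core of a distance-dependent head-scaling scheme like this can be captured in a few lines. This sketch is in the spirit of the technique rather than the paper's actual mapping: the linear gain and the clamp values are illustrative assumptions, whereas the paper reports empirically measured minimum, maximum, and ideal scales per distance.

```python
# Minimal sketch of Big-Head-style scaling: grow the avatar head with
# observer distance so facial cues stay legible, clamped between a
# minimum and maximum scale. gain/min/max values are illustrative
# assumptions, not the paper's measured thresholds.

def head_scale(distance_m, base=1.0, gain=0.15, min_scale=1.0, max_scale=3.0):
    """Linear growth of head scale with distance, clamped."""
    return max(min_scale, min(max_scale, base + gain * distance_m))
```

A real system would substitute the per-distance thresholds measured in the user studies for the linear gain, and could interpolate between them.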
  3. Virtual Reality (VR) telepresence platforms are being challenged to support live performances, sporting events, and conferences with thousands of users across seamless virtual worlds. Current systems have struggled to meet these demands, which has led to high-profile performance events with groups of users isolated in parallel sessions. The core difference in scaling VR environments compared to classic 2D video content delivery comes from the dynamic peer-to-peer spatial dependence on communication. Users have many pair-wise interactions that grow and shrink as they explore spaces. In this paper, we discuss the challenges of VR scaling and present an architecture that supports hundreds of users with spatial audio and video in a single virtual environment. We leverage the property of spatial locality with two key optimizations: (1) a Quality of Service (QoS) scheme to prioritize audio and video traffic based on users’ locality, and (2) a resource manager that allocates client connections across multiple servers based on user proximity within the virtual world. Through real-world deployments and extensive evaluations under real and simulated environments, we demonstrate the scalability of our platform while showing improved QoS compared with existing approaches.
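Both spatial-locality optimizations can be sketched in a few lines. This is a hypothetical illustration of the idea, not the paper's system: the distance cutoffs, QoS tier names, and anchor-based server assignment are all assumptions.

```python
import math

# Hypothetical sketch of locality-based QoS tiering and proximity-based
# server assignment. Cutoffs and tier names are illustrative assumptions.

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def prioritize_streams(me, peers, hi_cutoff=5.0, mid_cutoff=15.0):
    """Map each peer to a QoS tier: nearby peers get full audio/video,
    mid-range peers get audio only, distant peers are dropped."""
    tiers = {}
    for pid, pos in peers.items():
        d = distance(me, pos)
        tiers[pid] = "av" if d <= hi_cutoff else "audio" if d <= mid_cutoff else "drop"
    return tiers

def assign_server(pos, server_anchors):
    """Connect each client to the server whose region anchor is closest
    in the virtual world, so co-located users share a server."""
    return min(server_anchors, key=lambda s: distance(pos, server_anchors[s]))
```

Because each user's pair-wise traffic is dominated by nearby peers, dropping or degrading distant streams bounds per-client bandwidth even as total user count grows.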
  4. Spatial perspective taking is an essential cognitive ability that enables people to imagine how an object or scene would appear from a perspective different from their current physical viewpoint. This process is fundamental for successful navigation, especially when people utilize navigational aids (e.g., maps) and the information provided is shown from a different perspective. Research on spatial perspective taking is primarily conducted using paper-and-pencil tasks or computerized figural tasks. However, in daily life, navigation takes place in a three-dimensional (3D) space and involves movement of human bodies through space, and people need to map the perspective indicated by a 2D, top-down, external representation to their current 3D surroundings to guide their movements to goal locations. In this study, we developed an immersive viewpoint transformation task (iVTT) using ambulatory virtual reality (VR) technology. In the iVTT, people physically walked to a goal location in a virtual environment, using a first-person perspective, after viewing a map of the same environment from a top-down perspective. Comparing this task with a computerized version of a popular paper-and-pencil perspective taking task (SOT: Spatial Orientation Task), the results indicated that the SOT is highly correlated with angle production error but not distance error in the iVTT. Overall angular error in the iVTT was higher than in the SOT. People utilized intrinsic body axes (front/back axis or left/right axis) similarly in the SOT and the iVTT, although there were some minor differences. These results suggest that the SOT and the iVTT capture common variance and cognitive processes, but are also subject to unique sources of error caused by different cognitive processes. The iVTT provides a new immersive VR paradigm to study perspective taking ability in a space encompassing human bodies, and advances our understanding of perspective taking in the real world.
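The two iVTT error measures the comparison rests on — angle production error and distance error — reduce to simple geometry. A minimal sketch; the function names are illustrative, not from the paper, and the angular error wraps around the circle so that headings of 350° and 10° differ by 20°, not 340°.

```python
# Illustrative error measures for a viewpoint-transformation task:
# the absolute angular difference between intended and produced
# headings (wrapped to [0, 180]) and the absolute distance error.
# Function names are assumptions, not the paper's terminology.

def angular_error(intended_deg, produced_deg):
    """Smallest absolute angle between two headings, in degrees."""
    diff = abs(intended_deg - produced_deg) % 360
    return min(diff, 360 - diff)

def distance_error(intended_m, produced_m):
    """Absolute difference between intended and walked distance."""
    return abs(intended_m - produced_m)
```

Wrapping matters for aggregate statistics: averaging raw heading differences without it would inflate errors for goals near the 0°/360° boundary.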
  5. Like many natural sciences, a critical component of archaeology is field work. Despite its importance, field opportunities are available to few students for financial and logistical reasons. With little exposure to archaeological research, fewer students are entering archaeology, particularly minority students (Smith 2004; Wilson 2015). To counter these trends, we have leveraged the ongoing revolution in consumer electronics for the current, digitally-empowered generation by creating a game-based, virtual archaeology curriculum to 1) teach foundational principles of a discipline that is challenging to present in a traditional classroom by using sensory and cognitive immersion; and 2) allow wider access to a field science that has previously been limited to only select students. Virtual reality (VR) is computer technology that creates a simulated three-dimensional world for a user to experience in a bodily way, thereby transforming data analysis into a sensory and cognitive experience. Using a widely available, room-scale VR platform, we have created a virtual archaeological excavation experience that conveys two overarching classroom objectives: 1) teach the physical methods of archaeological excavation by providing the setting and tools for a student to actively engage in field work; and 2) teach archaeological concepts using a scientific approach to problem solving by couching them within a role-playing game. The current prototype was developed with the HTC Vive VR platform, which includes a headset, hand controllers, and two base stations to track the position and orientation of the user’s head and hands within a 4x4 meter area. Environments were developed using Unreal Engine 4, a widely used game engine, to maximize usability for different audiences, learning objectives, and skill levels.
Given the inherent fun of games and widespread interest in archaeology and cultural heritage, the results of this research are adaptable and applicable to learners of all ages in formal and informal educational settings. 