

Title: Comparing the Fidelity of Contemporary Pointing with Controller Interactions on Performance of Personal Space Target Selection
The goal of this research is to provide much-needed empirical data on how the fidelity of popular hand-tracked pointing metaphors, compared with commodity controller-based input, affects efficiency and the speed-accuracy tradeoff in users' spatial selection during personal-space interactions in VR. We conducted two experiments in which participants selected spherical targets arranged in a circle in personal space, i.e., in the near field within their maximum arm's reach, in VR. Both experiments required participants to select the targets either with a VR controller or with their dominant hand's index finger, tracked with one of two popular contemporary tracking methods. In the first experiment, the targets were arranged in a flat circle in accordance with the ISO 9241-9 Fitts' law standard, and the simulation selected random combinations of 3 target amplitudes and 3 target widths. Targets were centered around the user's eye level, and the arrangement was placed at a depth of either 60%, 75%, or 90% of the user's maximum arm's reach. In experiment 2, target depth varied randomly from one depth plane to another within the same 13-target configuration in a trial set, resembling button selection in hierarchical menus at differing depth planes in the near field. The study was conducted using the HTC Vive head-mounted display with three input conditions: a VR controller (HTC Vive), low-fidelity virtual pointing (Leap Motion), and high-fidelity virtual pointing (tracked VR glove). Our results revealed that low-fidelity pointing performed worse than both high-fidelity pointing and the VR controller. Overall, target selection performance was worse in depth planes closer to the maximum arm's reach than at middle and nearer distances.
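For context on the experimental design above, the ISO 9241-9 task places targets evenly on a circle and orders selections so that consecutive targets lie roughly opposite each other, and each amplitude/width combination is summarized by the Shannon index of difficulty. The sketch below is illustrative only (not code from the paper); the alternation step used here is one common convention for an odd target count such as 13.

```python
import math

def index_of_difficulty(amplitude, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(amplitude / width + 1)

def iso9241_targets(n, amplitude, center=(0.0, 0.0)):
    """Place n targets evenly on a circle of diameter `amplitude`,
    ordered so consecutive selections cross the circle, as in the
    ISO 9241-9 multi-directional tapping task."""
    cx, cy = center
    r = amplitude / 2.0
    positions = []
    for i in range(n):
        # Stepping by n // 2 indices alternates selections across the
        # circle; for odd n this visits every target exactly once.
        k = (i * (n // 2)) % n
        theta = 2.0 * math.pi * k / n
        positions.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    return positions
```

For example, an amplitude of 2 units with a width of 1 unit gives an index of difficulty of log2(3) ≈ 1.58 bits.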
Award ID(s):
2007435
PAR ID:
10437595
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
Page Range / eLocation ID:
404 to 413
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. This is one of the first accounts of the security analysis of consumer immersive Virtual Reality (VR) systems. This work breaks new ground, coins new terms, and constructs proof-of-concept implementations of attacks on immersive VR. Our work used the two most widely adopted immersive VR systems, the HTC Vive and the Oculus Rift. More specifically, we were able to create attacks that can potentially disorient users, turn their Head-Mounted Display (HMD) camera on without their knowledge, overlay images in their field of vision, and modify VR environmental factors to force them into hitting physical objects and walls. Finally, we illustrate through a human-participant deception study that VR systems can be exploited to control immersed users and move them to a location in physical space without their knowledge. We term this the Human Joystick Attack. We conclude with future research directions and ways to enhance the security of these systems.
  2. Emerging Virtual Reality (VR) displays with embedded eye trackers are becoming commodity hardware (e.g., the HTC Vive Pro Eye). Eye-tracking data can be utilized for several purposes, including gaze monitoring, privacy protection, and user authentication/identification. Identifying users is an integral part of many applications due to security and privacy concerns. In this paper, we explore methods and eye-tracking features that can be used to identify users. Prior VR researchers applied machine learning to motion-based data (such as body motion, head tracking, eye tracking, and hand tracking data) to identify users. Such systems usually require an explicit VR task and many features to train the machine learning model for user identification. We propose a system that identifies users using minimal eye-gaze-based features, without designing any identification-specific tasks. We collected gaze data from an educational VR application and tested our system with two machine learning (ML) models, random forest (RF) and k-nearest-neighbors (kNN), and two deep learning (DL) models, convolutional neural networks (CNN) and long short-term memory (LSTM). Our results show that ML and DL models could identify users with over 98% accuracy using only six simple eye-gaze features. We discuss our results, their implications for security and privacy, and the limitations of our work.
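To illustrate the kNN side of the classification pipeline described above, the sketch below hand-rolls a nearest-neighbor classifier over per-window feature vectors. The data is synthetic and the six feature names are assumptions for illustration (real gaze features might be fixation duration, saccade amplitude, pupil diameter, and so on); this is not the paper's dataset or implementation.

```python
import numpy as np

def knn_predict(X_train, y_train, X_query, k=5):
    """Classify each query window by majority vote among its k nearest
    training windows (Euclidean distance in feature space)."""
    preds = []
    for q in X_query:
        d = np.linalg.norm(X_train - q, axis=1)
        nearest = y_train[np.argsort(d)[:k]]
        labels, counts = np.unique(nearest, return_counts=True)
        preds.append(labels[np.argmax(counts)])
    return np.array(preds)

# Synthetic stand-in: six gaze features per window, five users whose
# feature distributions differ by construction.
rng = np.random.default_rng(0)
n_users, n_windows = 5, 100
X = np.vstack([rng.normal(loc=2.0 * u, scale=0.5, size=(n_windows, 6))
               for u in range(n_users)])
y = np.repeat(np.arange(n_users), n_windows)

# Simple split: even-indexed windows train, odd-indexed windows test.
train, test = slice(0, None, 2), slice(1, None, 2)
pred = knn_predict(X[train], y[train], X[test], k=5)
accuracy = float(np.mean(pred == y[test]))
```

On cleanly separated synthetic clusters like these, accuracy is near perfect; on real gaze data, feature engineering and windowing drive the result.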
  3. Early researchers applied visualization techniques based on smoke and dye injection to describe coherent structures in turbulent flows. Visualization techniques have evolved substantially in recent decades, across all disciplines. More recently, Virtual Reality (VR) has revolutionized the way visualization is carried out. In this study, we perform fully immersive visualization of high-fidelity numerical results for supersonic spatially-developing turbulent boundary layers (SDTBL) under strong concave and convex curvatures at Mach 2.86. The selected numerical tool is Direct Numerical Simulation (DNS) with high spatial/temporal resolution. The comprehensive DNS information sheds important light on the transport phenomena inside turbulent boundary layers subject to strong deceleration, or Adverse Pressure Gradient (APG), caused by concave walls, as well as to strong acceleration, or Favorable Pressure Gradient (FPG), caused by convex walls, at different wall thermal conditions (i.e., cold, adiabatic, and hot walls). Another fluid dynamics example discussed is the high-speed crossflow-jet problem. We extract vortex-core iso-surfaces via the Q-criterion and convert them to a file format readable by the HTC Vive VR and Varjo toolkit. With recent advances in both the capabilities and the User Interface (UI) of the VWT, researchers can now study SDTBL flow development in a fully immersive environment. The UI refinements give users freedom of movement in six directions and database selection, amplifying their capacity for detailed observation and analysis of the animated flow phenomena.
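The Q-criterion mentioned above marks vortical regions where rotation dominates strain: Q = ½(‖Ω‖² − ‖S‖²), with S and Ω the symmetric and antisymmetric parts of the velocity gradient tensor, and iso-surfaces of Q > 0 taken as vortex cores. A minimal finite-difference sketch of that definition (not the authors' DNS post-processing pipeline):

```python
import numpy as np

def q_criterion(u, v, w, dx, dy, dz):
    """Q = 0.5 * (||Omega||^2 - ||S||^2), where S and Omega are the
    symmetric and antisymmetric parts of the velocity gradient tensor.
    Positive Q marks rotation-dominated (vortical) regions."""
    vel = (u, v, w)
    spacing = (dx, dy, dz)
    # grad[i][j] approximates d(u_i)/d(x_j) via central differences.
    grad = [[np.gradient(vel[i], spacing[j], axis=j) for j in range(3)]
            for i in range(3)]
    Q = np.zeros_like(u)
    for i in range(3):
        for j in range(3):
            S = 0.5 * (grad[i][j] + grad[j][i])
            O = 0.5 * (grad[i][j] - grad[j][i])
            Q += 0.5 * (O**2 - S**2)
    return Q

# Example: rigid-body rotation (u, v, w) = (-y, x, 0) is purely
# rotational, so Q is positive (here exactly 1) over the whole grid.
ax = np.linspace(-1.0, 1.0, 9)
x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
h = ax[1] - ax[0]
Q = q_criterion(-y, x, np.zeros_like(x), h, h, h)
```

The complementary check is pure shear, u = (y, 0, 0), for which strain and rotation magnitudes cancel and Q vanishes everywhere.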
  4. In eye-tracked augmented and virtual reality (AR/VR), instantaneous and accurate hands-free selection of virtual elements remains a significant challenge. Though other methods that involve gaze-coupled head movements or hovering can improve selection times compared to methods like gaze-dwell, they are either not instantaneous or have difficulty ensuring that the user's selection is deliberate. In this paper, we present EyeShadows, an eye-gaze-based selection system that takes advantage of peripheral copies (shadows) of items, allowing quick selection and manipulation of an object or its corresponding menus. This method is compatible with a variety of selection tasks and controllable items, avoids the Midas touch problem, does not clutter the virtual environment, and is context sensitive. We have implemented and refined this selection tool for VR and AR, including testing with optical and video see-through (OST/VST) displays. Moreover, we demonstrate that this method can be used for a wide range of AR and VR applications, including manipulation of sliders or analog elements. We test its performance in VR against three other selection techniques: dwell (baseline), an inertial reticle, and head-coupled selection. Results showed that selection with EyeShadows was significantly faster than the dwell baseline, outperforming it in the select and search-and-select tasks by 29.8% and 15.7%, respectively, though error rates varied between tasks.
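For readers unfamiliar with the dwell baseline used in the comparison above, the sketch below shows its core logic: an item is selected once gaze rests on it continuously for a threshold duration. This illustrates the baseline technique only, not the EyeShadows method, and the class and parameter names are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class DwellSelector:
    """Minimal gaze-dwell baseline: an item is selected once gaze has
    stayed on it continuously for `dwell_time` seconds."""
    dwell_time: float = 0.8
    _target: object = field(default=None)
    _elapsed: float = field(default=0.0)

    def update(self, gazed_item, dt):
        """Feed the currently gazed item once per frame with the frame
        time `dt`; returns the item when the dwell threshold is crossed,
        else None. Looking away resets the timer (Midas-touch guard)."""
        if gazed_item is not self._target:
            self._target, self._elapsed = gazed_item, 0.0
            return None
        self._elapsed += dt
        if self._target is not None and self._elapsed >= self.dwell_time:
            self._elapsed = 0.0  # allow a repeat selection after another dwell
            return self._target
        return None
```

The inherent latency is visible here: every selection costs at least `dwell_time`, which is what faster techniques such as EyeShadows aim to avoid.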
  5. Our work presents one of the first accounts of the forensics of immersive Virtual Reality (VR) systems and their social applications. The social VR applications studied in this work include Bigscreen, Altspace VR, Rec Room, and Facebook Spaces. We explored the two most widely adopted consumer VR systems: the HTC Vive and the Oculus Rift. Our tests examined the efficacy of reconstructing evidence from network traffic as well as from the systems themselves. The results showed that a significant amount of forensically relevant data, such as user names, user profile pictures, events, and system details, may be recovered. We anticipate that this work will stimulate future research in VR and Augmented Reality (AR) forensics, an area that is understudied and needs more attention from the community.