Title: Comparing the Fidelity of Contemporary Pointing with Controller Interactions on Performance of Personal Space Target Selection
The goal of this research is to provide much-needed empirical data on how the fidelity of popular hand-gesture-tracked pointing metaphors, versus commodity controller-based input, affects efficiency and the speed-accuracy tradeoff in users' spatial selection during personal-space interactions in VR. We conducted two experiments in which participants selected spherical targets arranged in a circle in personal space (the near-field, within maximum arm's reach) in VR. Both experiments required participants to select the targets either with a VR controller or with their dominant hand's index finger, tracked with one of two popular contemporary tracking methods. In the first experiment, the targets were arranged in a flat circle in accordance with the ISO 9241-9 Fitts' law standard, and the simulation selected random combinations of three target amplitudes and three target widths. The arrangement was centered at the user's eye level and placed at a depth plane of 60%, 75%, or 90% of the user's maximum arm's reach. In the second experiment, target depth varied randomly from one depth plane to another within the same 13-target configuration in a trial set, resembling button selection in hierarchical menus at differing depth planes in the near-field. The study was conducted using the HTC Vive head-mounted display with one of three input conditions: a VR controller (HTC Vive), low-fidelity virtual pointing (Leap Motion), or high-fidelity virtual pointing (a tracked VR glove). Our results revealed that low-fidelity pointing performed worse than both high-fidelity pointing and the VR controller. Overall, target selection performance was worse in depth planes closer to the maximum arm's reach than at middle and nearer distances.
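The ISO 9241-9 circular layout and Fitts' law difficulty metric mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the study's implementation: the amplitude, width, depth fraction, and alternating selection order shown here are generic placeholders drawn from the standard, not the experiment's actual parameter values.

```python
import math

def iso9241_targets(n=13, amplitude=0.30, width=0.025, depth=0.75):
    """Place n spherical targets evenly on a circle of diameter `amplitude`
    (metres), centred at eye level, at a fraction `depth` of the user's
    maximum arm reach. Selection order alternates across the circle, as is
    common in ISO 9241-9 multidirectional tapping tasks."""
    targets = []
    for i in range(n):
        # visit targets in an alternating pattern across the circle
        k = (i * (n // 2)) % n
        theta = 2 * math.pi * k / n
        x = (amplitude / 2) * math.cos(theta)
        y = (amplitude / 2) * math.sin(theta)
        targets.append((x, y, depth, width))
    return targets

def fitts_id(amplitude, width):
    """Shannon formulation of Fitts' index of difficulty (bits)."""
    return math.log2(amplitude / width + 1)
```

For example, an amplitude of 0.30 m with a width of 0.025 m gives an index of difficulty of about 3.7 bits; crossing the three amplitudes with the three widths yields the range of difficulties sampled per trial set.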
Award ID(s):
2007435
NSF-PAR ID:
10437595
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
Page Range / eLocation ID:
404 to 413
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Problem-solving focuses on defining and analyzing problems, then finding viable solutions through an iterative process that requires brainstorming and an understanding of what is known and unknown in the problem space. With rapid changes in the economic landscape of the United States, new types of jobs emerge as new industries are created. Employers report that problem-solving is the most important skill they look for in job applicants. However, there are major concerns about the lack of problem-solving skills in engineering students, which calls for an approach to measure and enhance these skills. In this research, we propose to understand and improve problem-solving skills in engineering education by integrating eye-tracking sensing with virtual reality (VR) manufacturing. First, we simulate a manufacturing system in a VR game environment that we call a VR learning factory. The VR learning factory is built in the Unity game engine with the HTC Vive VR system for navigation and motion tracking. The headset is custom-fitted with Tobii eye-tracking technology, allowing the system to identify the coordinates and objects that a user is looking at, at any given time during the simulation. Through the headset, engineering students see a virtual manufacturing environment composed of a series of workstations and can interact with workpieces in the virtual environment. For example, a student can pick up virtual plastic bricks and assemble them using the wireless controller in hand. Second, engineering students are asked to design and assemble toy cars that satisfy predefined customer requirements while minimizing the total cost of production. Third, data-driven models are developed to analyze the eye-movement patterns of engineering students.
    For instance, problem-solving skills are measured by the extent to which the eye-movement patterns of engineering students are similar to the pattern of a subject matter expert (SME), an ideal person who sets the expert criterion for the toy car assembly process. Benchmark experiments are conducted with a comprehensive set of performance metrics such as cycle time, the number of station switches, and the weight, price, and quality of the toy cars. Experimental results show that eye-tracking modeling is efficient and effective in measuring the problem-solving skills of engineering students. The proposed VR learning factory was integrated into undergraduate manufacturing courses to enhance student learning and problem-solving skills.
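One common way to score how similar a student's eye-movement pattern is to an expert's is to encode each scanpath as a sequence of areas of interest (AOIs) and compare the sequences with edit distance. The abstract does not specify the paper's actual similarity measure, so the sketch below is a hypothetical, minimal version of that general idea:

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two sequences."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

def scanpath_similarity(student, expert):
    """Normalised similarity in [0, 1]: 1.0 means identical AOI sequences."""
    dist = levenshtein(student, expert)
    return 1 - dist / max(len(student), len(expert), 1)
```

For instance, if the expert fixates workstations in the order "ABCD" and a student follows "ABED", the edit distance is 1 and the similarity score is 0.75; higher scores would indicate eye-movement patterns closer to the SME criterion.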
  2.
    Like many natural sciences, a critical component of archaeology is field work. Despite its importance, field opportunities are available to few students for financial and logistical reasons. With little exposure to archaeological research, fewer students are entering archaeology, particularly minority students (Smith 2004; Wilson 2015). To counter these trends, we have leveraged the ongoing revolution in consumer electronics for the current, digitally-empowered generation by creating a game-based, virtual archaeology curriculum to 1) teach foundational principles of a discipline that is challenging to present in a traditional classroom by using sensory and cognitive immersion; and 2) allow wider access to a field science that has previously been limited to only select students. Virtual reality (VR) is computer technology that creates a simulated three-dimensional world for a user to experience in a bodily way, thereby transforming data analysis into a sensory and cognitive experience. Using a widely available, room-scale VR platform, we have created a virtual archaeological excavation experience that conveys two overarching classroom objectives: 1) teach the physical methods of archaeological excavation by providing the setting and tools for a student to actively engage in field work; and 2) teach archaeological concepts using a scientific approach to problem solving by couching them within a role-playing game. The current prototype was developed with the HTC Vive VR platform, which includes a headset, hand controllers, and two base stations to track the position and orientation of the user's head and hands within a 4 x 4 meter area. Environments were developed using Unreal Engine 4, a freely available game engine, to maximize usability for different audiences, learning objectives, and skill levels.
Given the inherent fun of games and widespread interest in archaeology and cultural heritage, the results of this research are adaptable and applicable to learners of all ages in formal and informal educational settings. 
  3. In this video, we show high-fidelity numerical results for supersonic spatially-developing turbulent boundary layers (SDTBL) under strong concave and convex curvatures at Mach 2.86. The selected numerical tool is Direct Numerical Simulation (DNS) with high spatial/temporal resolution. The prescribed concave geometry is based on the experimental study by Donovan et al. (J. Fluid Mech., 259, 1-24, 1994). Turbulent inflow conditions are based on data extracted from a previous DNS over a flat plate (i.e., turbulence precursors). The comprehensive DNS information sheds important light on the transport phenomena inside turbulent boundary layers subject to strong deceleration, or Adverse Pressure Gradient (APG), caused by concave walls, as well as to strong acceleration, or Favorable Pressure Gradient (FPG), caused by convex walls, at different wall thermal conditions (i.e., cold, adiabatic, and hot walls). Here, the selected scientific visualization tool is Virtual Reality (VR): vortex-core iso-surfaces are extracted via the Q-criterion and converted to a file format readable by the HTC Vive VR toolkit. The reader is invited to visit our Virtual Wind Tunnel (VWT) under a fully immersive environment for further details. The video is available at: https://gfm.aps.org/meetings/dfd-2022/6313a60c199e4c2da9a946bc
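The Q-criterion used above to extract vortex cores compares the local rotation rate with the strain rate: Q = 0.5(||Omega||^2 - ||S||^2), where S and Omega are the symmetric and antisymmetric parts of the velocity-gradient tensor, and iso-surfaces of Q > 0 mark rotation-dominated regions. A minimal sketch of the pointwise computation (not the DNS code itself), evaluated on two analytic velocity-gradient tensors:

```python
def q_criterion(grad):
    """Q = 0.5 * (||Omega||^2 - ||S||^2) from the velocity-gradient tensor
    grad[i][j] = d u_i / d x_j (2x2 or 3x3). S is the strain-rate tensor
    (symmetric part), Omega the rotation tensor (antisymmetric part);
    Q > 0 marks rotation-dominated, vortex-core regions."""
    n = len(grad)
    S2 = O2 = 0.0
    for i in range(n):
        for j in range(n):
            s = 0.5 * (grad[i][j] + grad[j][i])  # strain-rate component
            o = 0.5 * (grad[i][j] - grad[j][i])  # rotation component
            S2 += s * s
            O2 += o * o
    return 0.5 * (O2 - S2)

# solid-body rotation u = (-y, x): purely rotational, so Q > 0
rotation = [[0.0, -1.0], [1.0, 0.0]]
# simple shear u = (y, 0): strain and rotation magnitudes cancel, so Q = 0
shear = [[0.0, 1.0], [0.0, 0.0]]
```

In a DNS post-processing pipeline this would be evaluated at every grid point from finite-difference velocity gradients, and an iso-surface of a chosen positive Q level would be triangulated for export to the VR toolkit.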
  4. Emerging Virtual Reality (VR) displays with embedded eye trackers are becoming commodity hardware (e.g., HTC Vive Pro Eye). Eye-tracking data can be utilized for several purposes, including gaze monitoring, privacy protection, and user authentication/identification. Identifying users is an integral part of many applications due to security and privacy concerns. In this paper, we explore methods and eye-tracking features that can be used to identify users. Prior VR researchers have explored machine learning on motion-based data (such as body motion, head tracking, eye tracking, and hand tracking data) to identify users. Such systems usually require an explicit VR task and many features to train the machine learning model for user identification. We propose a system that identifies users from minimal eye-gaze-based features without designing any identification-specific tasks. We collected gaze data from an educational VR application and tested our system with two machine learning (ML) models, random forest (RF) and k-nearest-neighbors (kNN), and two deep learning (DL) models, convolutional neural networks (CNN) and long short-term memory (LSTM). Our results show that the ML and DL models could identify users with over 98% accuracy using only six simple eye-gaze features. We discuss our results, their implications for security and privacy, and the limitations of our work.
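The kNN approach named above can be illustrated with a minimal pure-Python sketch. Everything here is hypothetical scaffolding: the six feature names in the comment and the numeric values are invented for illustration and are not the paper's actual features, data, or model configuration.

```python
import math
from collections import Counter

def knn_identify(train, query, k=3):
    """train: list of (feature_vector, user_id) pairs; query: feature vector.
    Returns the majority user id among the k nearest neighbours
    under Euclidean distance."""
    dists = sorted((math.dist(vec, query), uid) for vec, uid in train)
    votes = Counter(uid for _, uid in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 6-dimensional gaze-feature vectors per sample, e.g.
# (mean fixation duration, fixation count, mean saccade amplitude,
#  saccade velocity, pupil diameter, gaze dispersion) -- illustrative only.
train = [
    ((0.25, 40, 4.1, 90, 3.2, 0.8), "userA"),
    ((0.27, 38, 4.0, 95, 3.1, 0.9), "userA"),
    ((0.40, 22, 6.5, 150, 3.8, 1.4), "userB"),
    ((0.42, 20, 6.7, 145, 3.9, 1.5), "userB"),
]
```

A real system would additionally normalise features to comparable scales (raw Euclidean distance is dominated by large-magnitude features such as fixation count) and evaluate with held-out sessions per user.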
  5. This is one of the first accounts of a security analysis of consumer immersive Virtual Reality (VR) systems. This work breaks new ground, coins new terms, and constructs proof-of-concept implementations of attacks on immersive VR. Our work used the two most widely adopted immersive VR systems, the HTC Vive and the Oculus Rift. More specifically, we were able to create attacks that can potentially disorient users, turn their Head Mounted Display (HMD) camera on without their knowledge, overlay images in their field of vision, and modify VR environmental factors that force them into hitting physical objects and walls. Finally, we illustrate through a human-participant deception study the success of exploiting VR systems to control immersed users and move them to a location in physical space without their knowledge. We term this the Human Joystick Attack. We conclude our work with future research directions and ways to enhance the security of these systems.