This content will become publicly available on June 1, 2025
- Award ID(s):
- 1920182
- PAR ID:
- 10541692
- Editor(s):
- Chen, Jessie Y; Fragomeni, G
- Publisher / Repository:
- Springer, Cham - Lecture Notes in Computer Science (LNCS, volume 14707)
- Date Published:
- Page Range / eLocation ID:
- 283–297
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Numerous applications of Virtual Reality (VR) and Augmented Reality (AR) continue to emerge. However, many of the current mechanisms for providing input in those environments still require the user to perform actions (e.g., pressing a number of buttons, tilting a stick) that are neither natural nor intuitive. It would be desirable to enable users of 3D virtual environments to interact with them through natural hand gestures. The implementation of a glove capable of tracking the movement and configuration of a user's hand has been pursued by multiple groups in the past. One of the most recent approaches tracks the motion of the hand and fingers using miniature sensor modules with magnetic and inertial sensors. Unfortunately, the limited quality of the signals from those sensors, and frequent deviations from the assumptions made in the design of their operation, have prevented the implementation of a tracking glove able to achieve high performance and large-scale acceptance. This paper describes our development of a proof-of-concept glove that incorporates motion sensors and a signal processing algorithm designed to maintain high tracking performance even in locations that are challenging for these sensors (e.g., where the geomagnetic field is distorted by nearby ferromagnetic objects). We describe the integration of the required components, the rationale and outline of the tracking algorithms, and the virtual reality environment in which the tracking results drive the movements of a hand model. We also describe the protocol that will be used to evaluate the performance of the glove.
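The abstract does not detail the signal processing algorithm, but the problem it names (magnetometer readings corrupted by nearby ferromagnetic objects) is commonly handled by gating the magnetic correction on field magnitude. The sketch below is an illustrative single-axis complementary-filter step under that assumption; the function name, parameters, and defaults are hypothetical, not the authors' method.

```python
def update_orientation(yaw, gyro_z, mag_heading, mag_magnitude, dt,
                       expected_field=50.0, tolerance=10.0, alpha=0.02):
    """One filter step for a yaw angle (degrees).

    Integrates the gyroscope rate, then blends in the magnetometer
    heading only when the measured field magnitude (microtesla) is
    close to the expected geomagnetic value -- a simple way to reject
    readings distorted by nearby ferromagnetic objects.
    """
    # Dead-reckon from the gyroscope.
    yaw_pred = yaw + gyro_z * dt
    # Gate the magnetometer: trust it only in an undistorted field.
    if abs(mag_magnitude - expected_field) <= tolerance:
        # Blend toward the magnetic heading (wrap-aware error).
        err = ((mag_heading - yaw_pred + 180.0) % 360.0) - 180.0
        yaw_pred += alpha * err
    return yaw_pred % 360.0
```

In a distorted field the filter simply coasts on the gyroscope, trading slow drift for immunity to large heading jumps.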
-
In this study, we developed a new haptic–mixed reality intravenous (HMR-IV) needle insertion simulation system, providing a bimanual haptic interface integrated into a mixed reality system with programmable variabilities considering real clinical environments. The system was designed for nursing students or healthcare professionals to practice IV needle insertion into a virtual arm with unlimited attempts under various changing insertion conditions (e.g., skin: color, texture, stiffness, friction; vein: size, shape, location depth, stiffness, friction). To achieve accurate hand–eye coordination under dynamic mixed reality scenarios, two different haptic devices (Dexmo and Geomagic Touch) and a standalone mixed reality system (HoloLens 2) were integrated and synchronized through multistep calibration across different coordinate systems (real world, virtual world, mixed reality world, haptic interface world, HoloLens camera). In addition, the force-profile-based haptic rendering proposed in this study successfully mimicked the real tactile feeling of IV needle insertion. Further, a global hand-tracking method, combining two depth sensors (HoloLens and Leap Motion), was developed to accurately track a haptic glove and simulate grasping a virtual hand with force feedback. We conducted an evaluation study with 20 participants (9 experts and 11 novices) to measure the usability of the HMR-IV simulation system and user performance under various insertion conditions. The quantitative results from our own metric and qualitative results from the NASA Task Load Index demonstrate the usability of our system.
-
Locus is a NIME designed specifically for an interactive, immersive high-density loudspeaker array environment. The system is based on a pointing mechanism for interacting with a sound scene comprising 128 speakers. Users can point anywhere to interact with the system, and because the spatial interaction utilizes motion capture, it does not require a screen. Instead, it is completely controlled via hand gestures using a glove populated with motion-tracking markers. The main purpose of this system is to offer intuitive physical interaction with the perimeter-based spatial sound sources. Further, its goal is to minimize user-worn technology and thereby enhance freedom of motion by utilizing environmental sensing devices, such as motion capture cameras or infrared sensors. The ensuing creativity-enabling technology is applicable to a broad array of possible scenarios, from researching the limits of human spatial hearing perception to facilitating learning and artistic performances, including dance. Below we describe our NIME design and implementation, its preliminary assessment, and offer a Unity-based toolkit to facilitate its broader deployment and adoption.
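The pointing mechanism is not specified beyond "point anywhere to interact". One straightforward reading, sketched below as a simplified 2D assumption (the actual Locus mapping may differ), is to pick the speaker on the perimeter whose direction from the hand best matches the pointing ray.

```python
import math

def ring_speakers(n=128, radius=5.0):
    """Speaker positions on a circular perimeter (simplified 2D layout)."""
    return [(radius * math.cos(2 * math.pi * k / n),
             radius * math.sin(2 * math.pi * k / n)) for k in range(n)]

def select_speaker(origin, direction, speakers):
    """Index of the speaker most closely aligned with the pointing ray.

    origin: hand position; direction: pointing vector (both 2D).
    Chooses the speaker maximizing cosine similarity with the ray.
    """
    dx, dy = direction
    dlen = math.hypot(dx, dy)
    best_i, best_cos = 0, -2.0
    for i, (sx, sy) in enumerate(speakers):
        vx, vy = sx - origin[0], sy - origin[1]
        c = (dx * vx + dy * vy) / (dlen * math.hypot(vx, vy))
        if c > best_cos:
            best_i, best_cos = i, c
    return best_i
```

A real deployment would extend this to 3D rays from motion-capture marker data and likely spread energy over neighboring speakers rather than snapping to one.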
-
As eye tracking can reduce the computational burden of virtual reality devices through a technique known as foveated rendering, we believe not only that eye tracking will be implemented in all virtual reality devices, but that eye-tracking biometrics will become the standard method of authentication in virtual reality. Thus, we have created a real-time, eye-movement-driven authentication system for virtual reality devices. In this work, we describe the architecture of the system and provide a specific implementation using the FOVE head-mounted display. We end with an exploration of future topics of research to spur thought and discussion.
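The abstract does not describe the authentication pipeline itself. A common baseline in eye-movement biometrics, sketched below purely for illustration, is to segment a gaze trace into fixations and saccades with a velocity threshold (I-VT), summarize it with a small feature vector, and accept when the probe is close to an enrolled template. Thresholds, features, and the matching rule here are all assumptions, not the authors' system.

```python
import math

def saccade_features(gaze, dt=0.01, vel_threshold=30.0):
    """Mean saccade velocity and fixation ratio from a gaze trace.

    gaze: list of (x, y) gaze angles in degrees, sampled every dt s.
    Samples whose angular velocity exceeds vel_threshold (deg/s) are
    classed as saccadic (a simple I-VT-style segmentation).
    """
    vels = [math.hypot(x1 - x0, y1 - y0) / dt
            for (x0, y0), (x1, y1) in zip(gaze, gaze[1:])]
    sacc = [v for v in vels if v >= vel_threshold]
    fix_ratio = 1.0 - len(sacc) / len(vels)
    mean_sacc = sum(sacc) / len(sacc) if sacc else 0.0
    return mean_sacc, fix_ratio

def authenticate(probe, template, tol=(50.0, 0.1)):
    """Accept if each probe feature is within tolerance of the template."""
    return all(abs(p - t) <= e for p, t, e in zip(probe, template, tol))
```

Real systems use far richer feature sets and learned similarity scores; this merely shows the enroll-then-match structure.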
-
Haptic devices are in general more adept at mimicking the bulk properties of materials than they are at mimicking the surface properties. Herein, a haptic glove is described which is capable of producing sensations reminiscent of three types of near‐surface properties: hardness, temperature, and roughness. To accomplish this mixed mode of stimulation, three types of haptic actuators are combined: vibrotactile motors, thermoelectric devices, and electrotactile electrodes made from a stretchable conductive polymer synthesized in the laboratory. This polymer consists of a stretchable polyanion which serves as a scaffold for the polymerization of poly(3,4‐ethylenedioxythiophene). The scaffold is synthesized using controlled radical polymerization to afford material of low dispersity, relatively high conductivity, and low impedance relative to metals. The glove is equipped with flex sensors to make it possible to control a robotic hand and a hand in virtual reality (VR). In psychophysical experiments, human participants are able to discern combinations of electrotactile, vibrotactile, and thermal stimulation in VR. Participants trained to associate these sensations with roughness, hardness, and temperature have an overall accuracy of 98%, whereas untrained participants have an accuracy of 85%. Sensations can similarly be conveyed using a robotic hand equipped with sensors for pressure and temperature.
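The glove above drives three actuator channels to convey three surface properties. As a purely hypothetical sketch of that mixed-mode mapping (the channel-to-property pairing and all scaling below are assumptions, not taken from the paper): hardness to vibrotactile amplitude, temperature to thermoelectric heating or cooling relative to skin temperature, and roughness to electrotactile intensity.

```python
def actuator_commands(hardness, temperature_c, roughness, skin_temp_c=33.0):
    """Map virtual-surface properties to three actuator drive levels.

    hardness, roughness: normalized 0..1; temperature_c: degrees C.
    Returns one command per actuator type; the pairing of properties
    to channels here is an illustrative assumption.
    """
    clamp = lambda v: max(0.0, min(1.0, v))
    return {
        # Harder surfaces -> stronger vibrotactile transient on contact.
        "vibrotactile": clamp(hardness),
        # Sign selects heating (+) vs cooling (-) on the Peltier element.
        "thermoelectric": temperature_c - skin_temp_c,
        # Rougher textures -> stronger electrotactile stimulation.
        "electrotactile": clamp(roughness),
    }
```

A touch on a cool, moderately hard, slightly rough virtual surface would thus cool the fingertip while driving modest vibrotactile and electrotactile levels.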