
This content will become publicly available on February 1, 2026

Title: Interaction Glove for 3-D Virtual Environments Based on an RGB-D Camera and Magnetic, Angular Rate, and Gravity Micro-Electromechanical System Sensors
This paper presents the theoretical foundation, practical implementation, and empirical evaluation of a glove for interaction with 3-D virtual environments. At the dawn of the “Spatial Computing Era”, where users continuously interact with 3-D Virtual and Augmented Reality environments, the need for a practical and intuitive interaction system that can efficiently engage 3-D elements is becoming pressing. Over the last few decades, there have been attempts to provide such an interaction mechanism using a glove. However, glove systems are currently not in widespread use due to their high cost and, we propose, due to their inability to sustain high levels of performance under certain situations. Performance deterioration has been observed due to the distortion of the local magnetic field caused by ordinary ferromagnetic objects present near the glove’s operating space. There are several areas where reliable hand-tracking gloves could provide a next generation of improved solutions, such as American Sign Language training and automatic translation to text, as well as training and evaluation for activities that require high motor skills in the hands (e.g., playing some musical instruments, training of surgeons). While the use of a hand-tracking glove toward these goals seems intuitive, some of the currently available glove systems may not meet the accuracy and reliability levels required for those use cases. This paper describes our concept of an interaction glove instrumented with miniature magnetic, angular rate, and gravity (MARG) sensors and aided by a single camera. The camera used is an off-the-shelf red, green, and blue–depth (RGB-D) camera. We describe a proof-of-concept implementation of the system using our custom “GMVDK” orientation estimation algorithm. This paper also describes the glove’s empirical evaluation with human-subject performance tests.
The results show that the prototype glove, using the GMVDK algorithm, is able to operate without performance losses, even in magnetically distorted environments.
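The abstract does not disclose the internals of the authors' GMVDK algorithm. As a minimal illustration of the general class of MARG orientation estimators it belongs to, the sketch below implements a generic Mahony-style complementary filter: first-order gyroscope integration whose drift is corrected by the accelerometer's gravity direction. All names (`fuse_step`, the gain `kp`) are hypothetical and not from the paper.

```python
import numpy as np

def quat_mult(p, q):
    # Hamilton product of two quaternions [w, x, y, z].
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def quat_conj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def rotate(q, v):
    # Rotate a 3-vector v by a unit quaternion q.
    qv = np.array([0.0, *v])
    return quat_mult(quat_mult(q, qv), quat_conj(q))[1:]

def fuse_step(q, gyro, accel, dt, kp=1.0):
    """One Mahony-style complementary-filter step (illustrative, not GMVDK):
    the accelerometer's gravity direction corrects gyro-integration drift."""
    a = np.asarray(accel, float)
    a = a / np.linalg.norm(a)
    # Gravity direction the current estimate predicts in the body frame.
    v = rotate(quat_conj(q), np.array([0.0, 0.0, 1.0]))
    # Proportional correction, zero when prediction matches the measurement.
    e = np.cross(a, v)
    omega = np.asarray(gyro, float) + kp * e
    # First-order quaternion propagation, then renormalization.
    q = q + 0.5 * quat_mult(q, np.array([0.0, *omega])) * dt
    return q / np.linalg.norm(q)
```

A full MARG filter would add a magnetometer term for heading; the paper's contribution is precisely keeping that heading reference usable when the local field is distorted.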
Award ID(s):
1920182
PAR ID:
10637002
Author(s) / Creator(s):
Publisher / Repository:
MDPI
Date Published:
Journal Name:
Information
Volume:
16
Issue:
2
ISSN:
2078-2489
Page Range / eLocation ID:
127
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Numerous applications of Virtual Reality (VR) and Augmented Reality (AR) continue to emerge. However, many of the current mechanisms to provide input in those environments still require the user to perform actions (e.g., press a number of buttons, tilt a stick) that are not natural or intuitive. It would be desirable to enable users of 3D virtual environments to use natural hand gestures to interact with the environments. The implementation of a glove capable of tracking the movement and configuration of a user’s hand has been pursued by multiple groups in the past. One of the most recent approaches consists of tracking the motion of the hand and fingers using miniature sensor modules with magnetic and inertial sensors. Unfortunately, the limited quality of the signals from those sensors and the frequent deviation from the assumptions made in the design of their operation have prevented the implementation of a tracking glove able to achieve high performance and large-scale acceptance. This paper describes our development of a proof-of-concept glove that incorporates motion sensors and a signal processing algorithm designed to maintain high tracking performance even in locations that are challenging to these sensors (e.g., where the geomagnetic field is distorted by nearby ferromagnetic objects). We describe the integration of the required components, the rationale and outline of the tracking algorithms, and the virtual reality environment in which the tracking results drive the movements of the model of a hand. We also describe the protocol that will be used to evaluate the performance of the glove.
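The abstract above attributes tracking failures to geomagnetic distortion near ferromagnetic objects. A common first line of defense, sketched here as an illustrative assumption rather than the paper's method, is to flag magnetometer samples whose field magnitude strays from the expected local geomagnetic strength:

```python
import numpy as np

def is_mag_disturbed(mag, ref_norm_ut=48.0, tol=0.15):
    """Flag a magnetometer sample (microtesla) as distorted when its magnitude
    deviates from the local geomagnetic reference by more than tol (15%).
    ref_norm_ut = 48.0 is a placeholder mid-latitude value; a real system
    would calibrate it on site."""
    n = np.linalg.norm(np.asarray(mag, float))
    return abs(n - ref_norm_ut) / ref_norm_ut > tol
```

While the flag is raised, a fusion filter can simply drop the magnetometer correction and coast on gyroscope and accelerometer data, bounding heading drift instead of absorbing a corrupted reference.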
  2. Chen, Jessie Y; Fragomeni, G (Ed.)
    Numerous applications of Virtual Reality (VR) and Augmented Reality (AR) continue to emerge. However, many of the current mechanisms to provide input in those environments still require the user to perform actions (e.g., press a number of buttons, tilt a stick) that are not natural or intuitive. It would be desirable to enable users of 3D virtual environments to use natural hand gestures to interact with the environments. The implementation of a glove capable of tracking the movement and configuration of a user’s hand has been pursued by multiple groups in the past. One of the most recent approaches consists of tracking the motion of the hand and fingers using miniature sensor modules with magnetic and inertial sensors. Unfortunately, the limited quality of the signals from those sensors and the frequent deviation from the assumptions made in the design of their operation have prevented the implementation of a tracking glove able to achieve high performance and large-scale acceptance. This paper describes our development of a proof-of-concept glove that incorporates motion sensors and a signal processing algorithm designed to maintain high tracking performance even in locations that are challenging to these sensors (e.g., where the geomagnetic field is distorted by nearby ferromagnetic objects). We describe the integration of the required components, the rationale and outline of the tracking algorithms, and the virtual reality environment in which the tracking results drive the movements of the model of a hand. We also describe the protocol that will be used to evaluate the performance of the glove.
  3. We present CrazyJoystick, a flyable handheld joystick that allows seamless switching between joystick and hand-tracking interaction while delivering on-demand haptic feedback in extended reality (XR). Our system comprises a quadrotor that can autonomously approach the user when needed, addressing the limitation of conventional handheld and wearable devices, which must be carried throughout the interaction. CrazyJoystick dynamically reallocates all thrust to haptic rendering during stationary states, eliminating the need to hover while delivering feedback. A customized cage allows users to grasp the device and interact with virtual objects, receiving 3.5-degree-of-freedom feedback. This transition method harvests the aerial mobility of multi-rotor-based haptic devices while retaining the high force-to-weight ratio of a handheld device during interaction. This paper describes the design and implementation of CrazyJoystick, and evaluates its force and torque performance and the usability of the system in three VR applications. Our evaluation of torque rendering found that users can perceive the direction with an accuracy of 92.2%. User studies further indicated that the system significantly improves presence in VR environments. Participants found on-demand haptic feedback intuitive and enjoyable, emphasizing the potential of CrazyJoystick to redefine immersive interactions in XR through portable and adaptive feedback mechanisms.
  4. In this study, we developed a new haptic–mixed reality intravenous (HMR-IV) needle insertion simulation system, providing a bimanual haptic interface integrated into a mixed reality system with programmable variabilities considering real clinical environments. The system was designed for nursing students or healthcare professionals to practice IV needle insertion into a virtual arm with unlimited attempts under various changing insertion conditions (e.g., skin: color, texture, stiffness, friction; vein: size, shape, location, depth, stiffness, friction). To achieve accurate hand–eye coordination under dynamic mixed reality scenarios, two different haptic devices (Dexmo and Geomagic Touch) and a standalone mixed reality system (HoloLens 2) were integrated and synchronized through multistep calibration for different coordinate systems (real world, virtual world, mixed reality world, haptic interface world, HoloLens camera). In addition, the force-profile-based haptic rendering proposed in this study was able to successfully mimic the real tactile feeling of IV needle insertion. Further, a global hand-tracking method, combining two depth sensors (HoloLens and Leap Motion), was developed to accurately track a haptic glove and simulate grasping a virtual hand with force feedback. We conducted an evaluation study with 20 participants (9 experts and 11 novices) to measure the usability of the HMR-IV simulation system with user performance under various insertion conditions. The quantitative results from our own metric and qualitative results from the NASA Task Load Index demonstrate the usability of our system.
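The multistep calibration described above (real world, virtual world, mixed reality world, haptic interface world, HoloLens camera) amounts, at its core, to composing rigid-body transforms between frames. The sketch below shows that composition with 4×4 homogeneous matrices; the specific frames and numeric values are hypothetical, chosen only to make the chaining concrete.

```python
import numpy as np

def make_T(R, t):
    # Build a 4x4 homogeneous transform from a 3x3 rotation and a translation.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def apply_T(T, p):
    # Map a 3-D point through a homogeneous transform.
    return (T @ np.append(np.asarray(p, float), 1.0))[:3]

# Hypothetical calibration results: world <- haptic (pure translation)
# and display <- world (90-degree rotation about z).
T_world_haptic = make_T(np.eye(3), [1.0, 0.0, 0.0])
R_z90 = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
T_display_world = make_T(R_z90, [0.0, 0.0, 0.0])

# Composing the calibrated pairs maps the haptic frame directly
# into the display frame, with no intermediate conversions at runtime.
T_display_haptic = T_display_world @ T_world_haptic
```

The practical payoff of precomputing the composed transform is that every tracked point crosses the whole frame chain with a single matrix multiply per sample.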
  5. M. Kurosu and A. Hashizume (Ed.)
    There is increasing interest in using low-cost and lightweight Micro Electro-Mechanical System (MEMS) modules containing tri-axial accelerometers, gyroscopes and magnetometers for tracking the motion of segments of the human body. We are specifically interested in using these devices, called “Magnetic, Angular-Rate and Gravity” (“MARG”) modules, to develop an instrumented glove, assigning one of these MARG modules to monitor the (absolute) 3-D orientation of each of the proximal and middle phalanges of the fingers of a computer user. This would provide real-time monitoring of the hand gestures of the user, enabling non-vision gesture recognition approaches that do not degrade with line-of-sight disruptions or longer distance from the cameras. However, orientation estimation from low-cost MEMS MARG modules has been shown to degrade in areas where the geomagnetic field is distorted by the presence of ferromagnetic objects (which are common in contemporary environments). This paper describes the continued evolution of our algorithm to obtain robust MARG orientation estimates, even in magnetically distorted environments. In particular, the paper describes a new self-contained version of the algorithm, i.e., one requiring no information from external devices, in contrast to the previous versions. Keywords: MARG module · Orientation Estimation · Magnetic Disturbance
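The abstract above does not specify how its self-contained algorithm handles magnetic distortion. As a minimal sketch of the general idea of gating the magnetometer, and explicitly not the authors' algorithm, the code below computes a tilt-compensated heading from accelerometer and magnetometer data, but only when the field magnitude is plausible; otherwise it returns `None` so the filter can coast on the other sensors. The sign convention in `heading` is one arbitrary but self-consistent choice.

```python
import numpy as np

def heading(accel, mag):
    """Tilt-compensated heading (rad): project the magnetic field onto the
    horizontal plane defined by the accelerometer's gravity estimate."""
    up = np.asarray(accel, float)
    up = up / np.linalg.norm(up)
    m = np.asarray(mag, float)
    m_h = m - np.dot(m, up) * up  # horizontal component of the field
    return np.arctan2(-m_h[1], m_h[0])

def gated_heading(accel, mag, ref_norm, tol=0.15):
    # Reject the magnetometer while its magnitude is implausible relative
    # to the local geomagnetic reference, so the orientation filter coasts
    # on gyro + accel instead of absorbing a corrupted heading.
    n = np.linalg.norm(np.asarray(mag, float))
    if abs(n - ref_norm) / ref_norm > tol:
        return None
    return heading(accel, mag)
```

Magnitude gating alone cannot catch distortions that preserve the field strength while bending its direction, which is one reason robust algorithms such as the one described above remain an active research topic.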