

Title: Elastomeric Haptic Devices for Virtual and Augmented Reality
Abstract

Since the modern concepts for virtual and augmented reality were first introduced in the 1960s, the field has strived to develop technologies for an immersive user experience in a fully or partially virtual environment. Despite the great progress in visual and auditory technologies, haptics has seen much slower technological advances. The challenge arises because skin has densely packed mechanoreceptors distributed over a very large area with complex topography; devising an apparatus as targeted as an audio speaker or television for the localized sensory input of an ear canal or iris is far more difficult. Furthermore, the soft and sensitive nature of the skin makes it difficult to apply solid-state electronic solutions that can address large areas without causing discomfort. The maturing field of soft robotics offers potential solutions to this challenge. In this article, the definition and history of virtual reality (VR) and augmented reality (AR) are first reviewed. Then an overview of haptic output and input technologies is presented, opportunities for soft robotics are identified, and mechanisms of intrinsically soft actuators and sensors are introduced. Finally, soft haptic output and input devices are reviewed and categorized by device form, and examples of soft haptic devices in VR/AR environments are presented.

 
Award ID(s): 1830924
NSF-PAR ID: 10449044
Author(s) / Creator(s):  ;  ;
Publisher / Repository: Wiley Blackwell (John Wiley & Sons)
Date Published:
Journal Name: Advanced Functional Materials
Volume: 31
Issue: 39
ISSN: 1616-301X
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise the user experience. By identifying users' muscle activation patterns while they engage in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real time with 3.3% mean error and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision that our findings will push research towards more realistic physicality in future VR/AR.
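    As a rough illustration of the decoding pipeline described above, the sketch below regresses a short window of multi-channel forearm EMG onto per-finger force estimates. It is a minimal sketch, not the authors' published model: the channel count, window length, network architecture, and training signal here are all assumptions.

```python
# Minimal sketch of an EMG-to-finger-force decoder, loosely following the
# pipeline described in the abstract. All sizes (8 EMG channels, 200-sample
# windows, 5 fingers) and the MLP architecture are illustrative assumptions,
# not the authors' published model.
import torch
import torch.nn as nn

N_CHANNELS = 8   # assumed number of forearm EMG electrodes
WINDOW = 200     # assumed samples per decoding window
N_FINGERS = 5    # one force estimate per finger

class EMGForceDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                        # (batch, ch, win) -> (batch, ch*win)
            nn.Linear(N_CHANNELS * WINDOW, 256),
            nn.ReLU(),
            nn.Linear(256, 64),
            nn.ReLU(),
            nn.Linear(64, N_FINGERS),            # finger-wise force estimates
        )

    def forward(self, emg_window):
        return self.net(emg_window)

# Training loop sketch: regress decoded forces against ground-truth readings
# from a force sensor, minimizing mean squared error.
model = EMGForceDecoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

emg = torch.randn(32, N_CHANNELS, WINDOW)   # stand-in for a recorded batch
force = torch.randn(32, N_FINGERS)          # stand-in for measured forces

for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(emg), force)
    loss.backward()
    optimizer.step()
```

    In a deployment like the one described, the trained network would run on streaming windows in real time, and per-user calibration could plausibly fine-tune only the final layer rather than retraining the whole model.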
  2. Abstract

    Emerging virtual and augmented reality technologies can transform human activities in myriad domains, lending tangible, embodied form to digital data, services, and information. Haptic technologies will play a critical role in enabling humans to touch and interact with the contents of these virtual environments. The immense variety of skilled manual tasks that humans perform in real environments is only possible through the coordination of touch sensation, perception, and movement that together comprise the haptic modality. Consequently, many research groups are vigorously investigating haptic technologies for virtual reality. A longstanding research goal in this area has been to create haptic interfaces that allow their users to touch and feel plausibly realistic virtual objects. In this progress report, a perspective on this unresolved research challenge is shared, guided by the observation that no existing technology can even approximately match the capabilities of the human sense of touch. Factors that make it challenging to engineer haptic technologies for virtual reality are identified, including the extraordinary spatial and temporal tactile acuity of the skin and the complex interplay between continuum mechanics, haptic perception, and interaction. A perspective on how these challenges may be overcome through convergent research on haptic perception, mechanics, electronics, and material technologies is presented.

     
  3. Abstract

    Successful surgical operations are characterized by preplanned routines that are executed during the actual operation. To achieve this, surgeons rely on experience acquired from the use of cadavers, enabling technologies like virtual reality (VR), and years of clinical practice. However, cadavers lack dynamism and realism because they have no blood and can exhibit tissue degradation and shrinkage, while current VR systems do not provide amplified haptic feedback. This can impact surgical training, increasing the likelihood of medical errors. This work proposes a novel Mixed Reality Combination System (MRCS) that pairs Augmented Reality (AR) technology and an inertial measurement unit (IMU) sensor with 3D printed, collagen-based specimens to enhance performance on tasks such as planning and execution. To achieve this, the MRCS charts out a path prior to task execution, based on a visual, physical, and dynamic representation of the target object's state: surgeon-created virtual imagery, projected onto a 3D printed biospecimen as AR, reacts visually to user input on the specimen's actual physical state. This allows the MRCS to respond to the user in real time, displaying new multi-sensory virtual states of an object before the user acts on its actual physical state, which enables effective task planning. User actions tracked with an integrated 9-degree-of-freedom (9-DoF) IMU demonstrate task execution, showing that a user with limited knowledge of the specific anatomy can, under guidance, execute a preplanned task. In addition to surgical planning, this system can be applied more generally in areas such as construction, maintenance, and education.
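    To make the IMU-based tracking step concrete, here is a minimal sketch of one common way to estimate orientation from such a sensor: a complementary filter that blends integrated gyroscope rates with accelerometer tilt. The filter gain, sample rate, and pitch/roll-only scope are assumptions for illustration; the abstract does not specify the MRCS tracking algorithm.

```python
# Illustrative complementary filter for tracking pitch and roll from a 9-DoF
# IMU's gyroscope and accelerometer. The gain and sample rate are assumed
# values; this is not the MRCS authors' tracking pipeline.
import math

ALPHA = 0.98   # assumed gain: trust the gyro short-term, the accel long-term
DT = 0.01      # assumed 100 Hz sample period

def update_orientation(pitch, roll, gyro, accel):
    """One filter step. gyro = (gx, gy, gz) in rad/s; accel = (ax, ay, az) in g."""
    gx, gy, _ = gyro
    ax, ay, az = accel
    # Integrate angular rate to propagate the previous orientation estimate.
    pitch_gyro = pitch + gy * DT
    roll_gyro = roll + gx * DT
    # Recover absolute tilt from the gravity vector seen by the accelerometer.
    pitch_acc = math.atan2(-ax, math.hypot(ay, az))
    roll_acc = math.atan2(ay, az)
    # Blend: gyro dominates at high frequency, accel corrects slow drift.
    pitch = ALPHA * pitch_gyro + (1 - ALPHA) * pitch_acc
    roll = ALPHA * roll_gyro + (1 - ALPHA) * roll_acc
    return pitch, roll

# Example: a stationary, level sensor should hold near-zero tilt.
p = r = 0.0
for _ in range(100):
    p, r = update_orientation(p, r, gyro=(0.0, 0.0, 0.0), accel=(0.0, 0.0, 1.0))
print(round(p, 4), round(r, 4))
```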

     
  4. Annotation in 3D user interfaces such as Augmented Reality (AR) and Virtual Reality (VR) is a challenging and promising area; however, there are currently no surveys reviewing these contributions. In order to provide a survey of annotation for Extended Reality (XR) environments, we conducted a structured literature review of papers that used annotation in their AR/VR systems, published between 2001 and 2021. Our literature review process consisted of several filtering steps, which resulted in 103 XR publications with a focus on annotation. We classified these papers based on display technologies, input devices, annotation types, the target object under annotation, collaboration type, modalities, and collaborative technologies. A survey of annotation in XR is an invaluable resource for researchers and newcomers. Finally, we provide a database of the collected information for each reviewed paper. This information includes the applications, display technologies and their annotators, input devices, modalities, annotation types, interaction techniques, collaboration types, and tasks for each paper. This database provides rapid access to the collected data and gives users the ability to search or filter the required information. This survey provides a starting point for anyone interested in researching annotation in XR environments.
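    To make the shape of one such database record concrete, the sketch below models the classification axes named above as a small data structure. The field names and the example entry are assumptions based on the categories listed in the abstract, not the survey's actual schema.

```python
# Illustrative record structure for one reviewed paper, using the
# classification axes named in the abstract. Field names and the example
# entry are assumptions, not the survey's actual database schema.
from dataclasses import dataclass, field

@dataclass
class XRAnnotationPaper:
    title: str
    year: int
    display_technology: str           # e.g. head-mounted display, handheld AR
    input_devices: list[str] = field(default_factory=list)
    annotation_types: list[str] = field(default_factory=list)
    target_object: str = ""           # object under annotation
    collaboration_type: str = "none"  # e.g. none, co-located, remote
    modalities: list[str] = field(default_factory=list)

# Filtering the database, e.g. for remote-collaboration papers:
papers = [
    XRAnnotationPaper(
        title="(hypothetical entry)", year=2019,
        display_technology="head-mounted display",
        input_devices=["controller"], annotation_types=["text", "freehand"],
        target_object="3D model", collaboration_type="remote",
        modalities=["visual"],
    ),
]
remote = [p for p in papers if p.collaboration_type == "remote"]
```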
  5. Drones are increasingly used during routine inspections of bridges to improve data consistency, work efficiency, inspector safety, and cost effectiveness. Most drones, however, are operated manually within visual line of sight and are thus unable to inspect long-span bridges that are not completely visible to operators. In this paper, aerial nondestructive evaluation (aNDE) is envisioned for elevated structures such as bridges, buildings, dams, nuclear power plants, and tunnels. To enable aerial nondestructive testing (aNDT), a human-robot system will be created to integrate haptic sensing and dexterous manipulation into a drone or a structural crawler in augmented/virtual reality (AR/VR) for beyond-visual-line-of-sight (BVLOS) inspection of bridges. Some of the technical challenges and potential solutions associated with aerial nondestructive testing and evaluation (aNDT&E) are presented. Example applications of the advanced technologies are demonstrated on simulated bridge decks with stipulated conditions. The developed human-robot system can transform current on-site inspection into future tele-inspection, minimizing impact to traffic passing over the bridges. The automated tele-inspection can save as much as 75% in time and 95% in cost.

     