The goal of the field of haptics is to create technologies that manipulate the sense of touch. In virtual and augmented reality, haptic devices are for touch what loudspeakers and RGB displays are for hearing and vision. Haptic systems that utilize micromotors or other miniaturized mechanical devices (e.g., for vibration and pneumatic actuation) produce interesting effects, but are quite far from reproducing the feeling of real materials. They are especially deficient in recapitulating surface properties: fine texture, friction, viscoelasticity, tack, and softness. The central argument of this progress report is that in order to reproduce the feel of everyday objects, molecular control must be established over the properties of materials; ultimately, such control will enable the design of materials which can change these properties in real time. Stimuli‐responsive organic materials, such as polymers and composites, are a class of materials which can change their oxidation state, conductivity, shape, and rheological properties, and thus might be useful in future haptic technologies. Moreover, the use of such materials in research on tactile perception could help elucidate the limits of human tactile sensitivity. The work described represents the beginnings of this new area of inquiry, in which the defining approach is the marriage of materials science and psychology.
Emerging virtual and augmented reality technologies can transform human activities in myriad domains, lending tangible, embodied form to digital data, services, and information. Haptic technologies will play a critical role in enabling humans to touch and interact with the contents of these virtual environments. The immense variety of skilled manual tasks that humans perform in real environments is possible only through the coordination of touch sensation, perception, and movement that together comprise the haptic modality. Consequently, many research groups are vigorously investigating haptic technologies for virtual reality. A longstanding research goal in this area has been to create haptic interfaces that allow their users to touch and feel plausibly realistic virtual objects. In this progress report, a perspective on this unresolved research challenge is shared, guided by the observation that no current technologies can even approximately match the capabilities of the human sense of touch. Factors that have made it challenging to engineer haptic technologies for virtual reality, including the extraordinary spatial and temporal tactile acuity of the skin and the complex interplay between continuum mechanics, haptic perception, and interaction, are identified. A perspective on how these challenges may be overcome through convergent research on haptic perception, mechanics, electronics, and material technologies is presented.
- PAR ID: 10449799
- Publisher / Repository: Wiley Blackwell (John Wiley & Sons)
- Journal Name: Advanced Functional Materials
- Volume: 31
- Issue: 39
- ISSN: 1616-301X
- Sponsoring Org: National Science Foundation
More Like this
Touch as a modality in social communication has been getting more attention with recent developments in wearable technology and an increase in awareness of how limited physical contact can lead to touch starvation and feelings of depression. Although several mediated touch methods have been developed for conveying emotional support, the transfer of emotion through mediated touch has not been widely studied. This work addresses this need by exploring emotional communication through a novel wearable haptic system. The system records physical touch patterns through an array of force sensors, processes the recordings using novel gesture-based algorithms to create actuator control signals, and generates mediated social touch through an array of voice coil actuators. We conducted a human subject study (N = 20) to understand the perception and emotional components of this mediated social touch for common social touch gestures, including poking, patting, massaging, squeezing, and stroking. Our results show that the speed of the virtual gesture significantly alters the participants' ratings of valence, arousal, realism, and comfort of these gestures, with increased speed producing negative emotions and decreased realism. The findings from the study will allow us to better recognize general patterns in the perception of mediated touch and determine how mediated social touch can be used to convey emotion. Our system design, signal processing methods, and results can provide guidance for future mediated social touch design.
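The abstract does not spell out how the recorded gestures are turned into actuator commands, so the following Python sketch only illustrates the general shape of such a pipeline: a force-sensor-array recording, a gesture-speed parameter (the variable manipulated in the user study), and a spatial mapping onto a voice-coil array. All function names, weights, and parameter values are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def gesture_to_actuator_drive(force_frames, sensor_xy, actuator_xy,
                              speed_scale=1.0, fs=500.0):
    """Map a recorded touch gesture onto a voice-coil actuator array.

    force_frames : (T, S) forces from S recording sensors over T samples
    sensor_xy    : (S, 2) sensor positions on the recording array
    actuator_xy  : (A, 2) actuator positions on the wearable display
    speed_scale  : >1 replays the gesture faster, <1 slower
    fs           : sampling rate of the recording in Hz
    """
    T, S = force_frames.shape

    # Resample in time so the same gesture can be replayed at a
    # different speed (the parameter varied in the user study).
    t_old = np.arange(T) / fs
    t_new = np.arange(0.0, t_old[-1], speed_scale / fs)
    resampled = np.stack(
        [np.interp(t_new, t_old, force_frames[:, s]) for s in range(S)],
        axis=1)

    # Spatial mapping: drive each actuator with an inverse-distance-
    # weighted blend of the sensors closest to it.
    dist = np.linalg.norm(actuator_xy[:, None, :] - sensor_xy[None, :, :],
                          axis=2)
    weights = 1.0 / (dist + 1e-3)
    weights /= weights.sum(axis=1, keepdims=True)   # (A, S)
    drive = resampled @ weights.T                   # (T', A)

    # Normalize to the actuators' safe drive range [0, 1].
    return drive / max(drive.max(), 1e-9)
```

A stroking gesture recorded once could then be replayed with, say, speed_scale = 2.0 to generate the faster variants that the study found to feel less realistic and less pleasant.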
Recent advances in extended reality (XR) technologies make seeing and hearing virtual objects commonplace, yet strategies for synthesizing haptic interactions with virtual objects continue to be limited. Two design principles govern the rendering of believable and intuitive haptic feedback: movement through open space must feel "free," while contact with virtual objects must feel stiff. Herein, a novel multisensory approach is presented that conveys proprioception and effort through illusory visual feedback and refers discrete and continuous interaction forces, which would otherwise occur at the hands and fingertips, to the wrist via a bracelet interface. Results demonstrate that users reliably discriminate the stiffness of virtual buttons when provided with multisensory pseudohaptic feedback, comprising tactile pseudohaptic feedback (discrete vibrotactile feedback and continuous squeeze cues in a bracelet interface) and visual pseudohaptic illusions of touch interactions. Compared to the use of tactile or visual pseudohaptic feedback alone, multisensory pseudohaptic feedback expands the range of physical stiffnesses that are intuitively associated with the rendered virtual interactions and reduces individual differences in physical-to-virtual stiffness mappings. This multisensory approach, which leaves users' hands unencumbered, provides a flexible framework for synthesizing a wide array of touch-enabled interactions in XR, with great potential for enhancing user experiences.
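The abstract does not describe the rendering algorithm, but pseudohaptic stiffness is often produced by manipulating the control-display ratio: the stiffer the virtual button, the less its on-screen travel for the same physical hand motion, while the bracelet supplies a continuous squeeze cue and a discrete click. The sketch below, in Python with hypothetical parameter values, illustrates that general idea and is not the authors' implementation.

```python
def pseudohaptic_button(physical_press_mm, virtual_stiffness,
                        reference_stiffness=200.0, max_travel_mm=4.0):
    """Render a virtual button press from the user's physical hand motion.

    A stiffer virtual button is drawn moving less than the hand actually
    moved (a smaller control-display ratio), which users tend to read as
    a harder, stiffer press.
    """
    cd_ratio = reference_stiffness / virtual_stiffness    # < 1 feels stiffer
    visual_depth_mm = min(physical_press_mm * cd_ratio, max_travel_mm)
    bottomed_out = visual_depth_mm >= max_travel_mm

    # Bracelet cues: continuous squeeze grows with compression; a discrete
    # vibrotactile "click" would be triggered once when bottoming out.
    squeeze_cmd = visual_depth_mm / max_travel_mm
    return visual_depth_mm, squeeze_cmd, bottomed_out
```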
Since the modern concepts for virtual and augmented reality were first introduced in the 1960s, the field has strived to develop technologies for immersive user experience in a fully or partially virtual environment. Despite the great progress in visual and auditory technologies, haptics has seen much slower technological advances. The challenge arises because skin has densely packed mechanoreceptors distributed over a very large area with complex topography; devising an apparatus as targeted as an audio speaker or television for the localized sensory input of an ear canal or iris is more difficult. Furthermore, the soft and sensitive nature of the skin makes it difficult to apply solid-state electronic solutions that can address large areas without causing discomfort. The maturing field of soft robotics offers potential solutions to this challenge. In this article, the definition and history of virtual reality (VR) and augmented reality (AR) are first reviewed. Then an overview of haptic output and input technologies is presented, opportunities for soft robotics are identified, and mechanisms of intrinsically soft actuators and sensors are introduced. Finally, soft haptic output and input devices are reviewed with categorization by device form, and examples of soft haptic devices in VR/AR environments are presented.
In this study, we developed a new haptic–mixed reality intravenous (HMR-IV) needle insertion simulation system, providing a bimanual haptic interface integrated into a mixed reality system with programmable variability that reflects real clinical environments. The system was designed for nursing students or healthcare professionals to practice IV needle insertion into a virtual arm with unlimited attempts under various changing insertion conditions (e.g., skin: color, texture, stiffness, friction; vein: size, shape, location depth, stiffness, friction). To achieve accurate hand–eye coordination under dynamic mixed reality scenarios, two different haptic devices (Dexmo and Geomagic Touch) and a standalone mixed reality system (HoloLens 2) were integrated and synchronized through multistep calibration across the different coordinate systems (real world, virtual world, mixed reality world, haptic interface world, HoloLens camera). In addition, the force-profile-based haptic rendering proposed in this study successfully mimicked the real tactile feel of IV needle insertion. Further, a global hand-tracking method combining two depth sensors (HoloLens and Leap Motion) was developed to accurately track a haptic glove and simulate grasping a virtual hand with force feedback. We conducted an evaluation study with 20 participants (9 experts and 11 novices) to measure the usability of the HMR-IV simulation system with user performance under various insertion conditions. The quantitative results from our own metric and qualitative results from the NASA Task Load Index demonstrate the usability of our system.
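The force-profile-based rendering mentioned above is not specified in the abstract; needle-insertion simulators commonly approximate it with a piecewise force-versus-depth curve in which force ramps up as a tissue layer deforms, drops sharply at puncture, and retains a friction term along the shaft. The Python sketch below is one such approximation with hypothetical parameters, not the published model.

```python
def needle_insertion_force(depth_mm,
                           skin_depth_mm=2.0, vein_depth_mm=8.0,
                           skin_stiffness=0.6, vein_stiffness=0.3,
                           friction_per_mm=0.05):
    """Piecewise needle force (N) as a function of insertion depth (mm).

    The stiffness and depth parameters could be varied per scenario,
    mirroring the programmable skin and vein conditions described above.
    """
    depth_mm = max(depth_mm, 0.0)
    friction = friction_per_mm * depth_mm

    if depth_mm < skin_depth_mm:
        # Skin deforming under the tip, not yet punctured.
        return skin_stiffness * depth_mm + friction
    if depth_mm < vein_depth_mm:
        # After skin puncture: mostly shaft friction through tissue.
        return friction
    # Deforming and then puncturing the vein wall.
    return vein_stiffness * (depth_mm - vein_depth_mm) + friction
```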