Title: Haptic Perception, Mechanics, and Material Technologies for Virtual Reality
Abstract

Emerging virtual and augmented reality technologies can transform human activities in myriad domains, lending tangible, embodied form to digital data, services, and information. Haptic technologies will play a critical role in enabling humans to touch and interact with the contents of these virtual environments. The immense variety of skilled manual tasks that humans perform in real environments is only possible through the coordination of touch sensation, perception, and movement that together comprise the haptic modality. Consequently, many research groups are vigorously investigating haptic technologies for virtual reality. A longstanding research goal in this area has been to create haptic interfaces that allow their users to touch and feel plausibly realistic virtual objects. In this progress report, a perspective on this unresolved research challenge is shared, guided by the observation that no existing technology can even approximately match the capabilities of the human sense of touch. Factors that have made it challenging to engineer haptic technologies for virtual reality are identified, including the extraordinary spatial and temporal tactile acuity of the skin and the complex interplay between continuum mechanics, haptic perception, and interaction. A perspective on how these challenges may be overcome through convergent research on haptic perception, mechanics, electronics, and material technologies is presented.

 
NSF-PAR ID: 10449799
Author(s) / Creator(s):
Publisher / Repository: Wiley Blackwell (John Wiley & Sons)
Date Published:
Journal Name: Advanced Functional Materials
Volume: 31
Issue: 39
ISSN: 1616-301X
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

    The goal of the field of haptics is to create technologies that manipulate the sense of touch. In virtual and augmented reality, haptic devices are for touch what loudspeakers and RGB displays are for hearing and vision. Haptic systems that utilize micromotors or other miniaturized mechanical devices (e.g., for vibration and pneumatic actuation) produce interesting effects, but are quite far from reproducing the feeling of real materials. They are especially deficient in recapitulating surface properties: fine texture, friction, viscoelasticity, tack, and softness. The central argument of this progress report is that in order to reproduce the feel of everyday objects, molecular control must be established over the properties of materials; ultimately, such control will enable the design of materials which can change these properties in real time. Stimuli‐responsive organic materials, such as polymers and composites, are a class of materials which can change their oxidation state, conductivity, shape, and rheological properties, and thus might be useful in future haptic technologies. Moreover, the use of such materials in research on tactile perception could help elucidate the limits of human tactile sensitivity. The work described represents the beginnings of this new area of inquiry, in which the defining approach is the marriage of materials science and psychology.

     
  2. Touch as a modality in social communication has been receiving more attention with recent developments in wearable technology and an increased awareness of how limited physical contact can lead to touch starvation and feelings of depression. Although several mediated touch methods have been developed for conveying emotional support, the transfer of emotion through mediated touch has not been widely studied. This work addresses this need by exploring emotional communication through a novel wearable haptic system. The system records physical touch patterns through an array of force sensors, processes the recordings using novel gesture-based algorithms to create actuator control signals, and generates mediated social touch through an array of voice coil actuators. We conducted a human subject study (N = 20) to understand the perception and emotional components of this mediated social touch for common social touch gestures, including poking, patting, massaging, squeezing, and stroking. Our results show that the speed of the virtual gesture significantly alters the participants' ratings of valence, arousal, realism, and comfort of these gestures, with increased speed producing negative emotions and decreased realism. The findings from the study will allow us to better recognize generic patterns in human mediated touch perception and determine how mediated social touch can be used to convey emotion. Our system design, signal processing methods, and results can provide guidance for future mediated social touch design.
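    As a rough illustration of the kind of sensor-to-actuator pipeline this abstract describes, the sketch below maps recorded force-sensor frames to per-actuator amplitude envelopes and scales their intensity with an estimated gesture speed. All names, array sizes, the one-to-one sensor-to-actuator mapping, and the speed-based scaling are assumptions for illustration only, not the authors' published algorithms.

```python
# Hypothetical sketch of a mediated-touch pipeline: force-sensor recordings in,
# voice-coil drive envelopes out. Sizes, rates, and the mapping are assumed.
import numpy as np

N_SENSORS = 16      # force-sensing elements in the recording array (assumed)
N_ACTUATORS = 16    # voice coil actuators, mapped one-to-one to sensors (assumed)
SAMPLE_RATE = 500   # Hz, assumed sampling rate of the force array

def estimate_gesture_speed(force_frames: np.ndarray) -> float:
    """Estimate how fast the contact centroid sweeps across the sensor array.

    force_frames: (time, N_SENSORS) array of force readings.
    Returns mean centroid speed in sensor indices per second.
    """
    totals = force_frames.sum(axis=1) + 1e-9
    positions = np.arange(N_SENSORS)
    centroids = (force_frames * positions).sum(axis=1) / totals
    return float(np.abs(np.diff(centroids)).mean() * SAMPLE_RATE)

def make_actuator_signals(force_frames: np.ndarray) -> np.ndarray:
    """Map recorded touch to per-actuator amplitude envelopes in [0, 1]."""
    # Normalize by the global force maximum, then scale with gesture speed so
    # faster strokes render more strongly -- purely an illustrative choice.
    envelopes = force_frames / (force_frames.max() + 1e-9)
    speed_gain = np.clip(estimate_gesture_speed(force_frames) / N_SENSORS, 0.1, 1.0)
    return np.clip(envelopes * speed_gain, 0.0, 1.0)

# Example: a simulated stroking gesture sweeping across the array over one second
t = np.arange(0, 1.0, 1.0 / SAMPLE_RATE)
frames = np.exp(-((np.arange(N_SENSORS) - 15 * t[:, None]) ** 2) / 2.0)
drive = make_actuator_signals(frames)
print(drive.shape)  # (500, 16): amplitude envelope for each voice coil actuator
```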
  3. Gonzalez, D. (Ed.)

    Today’s research on human-robot teaming requires the ability to test artificial intelligence (AI) algorithms for perception and decision-making in complex real-world environments. Field experiments, also referred to as experiments “in the wild,” do not provide the level of detailed ground truth necessary for thorough performance comparisons and validation. Experiments on pre-recorded real-world data sets are also significantly limited in their usefulness because they do not allow researchers to test the effectiveness of active robot perception, control, or decision strategies in the loop. Additionally, research on large human-robot teams requires tests and experiments that are too costly even for industry and may result in considerable time losses when experiments go awry. The novel Real-Time Human Autonomous Systems Collaborations (RealTHASC) facility at Cornell University interfaces real and virtual robots and humans with photorealistic simulated environments by implementing new concepts for the seamless integration of wearable sensors, motion capture, physics-based simulations, robot hardware, and virtual reality (VR). The result is an extended reality (XR) testbed in which real robots and humans in the laboratory are able to experience virtual worlds, inclusive of virtual agents, through real-time visual feedback and interaction. VR body tracking by DeepMotion is employed in conjunction with the OptiTrack motion capture system to transfer every human subject and robot in the real physical laboratory space into a synthetic virtual environment, thereby constructing corresponding human/robot avatars that not only mimic the behaviors of the real agents but also experience the virtual world through virtual sensors and transmit the sensor data back to the real human/robot agents, all in real time. New cross-domain synthetic environments are created in RealTHASC using Unreal Engine™, bridging the simulation-to-reality gap and allowing for the inclusion of underwater/ground/aerial autonomous vehicles, each equipped with a multi-modal sensor suite. The experimental capabilities offered by RealTHASC are demonstrated through three case studies showcasing mixed real/virtual human/robot interactions in diverse domains, leveraging and complementing the benefits of experimentation in simulation and in the real world.
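    The core loop this abstract describes is a real-to-virtual round trip: tracked poses of real agents drive avatars in a simulated world, and simulated sensor readings flow back to the real agents. The sketch below illustrates one cycle of such a loop; the class and method names, the sensor model, and the data format are hypothetical placeholders, not the RealTHASC, OptiTrack, DeepMotion, or Unreal Engine APIs.

```python
# Illustrative sketch of a real-to-virtual testbed loop: motion-capture pose in,
# virtual sensor packet out. All interfaces here are assumed, not the facility's APIs.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float
    yaw: float

class VirtualAvatar:
    """Mirror of a real human or robot inside the simulated environment."""
    def __init__(self, name: str):
        self.name = name
        self.pose = Pose(0.0, 0.0, 0.0, 0.0)

    def update_pose(self, pose: Pose) -> None:
        # In a real facility this would be a call into the game engine;
        # here we simply store the latest tracked pose.
        self.pose = pose

    def read_virtual_sensor(self) -> dict:
        # Stand-in for a simulated range sensor queried at the avatar's pose.
        return {"agent": self.name, "obstacle_range_m": max(0.0, 5.0 - self.pose.x)}

def realtime_step(avatar: VirtualAvatar, tracked_pose: Pose) -> dict:
    """One cycle of the loop: mocap pose in, virtual sensor data out."""
    avatar.update_pose(tracked_pose)
    return avatar.read_virtual_sensor()

# Example: a ground-robot avatar updated after the real robot moves 1 m forward
robot_avatar = VirtualAvatar("ugv_01")
feedback = realtime_step(robot_avatar, Pose(1.0, 0.0, 0.0, 0.0))
print(feedback)  # packet that would be streamed back to the real robot
```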

     
  4. Abstract

    Since the modern concepts of virtual and augmented reality were first introduced in the 1960s, the field has strived to develop technologies for an immersive user experience in a fully or partially virtual environment. Despite great progress in visual and auditory technologies, haptics has seen much slower technological advances. The challenge arises because skin has densely packed mechanoreceptors distributed over a very large area with complex topography; devising an apparatus as targeted as an audio speaker or television for the localized sensory input of an ear canal or iris is more difficult. Furthermore, the soft and sensitive nature of the skin makes it difficult to apply solid-state electronic solutions that can address large areas without causing discomfort. The maturing field of soft robotics offers potential solutions toward this challenge. In this article, the definition and history of virtual reality (VR) and augmented reality (AR) are first reviewed. Then an overview of haptic output and input technologies is presented, opportunities for soft robotics are identified, and mechanisms of intrinsically soft actuators and sensors are introduced. Finally, soft haptic output and input devices are reviewed and categorized by device form, and examples of soft haptic devices in VR/AR environments are presented.

     