

Title: Chemical Haptics: Rendering Haptic Sensations via Topical Stimulants
We propose a new class of haptic devices that provide haptic sensations by delivering liquid-stimulants to the user's skin; we call this chemical haptics. Upon absorbing these stimulants, which contain safe and small doses of key active ingredients, receptors in the user's skin are chemically triggered, rendering distinct haptic sensations. We identified five chemicals that can render lasting haptic sensations: tingling (sanshool), numbing (lidocaine), stinging (cinnamaldehyde), warming (capsaicin), and cooling (menthol). To enable the application of our novel approach in a variety of settings (such as VR), we engineered a self-contained wearable that can be worn anywhere on the user's skin (e.g., face, arms, legs). Implemented as a soft silicone patch, our device uses micropumps to push the liquid stimulants through channels that are open to the user's skin, enabling topical stimulants to be absorbed by the skin as they pass through. Our approach presents two unique benefits. First, it enables sensations, such as numbing, not possible with existing haptic devices. Second, our approach offers a new pathway, via the skin's chemical receptors, for achieving multiple haptic sensations using a single actuator, which would otherwise require combining multiple actuators (e.g., Peltier, vibration motors, electro-tactile stimulation). We evaluated our approach by means of two studies. In our first study, we characterized the temporal profiles of sensations elicited by each chemical. Using these insights, we designed five interactive VR experiences utilizing chemical haptics, and in our second user study, participants rated these VR experiences with chemical haptics as more immersive than without. Finally, as the first work exploring the use of chemical haptics on the skin, we offer recommendations to designers for how they may employ our approach for their interactive experiences.  more » « less
Award ID(s):
2047189
NSF-PAR ID:
10317680
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
ACM Symposium on User Interface Software and Technology
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We propose a haptic device that alters the perceived softness of real rigid objects without requiring the objects to be instrumented. Instead, our haptic device works by restricting the user's fingerpad lateral deformation via a hollow frame that squeezes the sides of the fingerpad. This causes the fingerpad to bulge more than it originally did; when users touch an object's surface with their now-restricted fingerpad, they feel the object to be softer than it is. To illustrate the extent of the softness illusion induced by our device, touching the tip of a wooden chopstick will feel as soft as a rubber eraser. Our haptic device operates by pulling the hollow frame using a motor. Unlike most wearable haptic devices, which cover the user's fingerpad to create force sensations, our device creates softness while leaving the center of the fingerpad free, which allows users to feel most of the object they are interacting with. This makes our device a unique contribution to altering the softness of everyday objects, creating "buttons" by softening protrusions of existing appliances or tangibles, or even altering the softness of handheld props for VR. Finally, we validated our device through two studies: (1) a psychophysics study showed that the device reduces the perceived hardness of any object between 50A and 90A to around 40A (on the Shore A hardness scale); and (2) a user study demonstrated that participants preferred our device for interactive applications that leverage haptic props, such as making a VR prop feel softer or making the button of a rigid 3D-printed remote control feel softer.
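The psychophysics result above amounts to a simple perceptual mapping: real hardness in the 50A-90A range is perceived at roughly 40A through the device. A minimal sketch, assuming the abstract's numbers and treating values outside the characterized range as unchanged (an assumption the study does not address):

```python
# Hypothetical model of the reported psychophysics result. The 50A-90A input
# range and the ~40A perceived value are from the abstract; the function form
# (a flat mapping) is an illustrative simplification, not the authors' model.

def perceived_shore_a(real_hardness, device_on=True):
    """Perceived Shore A hardness of an object touched through the device."""
    if not device_on:
        return real_hardness
    if 50 <= real_hardness <= 90:
        return 40  # device pulls perception down to ~40A
    return real_hardness  # outside the characterized range: assume no change

print(perceived_shore_a(75))  # 40
```

Under this reading, a chopstick (hardwood, well above 40A) and a remote-control button would both land near the hardness of a rubber eraser when touched through the frame.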
  2. Abstract

    Since the modern concepts of virtual and augmented reality were first introduced in the 1960s, the field has strived to develop technologies for immersive user experience in a fully or partially virtual environment. Despite great progress in visual and auditory technologies, haptics has seen much slower technological advances. The challenge arises because skin has densely packed mechanoreceptors distributed over a very large area with complex topography; devising an apparatus as targeted as an audio speaker or television for the localized sensory input of an ear canal or eye is far more difficult. Furthermore, the soft and sensitive nature of the skin makes it difficult to apply solid-state electronic solutions that can address large areas without causing discomfort. The maturing field of soft robotics offers potential solutions to this challenge. In this article, the definitions and history of virtual reality (VR) and augmented reality (AR) are first reviewed. Then an overview of haptic output and input technologies is presented, opportunities for soft robotics are identified, and mechanisms of intrinsically soft actuators and sensors are introduced. Finally, soft haptic output and input devices are reviewed, categorized by device form, and examples of soft haptic devices in VR/AR environments are presented.

     
  3. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR. 
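The decoding pipeline described above (surface EMG features in, finger-wise forces out) can be sketched in miniature. This is not the authors' neural network: the mean-absolute-value feature, the linear readout, and the channel/finger counts in the example are all stand-in assumptions to show the shape of the computation.

```python
# Illustrative sketch of an EMG-to-force decoder, not the paper's model.
# Input: a window of multi-channel EMG samples. Feature: mean absolute value
# (MAV) per channel. Readout: one weight row per finger (linear stand-in for
# the learned neural network).

def decode_finger_forces(emg_window, weights):
    """Return one force estimate per finger from a window of EMG samples."""
    n = len(emg_window)                      # samples in the window
    n_channels = len(weights[0])
    mav = [sum(abs(sample[ch]) for sample in emg_window) / n
           for ch in range(n_channels)]
    # force_f = sum over channels of weights[f][ch] * mav[ch]
    return [sum(w * m for w, m in zip(row, mav)) for row in weights]

# Toy example: 2 channels, 2 "fingers"; only channel 0 is active.
window = [(1.0, 0.0), (-1.0, 0.0)]
identity_weights = [[1.0, 0.0], [0.0, 1.0]]
print(decode_finger_forces(window, identity_weights))  # [1.0, 0.0]
```

In the real system the weights would come from calibration data per user, which is what makes the reported 3.3% mean error and quick generalization to new users notable.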
  4. Haptic interfaces can be used to add sensations of touch to virtual and augmented reality experiences. Soft, flexible devices that deliver spatiotemporal patterns of touch across the body, potentially with full-body coverage, are of particular interest for a range of applications in medicine, sports and gaming. Here we report a wireless haptic interface of this type, with the ability to display vibro-tactile patterns across large areas of the skin in single units or through a wirelessly coordinated collection of them. The lightweight and flexible designs of these systems incorporate arrays of vibro-haptic actuators at a density of 0.73 actuators per square centimetre, which exceeds the two-point discrimination threshold for mechanical sensation on the skin across nearly all the regions of the body except the hands and face. A range of vibrant sensations and information content can be passed to mechanoreceptors in the skin via time-dependent patterns and amplitudes of actuation controlled through the pressure-sensitive touchscreens of smart devices, in real-time with negligible latency. We show that this technology can be used to convey navigation instructions, to translate musical tracks into tactile patterns and to support sensory replacement feedback for the control of robotic prosthetics. 
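The density claim above can be sanity-checked with back-of-envelope arithmetic: 0.73 actuators per square centimetre on a square grid implies a centre-to-centre spacing of about 1.2 cm, comfortably below typical two-point discrimination distances on the torso and limbs (several centimetres), though not below those of the hands or face. The square-grid assumption is mine, not the paper's.

```python
# Spacing implied by the reported actuator density (square-grid assumption).
import math

density = 0.73                       # actuators per cm^2, from the text
spacing_cm = math.sqrt(1.0 / density)
print(round(spacing_cm, 2))          # 1.17 cm between neighbouring actuators
```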
  5. Wearable haptic devices transmit information via touch receptors in the skin, yet devices located on parts of the body with high densities of receptors, such as fingertips and hands, impede interactions. Other locations that are well‐suited for wearables, such as the wrists and arms, suffer from lower perceptual sensitivity. The emergence of textile‐based wearable devices introduces new techniques of fabrication that can be leveraged to address these constraints and enable new modes of haptic interactions. This article formalizes the concept of “multiscale” interaction, an untapped paradigm for haptic wearables, enabling enhanced delivery of information via textile‐based haptic modules. In this approach, users choose the depth and detail of their haptic experiences by varying their interaction mode. Flexible prototyping methods enable multiscale haptic bands that provide both body‐scale interactions (on the forearm) and hand‐scale interactions (on the fingers and palm). A series of experiments assess participants’ ability to identify pressure states and spatial locations delivered by these bands across these interaction scales. A final experiment demonstrates the encoding of three‐bit information into prototypical multiscale interactions, showcasing the paradigm's efficacy. This research lays the groundwork for versatile haptic communication and wearable design, offering users the ability to select interaction modes for receiving information. 