Title: Chemical Haptics: Rendering Haptic Sensations via Topical Stimulants
We propose a new class of haptic devices that provide haptic sensations by delivering liquid stimulants to the user's skin; we call this chemical haptics. Upon absorbing these stimulants, which contain safe and small doses of key active ingredients, receptors in the user's skin are chemically triggered, rendering distinct haptic sensations. We identified five chemicals that can render lasting haptic sensations: tingling (sanshool), numbing (lidocaine), stinging (cinnamaldehyde), warming (capsaicin), and cooling (menthol). To enable the application of our novel approach in a variety of settings (such as VR), we engineered a self-contained wearable that can be worn anywhere on the user's skin (e.g., face, arms, legs). Implemented as a soft silicone patch, our device uses micropumps to push the liquid stimulants through channels that are open to the user's skin, enabling topical stimulants to be absorbed by the skin as they pass through. Our approach presents two unique benefits. First, it enables sensations, such as numbing, not possible with existing haptic devices. Second, our approach offers a new pathway, via the skin's chemical receptors, for achieving multiple haptic sensations using a single actuator, which would otherwise require combining multiple actuators (e.g., Peltier, vibration motors, electro-tactile stimulation). We evaluated our approach by means of two studies. In our first study, we characterized the temporal profiles of sensations elicited by each chemical. Using these insights, we designed five interactive VR experiences utilizing chemical haptics, and in our second user study, participants rated these VR experiences with chemical haptics as more immersive than without. Finally, as the first work exploring the use of chemical haptics on the skin, we offer recommendations to designers for how they may employ our approach for their interactive experiences.
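The paper itself includes no code, but as a rough illustration of the actuation concept, the minimal Python sketch below shows how an application might sequence micropump bursts for the five stimulants. The pump interface, channel assignments, and timing values are assumptions for illustration, not the authors' implementation; in practice, onset and duration figures would come from a temporal-profile characterization like the one in the paper's first study.

```python
# Hypothetical sketch: sequencing micropump activation for chemical haptics.
# The pump driver, channel numbers, and timing values are illustrative
# assumptions, not the authors' implementation.
import time
from dataclasses import dataclass

@dataclass
class Stimulant:
    name: str
    pump_channel: int
    onset_s: float      # assumed delay before the sensation is felt
    duration_s: float   # assumed length of a single pump burst

STIMULANTS = {
    "tingling": Stimulant("sanshool", 0, onset_s=30.0, duration_s=5.0),
    "numbing":  Stimulant("lidocaine", 1, onset_s=60.0, duration_s=5.0),
    "stinging": Stimulant("cinnamaldehyde", 2, onset_s=20.0, duration_s=5.0),
    "warming":  Stimulant("capsaicin", 3, onset_s=30.0, duration_s=5.0),
    "cooling":  Stimulant("menthol", 4, onset_s=15.0, duration_s=5.0),
}

def set_pump(channel: int, on: bool) -> None:
    """Stand-in for a micropump driver (e.g., a GPIO or serial command)."""
    print(f"pump[{channel}] {'ON' if on else 'OFF'}")

def render_sensation(kind: str) -> None:
    """Push one burst of stimulant through the skin-facing channel."""
    s = STIMULANTS[kind]
    set_pump(s.pump_channel, True)
    time.sleep(s.duration_s)
    set_pump(s.pump_channel, False)
    print(f"{s.name}: expect onset after roughly {s.onset_s:.0f} s")

if __name__ == "__main__":
    render_sensation("cooling")   # e.g., cue a cold VR environment
```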
Award ID(s):
2047189
NSF-PAR ID:
10317680
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
ACM Symposium on User Interface Software and Technology
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We propose a haptic device that alters the perceived softness of real rigid objects without requiring the objects to be instrumented. Instead, our haptic device works by restricting the lateral deformation of the user's fingerpad via a hollow frame that squeezes the sides of the fingerpad. This causes the fingerpad to become bulgier than it originally was—when users touch an object's surface with their now-restricted fingerpad, they feel the object to be softer than it is. To illustrate the extent of the softness illusion induced by our device, touching the tip of a wooden chopstick will feel as soft as a rubber eraser. Our haptic device operates by pulling the hollow frame using a motor. Unlike most wearable haptic devices, which cover up the user's fingerpad to create force sensations, our device creates softness while leaving the center of the fingerpad free, which allows users to feel most of the object they are interacting with. This makes our device a unique contribution to altering the softness of everyday objects, creating “buttons” by softening protrusions of existing appliances or tangibles, or even altering the softness of handheld props for VR. Finally, we validated our device through two studies: (1) a psychophysics study showed that the device brings down the perceived softness of any object between 50A and 90A to around 40A (on the Shore A hardness scale); and (2) a user study demonstrated that participants preferred our device for interactive applications that leverage haptic props, such as making a VR prop feel softer or making a rigid 3D-printed remote control feel softer on its button.
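As a rough illustration of the control concept in this related work (not the authors' implementation), the sketch below maps a desired perceived softness to a frame-pull distance via a per-user calibration table. The motor interface, calibration pairs, and all values are assumptions.

```python
# Hypothetical sketch: choosing how far to pull the fingerpad-squeezing frame
# for a desired perceived softness. The motor interface and the calibration
# table (Shore A value -> pull distance) are illustrative assumptions; in
# practice the mapping would be measured per user.
import bisect

# Assumed calibration pairs: (target perceived Shore A, frame pull in mm).
# More pull -> more squeeze -> softer perceived object.
CALIBRATION = [(40.0, 6.0), (55.0, 4.0), (70.0, 2.0), (90.0, 0.0)]

def set_motor_pull(mm: float) -> None:
    """Stand-in for commanding the motor that pulls the hollow frame."""
    print(f"frame pull: {mm:.1f} mm")

def pull_for_softness(target_shore_a: float) -> float:
    """Linearly interpolate the pull distance from the calibration table."""
    xs = [s for s, _ in CALIBRATION]
    ys = [p for _, p in CALIBRATION]
    if target_shore_a <= xs[0]:
        return ys[0]
    if target_shore_a >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_left(xs, target_shore_a)
    t = (target_shore_a - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

if __name__ == "__main__":
    set_motor_pull(pull_for_softness(45.0))  # make a rigid prop feel ~45A
```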
  2. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR. 
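A minimal sketch of the decoding idea described in this related work, assuming a small feature-based regressor over windowed EMG. The channel count, window length, features, and model are illustrative stand-ins, not the paper's architecture, and the weights here are random rather than learned.

```python
# Hypothetical sketch: map a short window of multi-channel forearm EMG to
# per-finger force estimates with a small regression model. All sizes and the
# model itself are assumptions for illustration.
import numpy as np

N_CHANNELS = 8        # assumed number of forearm EMG electrodes
WINDOW = 200          # assumed samples per window (e.g., 100 ms at 2 kHz)
N_FINGERS = 5

def emg_features(window: np.ndarray) -> np.ndarray:
    """Per-channel root-mean-square and mean-absolute-value features."""
    rms = np.sqrt((window ** 2).mean(axis=0))
    mav = np.abs(window).mean(axis=0)
    return np.concatenate([rms, mav])          # shape: (2 * N_CHANNELS,)

class ForceDecoder:
    """Tiny one-hidden-layer regressor standing in for the learned model."""
    def __init__(self, rng: np.random.Generator):
        self.w1 = rng.normal(scale=0.1, size=(2 * N_CHANNELS, 32))
        self.w2 = rng.normal(scale=0.1, size=(32, N_FINGERS))

    def predict(self, features: np.ndarray) -> np.ndarray:
        hidden = np.maximum(0.0, features @ self.w1)   # ReLU
        return hidden @ self.w2                        # finger-wise forces

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    decoder = ForceDecoder(rng)                 # weights would be learned
    fake_window = rng.normal(size=(WINDOW, N_CHANNELS))
    print(decoder.predict(emg_features(fake_window)))
```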
  3. Electrical muscle stimulation (EMS) is an emergent technique that miniaturizes force feedback, especially popular for untethered haptic devices in mobile gaming, VR, or AR. However, the actuation displayed by interactive systems based on EMS is coarse and imprecise. EMS systems mostly focus on inducing movements in large muscle groups such as legs, arms, and wrists; whereas individual finger poses, which would be required, for example, to actuate a user's fingers to fingerspell even the simplest letters in sign language, are not possible. The lack of dexterity in EMS stems from two fundamental limitations: (1) lack of independence: when a particular finger is actuated by EMS, the current runs through nearby muscles, causing unwanted actuation of adjacent fingers; and (2) unwanted oscillations: while it is relatively easy for EMS to start moving a finger, it is very hard for EMS to stop and hold that finger at a precise angle, because, to stop a finger, virtually all EMS systems contract the opposing muscle, typically via a controller (e.g., PID); unfortunately, even with the best controller tuning, this often results in unwanted oscillations. To tackle these limitations, we propose dextrEMS, an EMS-based haptic device featuring mechanical brakes attached to each finger joint. The key idea behind dextrEMS is that while the EMS actuates the fingers, it is our mechanical brake that stops the finger in a precise position. Moreover, it is also the brakes that allow dextrEMS to select which fingers are moved by EMS, eliminating unwanted movements by preventing adjacent fingers from moving. We implemented dextrEMS as an untethered haptic device, weighing only 68 g, that actuates eight finger joints independently (metacarpophalangeal and proximal interphalangeal joints for four fingers), which we demonstrate in a wide range of haptic applications, such as assisted fingerspelling, a piano tutorial, a guitar tutorial, and a VR game. Finally, in our technical evaluation, we found that dextrEMS outperformed EMS alone by doubling its independence and reducing unwanted oscillations.
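A minimal sketch of the dextrEMS control idea, EMS to move the finger and a per-joint brake to hold it, using simulated stand-ins for the sensor, EMS, and brake interfaces. None of these names or values come from the paper; they only illustrate why a brake removes the need for an opposing-muscle PID loop.

```python
# Hypothetical sketch: drive a finger joint toward a target angle with EMS,
# then latch it in place with a mechanical brake. Hardware is simulated so
# the sketch runs without a device.
import time

_sim_angle = {j: 0.0 for j in range(8)}   # simulated angles, 8 joints
_ems_on = {f: False for f in range(4)}    # simulated EMS state, 4 fingers

def read_joint_angle(joint: int) -> float:
    """Stand-in for a joint-angle sensor; advances a toy simulation."""
    if _ems_on[joint // 2]:
        _sim_angle[joint] += 1.0           # pretend EMS flexes ~1 deg per read
    return _sim_angle[joint]

def set_ems(finger: int, on: bool) -> None:
    """Stand-in for enabling/disabling EMS on the muscle driving this finger."""
    _ems_on[finger] = on

def set_brake(joint: int, engaged: bool) -> None:
    """Stand-in for the mechanical brake on this joint."""
    pass

def move_joint_to(finger: int, joint: int, target_deg: float,
                  tolerance_deg: float = 2.0, timeout_s: float = 1.0) -> bool:
    """Actuate with EMS, then hold position with the brake."""
    set_brake(joint, False)                # release so EMS can move the finger
    set_ems(finger, True)
    deadline = time.monotonic() + timeout_s
    reached = False
    while time.monotonic() < deadline:
        if abs(read_joint_angle(joint) - target_deg) <= tolerance_deg:
            reached = True
            break
        time.sleep(0.001)
    set_brake(joint, True)                 # brake holds the pose precisely
    set_ems(finger, False)                 # no opposing-muscle contraction needed
    return reached

if __name__ == "__main__":
    print(move_joint_to(finger=0, joint=0, target_deg=40.0))
```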
  4. Researchers, educators, and multimedia designers need to better understand how mixing physical tangible objects with virtual experiences affects learning and science identity. In this novel study, a 3D-printed tangible that is an accurate facsimile of the sort of expensive glassware that chemists use in real laboratories is tethered to a laptop with a digitized lesson. As interactive educational content is increasingly placed online, it is important to understand the educational boundary conditions associated with passive haptics and 3D-printed manipulables. Cost-effective printed objects would be particularly welcome in rural and low socioeconomic status (SES) classrooms. A Mixed Reality (MR) experience was created that used a physical 3D-printed haptic burette to control a computer-based chemistry titration experiment. This randomized controlled trial with 136 college students had two conditions: 1) low-embodied control (using keyboard arrows), and 2) high-embodied experimental (physically turning a valve/stopcock on the 3D-printed burette). Although both groups displayed similar significant gains on the declarative knowledge test, deeper analyses revealed nuanced Aptitude by Treatment Interactions (ATIs). These interactions favored the high-embodied experimental group that used the MR device for both titration-specific posttest knowledge questions and for science efficacy and science identity. Those students with higher prior science knowledge displayed higher titration knowledge scores after using the experimental 3D-printed haptic device. A multi-modal linguistic and gesture analysis revealed that during recall the experimental participants used the stopcock-turning gesture significantly more often, and their recalls created a significantly different Epistemic Network Analysis (ENA). ENA is a type of 2D projection of the recall data; stronger connections were seen in the high-embodied group, mainly centering on the key hand-turning gesture. Instructors and designers should consider the multi-modal and multi-dimensional nature of the user interface, and how the addition of another sensory-based learning signal (haptics) might differentially affect students with lower prior knowledge. One hypothesis is that haptically manipulating novel devices during learning may create more cognitive load. For students with low prior knowledge, it may be advantageous to begin learning content on a more ubiquitous interface (e.g., a keyboard) before moving to more novel, multi-modal MR devices/interfaces.

     
  5. Haptic interfaces can be used to add sensations of touch to virtual and augmented reality experiences. Soft, flexible devices that deliver spatiotemporal patterns of touch across the body, potentially with full-body coverage, are of particular interest for a range of applications in medicine, sports and gaming. Here we report a wireless haptic interface of this type, with the ability to display vibro-tactile patterns across large areas of the skin in single units or through a wirelessly coordinated collection of them. The lightweight and flexible designs of these systems incorporate arrays of vibro-haptic actuators at a density of 0.73 actuators per square centimetre, which exceeds the two-point discrimination threshold for mechanical sensation on the skin across nearly all the regions of the body except the hands and face. A range of vibrant sensations and information content can be passed to mechanoreceptors in the skin via time-dependent patterns and amplitudes of actuation controlled through the pressure-sensitive touchscreens of smart devices, in real-time with negligible latency. We show that this technology can be used to convey navigation instructions, to translate musical tracks into tactile patterns and to support sensory replacement feedback for the control of robotic prosthetics. 
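A minimal sketch of rendering a spatiotemporal cue on a small actuator grid, as one might for a "turn left" navigation instruction. The grid size, sweep timing, and actuator interface are assumptions for illustration, not the reported hardware (arrays at 0.73 actuators per square centimetre).

```python
# Hypothetical sketch: a navigation cue rendered as a spatiotemporal pattern
# sweeping across a small vibro-tactile actuator grid. All names and sizes
# are illustrative assumptions.
import time

ROWS, COLS = 4, 6   # assumed actuator grid on one skin patch

def set_actuator(row: int, col: int, amplitude: float) -> None:
    """Stand-in for a wireless command to one vibro-tactile actuator (0..1)."""
    if amplitude > 0:
        print(f"actuator ({row},{col}) amplitude {amplitude:.1f}")

def sweep(direction: str, step_s: float = 0.1, amplitude: float = 0.8) -> None:
    """Activate one column at a time, sweeping left-to-right or right-to-left."""
    cols = range(COLS) if direction == "right" else range(COLS - 1, -1, -1)
    for c in cols:
        for r in range(ROWS):
            set_actuator(r, c, amplitude)
        time.sleep(step_s)
        for r in range(ROWS):
            set_actuator(r, c, 0.0)

if __name__ == "__main__":
    sweep("left")    # e.g., render a "turn left" navigation cue
```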