- Award ID(s):
- 2047189
- NSF-PAR ID:
- 10317680
- Date Published:
- Journal Name:
- ACM Symposium on User Interface Software and Technology
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
- We propose a haptic device that alters the perceived softness of real rigid objects without requiring the objects to be instrumented. Instead, our haptic device works by restricting the lateral deformation of the user's fingerpad via a hollow frame that squeezes the sides of the fingerpad. This causes the fingerpad to become bulgier than it originally was; when users touch an object's surface with their now-restricted fingerpad, they feel the object to be softer than it is. To illustrate the extent of the softness illusion induced by our device: the tip of a wooden chopstick can feel as soft as a rubber eraser. Our haptic device operates by pulling the hollow frame with a motor. Unlike most wearable haptic devices, which cover the user's fingerpad to create force sensations, our device creates softness while leaving the center of the fingerpad free, which allows users to feel most of the object they are interacting with. This makes our device a unique contribution to altering the softness of everyday objects, creating “buttons” by softening protrusions of existing appliances or tangibles, or even altering the softness of handheld props for VR. Finally, we validated our device through two studies: (1) a psychophysics study showed that the device reduces the perceived softness of any object between 50A and 90A to around 40A (on the Shore A hardness scale); and (2) a user study demonstrated that participants preferred our device for interactive applications that leverage haptic props, such as making a VR prop feel softer or making the button of a rigid 3D-printed remote control feel softer.
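The reported psychophysics result can be summarized as a toy mapping from object hardness to perceived hardness. This is a sketch based only on the numbers in the abstract; the function name and the behaviour outside the studied 50A-90A range are our assumptions, not the authors' model:

```python
# Toy model of the reported illusion: for rigid objects in the studied
# range, perceived hardness collapses to roughly a constant plateau.
# The 40A plateau and the 50A-90A range come from the abstract; values
# outside that range are left unchanged here, which is an assumption.
def perceived_shore_a(object_shore_a: float) -> float:
    """Approximate perceived hardness (Shore A) with the device worn."""
    if 50.0 <= object_shore_a <= 90.0:
        return 40.0                 # reported plateau of the illusion
    return object_shore_a           # outside the studied range: assumed unchanged

print(perceived_shore_a(85.0))      # a hard, eraser-like object
```

The interesting property this captures is that the plateau is flat: within the studied range, perceived softness is nearly independent of the object's true hardness.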
- Abstract: Since the modern concepts of virtual and augmented reality were first introduced in the 1960s, the field has strived to develop technologies for an immersive user experience in a fully or partially virtual environment. Despite great progress in visual and auditory technologies, haptics has seen much slower technological advances. The challenge arises because the skin has densely packed mechanoreceptors distributed over a very large area with complex topography; devising an apparatus as targeted as an audio speaker or television for the localized sensory input of an ear canal or iris is more difficult. Furthermore, the soft and sensitive nature of the skin makes it difficult to apply solid-state electronic solutions that can address large areas without causing discomfort. The maturing field of soft robotics offers potential solutions to this challenge. In this article, the definition and history of virtual reality (VR) and augmented reality (AR) are first reviewed. Then an overview of haptic output and input technologies is presented, opportunities for soft robotics are identified, and mechanisms of intrinsically soft actuators and sensors are introduced. Finally, soft haptic output and input devices are reviewed, categorized by device form, and examples of soft haptic devices in VR/AR environments are presented.
- While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore force inputs from the users or rely on obtrusive sensing devices that compromise the user experience. By identifying users' muscle activation patterns while they engage in VR/AR, we design a learning-based neural interface for natural and intuitive force input. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real time with 3.3% mean error, and generalizes to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings pushing research towards more realistic physicality in future VR/AR.
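As a rough illustration of the decoding pipeline described above, the sketch below fits a simple linear decoder from synthetic forearm-EMG RMS features to per-finger forces. The paper's actual model is a neural network; the linear least-squares stand-in, the channel count, the window count, and the noise level are all our assumptions for illustration:

```python
# Minimal sketch: decode per-finger forces from (synthetic) EMG features.
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 8    # assumed number of EMG electrodes on the forearm band
N_FINGERS = 5     # one decoded force value per finger
N_WINDOWS = 500   # feature windows (e.g. RMS over short EMG segments)

# Synthetic ground truth: forces are an unknown linear mix of muscle
# activations plus noise (a toy stand-in for real muscle synergies).
true_mix = rng.normal(size=(N_CHANNELS, N_FINGERS))
emg_rms = np.abs(rng.normal(size=(N_WINDOWS, N_CHANNELS)))  # RMS features are non-negative
forces = emg_rms @ true_mix + 0.01 * rng.normal(size=(N_WINDOWS, N_FINGERS))

# "Calibration": fit the decoder on the first half of the session.
split = N_WINDOWS // 2
weights, *_ = np.linalg.lstsq(emg_rms[:split], forces[:split], rcond=None)

# Decode finger-wise forces on held-out windows and report mean error.
pred = emg_rms[split:] @ weights
mean_err = np.mean(np.abs(pred - forces[split:]))
print(f"mean absolute decoding error: {mean_err:.4f}")
```

The calibrate-then-decode split mirrors the abstract's claim that the interface generalizes with little calibration, though a real system would need richer features and a nonlinear model.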
- Haptic interfaces can be used to add sensations of touch to virtual and augmented reality experiences. Soft, flexible devices that deliver spatiotemporal patterns of touch across the body, potentially with full-body coverage, are of particular interest for a range of applications in medicine, sports and gaming. Here we report a wireless haptic interface of this type, with the ability to display vibro-tactile patterns across large areas of the skin in single units or through a wirelessly coordinated collection of them. The lightweight and flexible designs of these systems incorporate arrays of vibro-haptic actuators at a density of 0.73 actuators per square centimetre, which exceeds the two-point discrimination threshold for mechanical sensation on the skin across nearly all regions of the body except the hands and face. A range of vibrant sensations and information content can be passed to mechanoreceptors in the skin via time-dependent patterns and amplitudes of actuation, controlled through the pressure-sensitive touchscreens of smart devices in real time with negligible latency. We show that this technology can be used to convey navigation instructions, to translate musical tracks into tactile patterns and to support sensory-replacement feedback for the control of robotic prosthetics.
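The density claim above can be sanity-checked with a back-of-the-envelope calculation: at 0.73 actuators per square centimetre, a square grid has roughly 1.17 cm centre-to-centre spacing. The two-point discrimination values below are rough textbook figures we supply for illustration, not numbers from the paper:

```python
# Back-of-the-envelope check: does 0.73 actuators/cm^2 beat the two-point
# discrimination (2PD) threshold across body regions?
import math

density = 0.73  # actuators per cm^2 (from the abstract)
# Approximate centre-to-centre spacing for a square grid at this density.
spacing_cm = 1.0 / math.sqrt(density)

two_point_thresholds_cm = {  # illustrative textbook values, not from the paper
    "fingertip": 0.3,
    "palm": 1.0,
    "forearm": 3.5,
    "back": 4.0,
}

for region, threshold in two_point_thresholds_cm.items():
    resolvable = spacing_cm <= threshold
    print(f"{region:9s}: spacing {spacing_cm:.2f} cm vs 2PD {threshold:.1f} cm "
          f"-> {'below threshold' if resolvable else 'too coarse'}")
```

With these assumed thresholds, the 1.17 cm spacing resolves on the forearm and back but not on the fingertip or palm, which matches the abstract's "except the hands and face" caveat.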
- Wearable haptic devices transmit information via touch receptors in the skin, yet devices located on parts of the body with high densities of receptors, such as the fingertips and hands, impede interactions. Other locations that are well suited for wearables, such as the wrists and arms, suffer from lower perceptual sensitivity. The emergence of textile-based wearable devices introduces new fabrication techniques that can be leveraged to address these constraints and enable new modes of haptic interaction. This article formalizes the concept of “multiscale” interaction, an untapped paradigm for haptic wearables, enabling enhanced delivery of information via textile-based haptic modules. In this approach, users choose the depth and detail of their haptic experiences by varying their interaction mode. Flexible prototyping methods enable multiscale haptic bands that provide both body-scale interactions (on the forearm) and hand-scale interactions (on the fingers and palm). A series of experiments assesses participants' ability to identify pressure states and spatial locations delivered by these bands across the interaction scales. A final experiment demonstrates the encoding of three-bit information into prototypical multiscale interactions, showcasing the paradigm's efficacy. This research lays the groundwork for versatile haptic communication and wearable design, offering users the ability to select interaction modes for receiving information.
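One way to realize the three-bit encoding described above is to combine pressure states and spatial locations into eight distinguishable symbols. The specific counts (2 pressure states × 4 locations) and the cue names are hypothetical, chosen only to make the arithmetic concrete:

```python
# Sketch of packing three bits into one multiscale haptic symbol:
# 2 pressure states x 4 spatial locations = 8 codes = 3 bits.
# State/location counts and names are illustrative, not the paper's.
PRESSURES = ("light", "firm")                                   # 1 bit
LOCATIONS = ("proximal", "mid-ulnar", "mid-radial", "distal")   # 2 bits

def encode(value: int) -> tuple[str, str]:
    """Map a 3-bit value (0-7) to a (pressure, location) haptic cue."""
    assert 0 <= value < 8
    return PRESSURES[value >> 2], LOCATIONS[value & 0b11]

def decode(pressure: str, location: str) -> int:
    """Recover the 3-bit value from an identified haptic cue."""
    return (PRESSURES.index(pressure) << 2) | LOCATIONS.index(location)

for v in range(8):
    assert decode(*encode(v)) == v  # every code round-trips

print(encode(5))  # -> ('firm', 'mid-ulnar')
```

The decoder side is exactly the identification task the experiments measure: a user who can reliably name both the pressure state and the location recovers the full three bits.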