Resonant frequency skin stretch uses cyclic lateral skin stretches matching the skin's resonant frequency to create highly noticeable stimuli, offering a new approach to wearable haptic stimulation. Four experiments were performed to explore biomechanical and perceptual aspects of resonant frequency skin stretch. In the first experiment, effective skin resonant frequencies were quantified at the forearm, shank, and foot. In the second experiment, perceived haptic stimuli were characterized for skin stretch actuations across a spectrum of frequencies. In the third experiment, perceived haptic stimuli were characterized for different actuator masses. In the fourth experiment, haptic classification ability was determined as subjects differentiated haptic stimulation cues while sitting, walking, and jogging. Results showed that subjects perceived stimulations at, above, and below the skin's resonant frequency differently: stimulations lower than the skin resonant frequency felt like distinct impacts, stimulations at the skin resonant frequency felt like cyclic skin stretches, and stimulations higher than the skin resonant frequency felt like standard vibrations. Subjects successfully classified stimulations while sitting, walking, and jogging; perceived haptic stimuli were affected by actuator mass; and classification accuracy decreased with increasing speed, especially for stimulations at the shank. This work could facilitate more widespread use of wearable skin stretch. Potential applications include gaming, medical simulation, surgical augmentation, and training to reduce injury risk or improve sports performance.
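The reported mass effect follows from treating the skin-plus-actuator system as a mass-spring-damper, a common simplification in the skin-stretch literature. A minimal sketch, with illustrative parameter values that are not taken from the paper, shows how adding actuator mass lowers the resonant frequency:

```python
import math

def resonant_frequency_hz(k, m, c):
    """Damped natural frequency of a mass-spring-damper (Hz).

    k: effective skin stiffness (N/m), m: moving mass (kg),
    c: damping coefficient (N*s/m). Returns 0.0 if overdamped
    (no resonant peak exists).
    """
    wn = math.sqrt(k / m)                  # undamped natural frequency (rad/s)
    zeta = c / (2.0 * math.sqrt(k * m))    # damping ratio
    if zeta >= 1.0:
        return 0.0
    wd = wn * math.sqrt(1.0 - zeta ** 2)   # damped natural frequency (rad/s)
    return wd / (2.0 * math.pi)

# Illustration only: a heavier actuator shifts the resonance downward.
f_light = resonant_frequency_hz(k=1200.0, m=0.005, c=1.0)
f_heavy = resonant_frequency_hz(k=1200.0, m=0.020, c=1.0)
```

Under these invented parameters, quadrupling the moving mass roughly halves the resonant frequency, which is the qualitative trend the mass experiment probes.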
Realism of Tactile Texture Playback: A Combination of Stretch and Vibration
This study investigates the effects of two stimulation modalities (stretch and vibration) on natural touch sensation on the volar forearm. The skin-textile interaction was implemented by scanning three natural textures across the left forearm. The resulting in-plane displacements across the skin were recorded by the digital image correlation technique to capture the information imparted by the textures. The texture recordings were used to create three playback modes (stretch, vibration, and both), which were reproduced on the right forearm. Two psychophysical experiments compared the physical texture scans to rendered texture playbacks. The first experiment used a matching task and found that to maximize perceptual realism, i.e., similarity to a physical reference, subjects preferred the rendered texture to have a playback intensity of approximately 1X – 2X higher on DC components (stretch), and 1X – 3.5X higher on AC components (vibration), varying across textures. The second experiment elicited similarity ratings between the texture scans and playbacks and showed that a combination of both stretch and vibration was required to create differentiated texture sensations. However, the intensity amplification and use of both stretch and vibration were still insufficient to create fully realistic texture sensations. We conclude that mechanisms beyond single-site uniaxial stimuli are needed to reproduce realistic textural sensations.
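The DC/AC split behind the playback modes can be sketched numerically. This is a minimal illustration assuming a simple moving-average low-pass filter to separate stretch (DC) from vibration (AC); the filter, cutoff, and gain values are illustrative and not taken from the paper:

```python
import numpy as np

def split_and_amplify(displacement, dc_gain=1.5, ac_gain=2.5, window=51):
    """Return a playback signal = dc_gain*DC + ac_gain*AC.

    DC (slow stretch) is estimated with a moving average; AC (vibration)
    is the residual. Gains of 1-2X (DC) and 1-3.5X (AC) match the ranges
    subjects preferred in the matching task.
    """
    kernel = np.ones(window) / window
    dc = np.convolve(displacement, kernel, mode="same")  # low-pass: stretch
    ac = displacement - dc                               # residual: vibration
    return dc_gain * dc + ac_gain * ac

# Synthetic example: slow drift plus a 100 Hz ripple, sampled at 1 kHz.
t = np.arange(0.0, 1.0, 0.001)
signal = 0.2 * t + 0.01 * np.sin(2 * np.pi * 100 * t)
playback = split_and_amplify(signal)
```

Independent gains on the two components are the point: the study found the preferred amplification differed between the stretch and vibration channels.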
- Award ID(s):
- 2106191
- PAR ID:
- 10652002
- Publisher / Repository:
- IEEE
- Date Published:
- Journal Name:
- IEEE Transactions on Haptics
- Volume:
- 17
- Issue:
- 3
- ISSN:
- 1939-1412
- Page Range / eLocation ID:
- 441 to 450
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Abstract: The ability to render realistic texture perception using haptic devices has been consistently challenging. A key component of texture perception is roughness. When we touch surfaces, mechanoreceptors present under the skin are activated and the information is processed by the nervous system, enabling perception of roughness/smoothness. Several distributed haptic devices capable of producing localized skin stretch have been developed with the aim of rendering realistic roughness perception; however, current state-of-the-art devices rely on device fabrication and psychophysical experimentation to determine whether a device configuration will perform as desired. Predictive models can elucidate physical mechanisms, providing insight and a more effective design iteration process. Since existing models (1, 2) are derived from responses to normal stimuli only, they cannot predict the performance of laterally actuated devices which rely on frictional shear forces to produce localized skin stretch. They are also unable to predict the augmentation of roughness perception when the actuators are spatially dispersed across the contact patch or actuated with a relative phase difference (3). In this study, we have developed a model that can predict the perceived roughness for arbitrary external stimuli and validated it against psychophysical experimental results from different haptic devices reported in the literature. The model elucidates two key mechanisms: (i) the variation in the change of strain across the contact patch can predict roughness perception with strong correlation and (ii) the inclusion of lateral shear forces is essential to correctly predict roughness perception. Using the model can accelerate device optimization by obviating the reliance on trial-and-error approaches.
-
Across a plethora of social situations, we touch others in natural and intuitive ways to share thoughts and emotions, such as tapping to get one’s attention or caressing to soothe one’s anxiety. A deeper understanding of these human-to-human interactions will require, in part, the precise measurement of skin-to-skin physical contact. Among prior efforts, each measurement approach exhibits certain constraints, e.g., motion trackers do not capture the precise shape of skin surfaces, while pressure sensors impede skin-to-skin contact. In contrast, this work develops an interference-free 3D visual tracking system using a depth camera to measure the contact attributes between the bare hand of a toucher and the forearm of a receiver. The toucher’s hand is tracked as a posed and positioned mesh by fitting a hand model to detected 3D hand joints, whereas a receiver’s forearm is extracted as a 3D surface updated upon repeated skin contact. Based on a contact model involving point clouds, the spatiotemporal changes of hand-to-forearm contact are decomposed as six, high-resolution, time-series contact attributes, i.e., contact area, indentation depth, absolute velocity, and three orthogonal velocity components, together with contact duration. To examine the system’s capabilities and limitations, two types of experiments were performed. First, to evaluate its ability to discern human touches, one person delivered cued social messages, e.g., happiness, anger, sympathy, to another person using their preferred gestures. The results indicated that messages and gestures, as well as the identities of the touchers, were readily discerned from their contact attributes. Second, the system’s spatiotemporal accuracy was validated against measurements from independent devices, including an electromagnetic motion tracker, sensorized pressure mat, and laser displacement sensor. 
While validated here in the context of social communication, this system is extendable to human touch interactions such as maternal care of infants and massage therapy.
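The point-cloud contact model can be sketched as follows. This is a simplified illustration assuming a distance-threshold test for contact against a flat forearm patch; the threshold, the per-point patch area, and both clouds are invented for the example:

```python
import numpy as np

def contact_attributes(hand_pts, forearm_pts, threshold=0.002, point_area=1e-6):
    """Return (contact_area_m2, indentation_depth_m) from two Nx3 clouds.

    A hand point counts as "in contact" if it lies within `threshold` of
    its nearest forearm point. Area is approximated as contact-point count
    times a nominal per-point patch area; depth is how far the deepest
    contacting point sits below the forearm's mean surface height.
    """
    diffs = hand_pts[:, None, :] - forearm_pts[None, :, :]
    dists = np.linalg.norm(diffs, axis=2).min(axis=1)
    in_contact = dists < threshold
    area = in_contact.sum() * point_area
    if in_contact.any():
        depth = max(0.0, forearm_pts[:, 2].mean() - hand_pts[in_contact, 2].min())
    else:
        depth = 0.0
    return area, depth

# Toy clouds: a 5x5 forearm patch at z=0, one fingertip pressing in 1 mm.
forearm = np.array([[x, y, 0.0]
                    for x in np.linspace(0.0, 0.01, 5)
                    for y in np.linspace(0.0, 0.01, 5)])
hand = np.array([[0.005, 0.005, -0.001],   # fingertip indenting the skin
                 [0.10, 0.10, 0.05]])      # rest of hand far away
area, depth = contact_attributes(hand, forearm)
```

Velocity attributes would follow by differencing these quantities across frames; the real system fits a posed hand mesh rather than using raw points, so this is only the contact-test core.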
-
In this work, we investigated the classification of texture by neuromorphic tactile encoding and an unsupervised learning method. Additionally, we developed an adaptive classification algorithm to detect and characterize the presence of new texture data. The neuromorphic tactile encoding of textures from a multilayer tactile sensor was based on the physical structure and afferent spike signaling of human glabrous skin mechanoreceptors. We explored different neuromorphic spike pattern metrics and dimensionality reduction techniques in order to maximize classification accuracy while improving computational efficiency. Using a dataset composed of 3 textures, we showed that unsupervised learning of the neuromorphic tactile encoding data had high classification accuracy (mean=86.46%, sd=5.44%). Moreover, the adaptive classification algorithm was successful at determining that there were 3 underlying textures in the training dataset. In this work, tactile information is transformed into neuromorphic spiking activity that can be used as a stimulation pattern to elicit texture sensation for prosthesis users. Furthermore, we provide the basis for identifying new textures adaptively which can be used to actively modify stimulation patterns to improve texture discrimination for the user.
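A minimal sketch of that pipeline, assuming simple rate and inter-spike-interval (ISI) variability metrics and a two-cluster k-means in place of the paper's actual metrics and learner:

```python
import numpy as np

def spike_features(spike_times, duration):
    """Return (rate_hz, isi_cv) for one spike train: firing rate plus the
    coefficient of variation of inter-spike intervals (0 for regular firing)."""
    spike_times = np.sort(np.asarray(spike_times, dtype=float))
    rate = len(spike_times) / duration
    isi = np.diff(spike_times)
    cv = float(np.std(isi) / np.mean(isi)) if len(isi) > 1 else 0.0
    return rate, cv

def kmeans2(features, iters=20):
    """Minimal 2-cluster k-means, deterministically initialized from the
    first and last samples (a simplification for this sketch)."""
    features = np.asarray(features, dtype=float)
    centers = features[[0, -1]].copy()
    labels = np.zeros(len(features), dtype=int)
    for _ in range(iters):
        d = ((features[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(2):
            if (labels == j).any():
                centers[j] = features[labels == j].mean(axis=0)
    return labels

# Synthetic trains: steady firing vs. bursty firing, same overall rate.
duration = 1.0
regular = np.arange(0.0, 1.0, 0.05)
bursty = np.concatenate([np.arange(0.0, 0.1, 0.01),
                         np.arange(0.5, 0.6, 0.01)])
trains = [regular, bursty, regular + 0.002, bursty + 0.003]
feats = np.array([spike_features(t, duration) for t in trains])
labels = kmeans2(feats)
```

Even though both trains fire at the same mean rate, the ISI variability separates them, which is the kind of pattern metric that makes texture clusters discoverable without labels.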
-
While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR.
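The decoding idea can be sketched with a linear least-squares map standing in for the paper's neural network: window the forearm EMG, extract per-channel RMS features, and regress finger-wise forces. All signals here are synthetic, and the channel count, feature set, and model are assumptions for illustration:

```python
import numpy as np

def rms_features(emg_window):
    """emg_window: (samples, channels) -> per-channel RMS feature vector."""
    return np.sqrt(np.mean(np.square(emg_window), axis=0))

def fit_decoder(features, forces):
    """Least-squares linear map from feature rows to finger-force rows
    (a stand-in for the paper's neural-network decoder)."""
    W, *_ = np.linalg.lstsq(features, forces, rcond=None)
    return W

def decode(features, W):
    return features @ W

# Synthetic data: 200 windows, 8 EMG channels, 5 per-finger force targets
# generated from a known linear mixing so the fit is recoverable.
rng = np.random.default_rng(0)
X = rng.random((200, 8))
true_W = rng.random((8, 5))
Y = X @ true_W
W = fit_decoder(X, Y)
pred = decode(X, W)
```

A real interface would train a nonlinear model on noisy EMG and calibrate per user; the linear version only shows the feature-to-force regression structure of the loop.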