Social touch is a common method of communication between individuals, but touch cues alone provide only a glimpse of the entire interaction. Visual and auditory cues are also present in these interactions, and increase the expressiveness and recognition of the conveyed information. However, most mediated touch interactions have focused on providing only haptic cues to the user. Our research addresses this gap by adding visual cues to a mediated social touch interaction through an array of LEDs attached to a wearable device. This device consists of an array of voice-coil actuators that present normal force to the user’s forearm to recreate the sensation of social touch gestures. We conducted a human subject study (N = 20) to determine the relative importance of the touch and visual cues. Our results demonstrate that visual cues, particularly color and pattern, significantly enhance perceived realism, as well as alter perceived touch intensity, valence, and dominance of the mediated social touch. These results illustrate the importance of closely integrating multisensory cues to create more expressive and realistic virtual interactions.
Understanding the Effect of Speed on Human Emotion Perception in Mediated Social Touch Using Voice Coil Actuators
Touch as a modality in social communication has been getting more attention with recent developments in wearable technology and an increase in awareness of how limited physical contact can lead to touch starvation and feelings of depression. Although several mediated touch methods have been developed for conveying emotional support, the transfer of emotion through mediated touch has not been widely studied. This work addresses this need by exploring emotional communication through a novel wearable haptic system. The system records physical touch patterns through an array of force sensors, processes the recordings using novel gesture-based algorithms to create actuator control signals, and generates mediated social touch through an array of voice coil actuators. We conducted a human subject study (N = 20) to understand the perception and emotional components of this mediated social touch for common social touch gestures, including poking, patting, massaging, squeezing, and stroking. Our results show that the speed of the virtual gesture significantly alters the participants' ratings of valence, arousal, realism, and comfort of these gestures, with increased speed producing negative emotions and decreased realism. The findings from the study will allow us to better recognize generic patterns from human mediated touch perception and determine how mediated social touch can be used to convey emotion. Our system design, signal processing methods, and results can provide guidance in future mediated social touch design.
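The record-process-actuate pipeline described in this abstract could, in broad strokes, be sketched as below. This is an illustrative assumption, not the authors' actual algorithm: the sampling rate, carrier frequency, and the amplitude-modulation scheme are all hypothetical choices, and `force_to_drive` is a made-up function name.

```python
import numpy as np

SAMPLE_RATE = 1000  # Hz; assumed sensor/actuator rate, not stated in the paper

def force_to_drive(force_trace, carrier_hz=150.0, speed=1.0):
    """Map a recorded normal-force trace to a voice-coil drive signal.

    Voice coils render vibration well but sustained force poorly, so one
    common approach (assumed here) is to use the slowly varying force as
    an amplitude envelope on a tactile-frequency carrier. `speed`
    resamples the gesture in time, mirroring the study's speed
    manipulation.
    """
    force_trace = np.asarray(force_trace, dtype=float)
    # Time-stretch or compress the gesture to change playback speed.
    n_out = max(1, int(len(force_trace) / speed))
    t_old = np.linspace(0.0, 1.0, len(force_trace))
    t_new = np.linspace(0.0, 1.0, n_out)
    envelope = np.interp(t_new, t_old, force_trace)
    # Normalize to [0, 1] so the actuator command stays in range.
    peak = envelope.max()
    if peak > 0:
        envelope = envelope / peak
    # Amplitude-modulate a sinusoidal carrier at a tactile frequency.
    t = np.arange(n_out) / SAMPLE_RATE
    return envelope * np.sin(2 * np.pi * carrier_hz * t)

# Example: a one-second "pat" as two force bumps, played at double speed.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
pat = np.clip(np.sin(2 * np.pi * 2 * t), 0.0, None)  # two positive bumps
drive = force_to_drive(pat, speed=2.0)
```

Doubling `speed` halves the playback duration while preserving the gesture's shape, which is one plausible way a study could vary gesture speed as an independent variable.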
- Award ID(s): 2047867
- PAR ID: 10392584
- Date Published:
- Journal Name: Frontiers in Computer Science
- Volume: 4
- ISSN: 2624-9898
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Due to the COVID-19 crisis, social distancing has been a necessary and effective means of reducing disease through decreased close human contact. However, there has been a corresponding increase in touch starvation due to limited physical contact. Our research seeks to create a solution for allowing individuals to safely communicate through touch over a distance. Our system consists of wearable sensors to measure the social touch gesture, which is then processed and sent to an array of voice coils in an actuator device.
-
Despite advances in digitizing vision and hearing, touch still lacks an equivalent digital interface matching the fidelity of human perception. This gap limits the quality of digital tactile information and the realism of virtual experiences. Here, we introduce a step toward human-resolution haptics: a class of wearable tactile displays designed to match the spatial and temporal acuity of the human fingertip. Our device, VoxeLite, is a 0.1-millimeter-thick, 0.19-gram, skin-conformal array of individually addressable soft electroadhesive actuators (“nodes”). As users touch and move across surfaces, VoxeLite delivers high-resolution distributed forces via the nodes. Enabled by scalable microfabrication techniques, the display achieves actuator densities up to 110 nodes per square centimeter, produces stimuli up to 800 hertz, and remains transparent to real-world tactile input. We demonstrate its ability to render small-scale hapticons and virtual textures and transmit physical surfaces, validated through human psychophysics and biomimetic sensing. These findings position VoxeLite as a platform for human-resolution haptics in immersive interfaces, robotics, and digital touch communication.
-
Social touch provides a rich non-verbal communication channel between humans and robots. Prior work has identified a set of touch gestures for human-robot interaction and described them with natural language labels (e.g., stroking, patting). Yet, no data exists on the semantic relationships between the touch gestures in users’ minds. To endow robots with touch intelligence, we investigated how people perceive the similarities of social touch labels from the literature. In an online study, 45 participants grouped 36 social touch labels based on their perceived similarities and annotated their groupings with descriptive names. We derived quantitative similarities of the gestures from these groupings and analyzed the similarities using hierarchical clustering. The analysis resulted in 9 clusters of touch gestures formed around the social, emotional, and contact characteristics of the gestures. We discuss the implications of our results for designing and evaluating touch sensing and interactions with social robots.
-
Effective interactions between humans and robots are vital to achieving shared tasks in collaborative processes. Robots can utilize diverse communication channels to interact with humans, such as hearing, speech, sight, touch, and learning. Our focus, amidst the various means of interactions between humans and robots, is on three emerging frontiers that significantly impact the future directions of human–robot interaction (HRI): (i) human–robot collaboration inspired by human–human collaboration, (ii) brain-computer interfaces, and (iii) emotionally intelligent perception. First, we explore advanced techniques for human–robot collaboration, covering a range of methods from compliance and performance-based approaches to synergistic and learning-based strategies, including learning from demonstration, active learning, and learning from complex tasks. Then, we examine innovative uses of brain-computer interfaces for enhancing HRI, with a focus on applications in rehabilitation, communication, brain state and emotion recognition. Finally, we investigate emotional intelligence in robotics, focusing on translating human emotions to robots via facial expressions, body gestures, and eye-tracking for fluid, natural interactions. We detail and discuss recent developments in these emerging frontiers and their impact on HRI, highlighting contemporary trends and emerging advancements in the field. Ultimately, this paper underscores the necessity of a multimodal approach in developing systems capable of adaptive behavior and effective interaction between humans and robots, thus offering a thorough understanding of the diverse modalities essential for maximizing the potential of HRI.