

Title: ShadowSense: Detecting Human Touch in a Social Robot Using Shadow Image Classification
This paper proposes and evaluates the use of image classification for detailed, full-body human-robot tactile interaction. A camera positioned below a translucent robot skin captures shadows generated by human touch and infers social gestures from the captured images. This approach enables rich tactile interaction with robots without the sensor arrays used in traditional social robot tactile skins. It also supports touch interaction with non-rigid robots, achieves high-resolution sensing for robots with surfaces of different sizes and shapes, and removes the requirement of direct contact with the robot. We demonstrate the idea with an inflatable robot and a stand-alone testing device, an algorithm for recognizing touch gestures from shadows that uses Densely Connected Convolutional Networks, and an algorithm for tracking the positions of touch and hovering shadows. Our experiments show that the system can distinguish between six touch gestures under three lighting conditions with 87.5-96.0% accuracy, depending on the lighting, and can accurately track touch positions as well as infer motion activities in realistic interaction conditions. Additional applications of this method include interactive screens on inflatable robots and privacy-maintaining robots for the home.
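The shadow-tracking idea can be illustrated with a minimal sketch (not the authors' implementation): on a backlit translucent skin, touch and hover shadows appear as dark regions in the camera frame, so a simple threshold-and-centroid pass can localize them. The threshold value and the grayscale frame format here are assumptions for illustration.

```python
import numpy as np

def track_shadow(frame, threshold=60):
    """Return the (row, col) centroid of shadow pixels in a grayscale
    frame (uint8, 0-255), or None if no shadow is present.

    Shadows on a backlit translucent skin show up as dark regions, so
    pixels below `threshold` are treated as candidate shadow and their
    coordinates are averaged.
    """
    mask = frame < threshold            # dark pixels = candidate shadow
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Synthetic example: a bright 100x100 frame with a dark 10x10 "touch"
frame = np.full((100, 100), 200, dtype=np.uint8)
frame[40:50, 60:70] = 20
print(track_shadow(frame))  # → (44.5, 64.5)
```

A real pipeline would run this per video frame to track touch motion over time; the paper's gesture recognition instead feeds the images to a DenseNet classifier.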
Award ID(s):
1830471
NSF-PAR ID:
10291126
Date Published:
Journal Name:
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Volume:
4
Issue:
4
ISSN:
2474-9567
Page Range / eLocation ID:
1 to 24
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Effective interactions between humans and robots are vital to achieving shared tasks in collaborative processes. Robots can utilize diverse communication channels to interact with humans, such as hearing, speech, sight, touch, and learning. Our focus, amidst the various means of interaction between humans and robots, is on three emerging frontiers that significantly impact the future directions of human–robot interaction (HRI): (i) human–robot collaboration inspired by human–human collaboration, (ii) brain-computer interfaces, and (iii) emotionally intelligent perception. First, we explore advanced techniques for human–robot collaboration, covering a range of methods from compliance and performance-based approaches to synergistic and learning-based strategies, including learning from demonstration, active learning, and learning from complex tasks. Then, we examine innovative uses of brain-computer interfaces for enhancing HRI, with a focus on applications in rehabilitation, communication, and brain state and emotion recognition. Finally, we investigate emotional intelligence in robotics, focusing on translating human emotions to robots via facial expressions, body gestures, and eye-tracking for fluid, natural interactions. Recent developments in these emerging frontiers and their impact on HRI are detailed and discussed. We highlight contemporary trends and emerging advancements in the field. Ultimately, this paper underscores the necessity of a multimodal approach in developing systems capable of adaptive behavior and effective interaction between humans and robots, thus offering a thorough understanding of the diverse modalities essential for maximizing the potential of HRI.

     
  2. Abstract

    Enhancing physical human-robot interaction requires improving the tactile perception of physical touch. Robot skin sensors exhibiting piezoresistive behavior can be used in conjunction with collaborative robots. In past work, fabrication of these tactile arrays was done using cleanroom techniques such as spin coating, photolithography, sputtering, and wet and dry etching onto flexible polymers. In this paper, we present an additive, non-cleanroom process for depositing PEDOT:PSS, the organic polymer responsible for the piezoresistive behavior of the robot skin sensor arrays. This publication details the patterning of the robot skin sensor structures and the adaptation of inkjet printing technology to the fabrication process. This increases the possibility of scaling production output while reducing cleanroom fabrication cost and time, from an approximately five-hour PEDOT:PSS deposition process to five minutes. Furthermore, the testing of these skin sensor arrays is carried out on a testing station equipped with a force plunger and an integrated circuit designed to provide perception feedback on various force load profiles controlled in an automated process. The results show uniform deposition of the PEDOT:PSS, consistent resistance measurement, and appropriate tactile response across an array of 16 sensors.
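As a rough illustration of how a piezoresistive taxel like these might be read out, here is a generic voltage-divider sketch; it is not the paper's measurement circuit, and the supply voltage, ADC range, and reference resistor are assumed values.

```python
def sensor_resistance(adc_counts, adc_max=1023, v_supply=3.3, r_ref=10_000.0):
    """Estimate the resistance of a piezoresistive taxel wired as the lower
    leg of a voltage divider: V_out = V_supply * R_sensor / (R_sensor + R_ref).

    All circuit values (supply, ADC range, reference resistor) are assumed
    for illustration, not taken from the paper.
    """
    v_out = v_supply * adc_counts / adc_max
    if v_out >= v_supply:
        raise ValueError("reading saturated: sensor open or wiring fault")
    return r_ref * v_out / (v_supply - v_out)

# A mid-scale ADC reading implies R_sensor equals the reference (10 kΩ here)
print(round(sensor_resistance(500, adc_max=1000)))  # → 10000
```

Applied pressure changes the taxel's resistance, so repeated readings of this form give the force-dependent signal that a calibration curve would then map to load.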

     
  3. Madden, John D.; Anderson, Iain A.; Shea, Herbert R. (Eds.)
    Ras Labs makes Synthetic Muscle™, a class of electroactive polymer (EAP) based materials and actuators that sense pressure (from gentle touch to high impact), controllably contract and expand at low voltage (1.5 V to 50 V, including use of batteries), and attenuate force. We are in the robotics era, but robots still face challenges. Currently, robotic sensing is mainly visual, which is useful only up to the point of contact. To understand how an object is being gripped, tactile feedback is needed. For handling fragile objects, if the grip is too tight, breakage occurs, and if the grip is too loose, the object slips out of the grasp, also leading to breakage. Rigid robotic grippers using a visual feedback loop can struggle to determine the exact point and quality of contact, and can also exhibit stutter in the visual feedback loop. By using soft Synthetic Muscle™ based EAP pads as the sensors, immediate feedback was generated at the first point of contact. Because these pads provided a soft, compliant interface, the first point of contact did not apply excessive force, allowing the force applied to the object to be controlled. The EAP sensor could also detect a change in pressure location on its surface, making it possible to detect and prevent slippage by adjusting the grip strength. In other words, directional glide across the pad signaled possible slippage, allowing a slightly tighter grip to be applied without stutter, thanks to both the feedback and the softness of the fingertip-like EAP pads themselves. The soft nature of the EAP fingertip pad also naturally held the gripped object, improving gripping quality over rigid grippers without an increase in applied force. Analogous to finger-like tactile touch, the EAPs with appropriate coatings and electronics were positioned as pressure sensors in the fingertip or end effector regions of robotic grippers.
This development of Synthetic Muscle™ based EAPs as soft sensors provided sensors that feel like the pads of human fingertips. Basic tests of pressure position and magnitude have been successful, with pressure sensitivity down to 0.05 N. Most automation and robots are very strong and very fast, and usually need to be partitioned away from humans for safety reasons. For many repetitive tasks that humans perform with delicate or fragile objects, it would be beneficial to use robotics, whether for agriculture, medical surgery, therapeutic or personal care, or extreme environments that humans cannot enter, including those with contagions that have no cure. Synthetic Muscle™ was also retrofitted as actuator systems into off-the-shelf robotic grippers and is being considered in novel biomimetic gripper designs, operating at low voltages (less than 50 V). This offers biomimetic movement by contracting like human muscle, but also exceeds natural biological capabilities by expanding under reversed electric polarity. Human grasp is gentle yet firm, with tactile touch feedback. In conjunction with shape-morphing abilities, these EAPs are also being explored to intrinsically sense pressure, due to the correlation between the mechanical force applied to the EAP and its electronic signature. The robotic field is experiencing phenomenal growth in this fourth phase of the industrial revolution, the robotics era. The combination of Ras Labs' EAP shape-morphing and sensing features promises the potential for robotic grippers with human hand-like control and tactile sensing. This work is expected to advance both robotics and prosthetics, particularly collaborative robotics, allowing humans and robots to work together intuitively, safely, and effectively.
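The slip-detection idea described above, watching the pressure location move across the pad, can be sketched generically. The taxel-grid representation and threshold below are assumptions for illustration, not Ras Labs' implementation.

```python
import numpy as np

def pressure_centroid(frame):
    """Pressure-weighted centroid of a 2-D taxel map, or None if no contact."""
    total = frame.sum()
    if total == 0:
        return None
    rows, cols = np.indices(frame.shape)
    return (rows * frame).sum() / total, (cols * frame).sum() / total

def slip_detected(prev_frame, curr_frame, drift_threshold=1.0):
    """Flag slip when the contact centroid moves more than `drift_threshold`
    taxels between two consecutive pressure maps."""
    a, b = pressure_centroid(prev_frame), pressure_centroid(curr_frame)
    if a is None or b is None:
        return False
    return bool(np.hypot(a[0] - b[0], a[1] - b[1]) > drift_threshold)

# A contact patch sliding two taxels down the pad trips the slip flag,
# which a controller could answer by tightening the grip slightly.
prev = np.zeros((8, 8)); prev[2:4, 2:4] = 1.0
curr = np.zeros((8, 8)); curr[4:6, 2:4] = 1.0
print(slip_detected(prev, curr))  # → True
```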
  4. Advanced applications for human-robot interaction require perception of physical touch in a manner that imitates human tactile perception. Feedback generated from tactile sensor arrays can be used to control the interaction of a robot with its environment and with humans. In this paper, we present our efforts to fabricate piezoresistive organic polymer sensor arrays using PEDOT:PSS, or poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate). Sensors are realized as strain gauges on Kapton substrates with thermal and electrical response characteristics to human touch. We detail the fabrication processes associated with a gold-etching technique combined with a wet lift-off photolithographic process to implement a circular-tree sensor microstructure design in our cleanroom. The testing of this microstructure is done on a load testing apparatus facilitated by an integrated circuit design. Furthermore, a lamination process is employed to compensate for temperature drift while measuring pressure on double-sided sensor substrates. Experiments carried out to evaluate the performance of the fabricated structure indicate a 100% sensor yield with the updated technique.
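One common way to realize the temperature-drift compensation that a double-sided substrate enables is a differential readout of the two sides. The sketch below assumes the back-side gauge sees temperature but not pressure; that arrangement and the nominal resistances are illustrative assumptions, not details from the paper.

```python
def compensated_signal(front_r, back_r, front_r0, back_r0):
    """Differential strain-gauge readout for a double-sided (laminated)
    substrate: both sides share the same temperature drift, so subtracting
    fractional resistance changes cancels the thermal term and leaves the
    pressure-induced change on the sensing side.

    Assumes the back-side gauge is shielded from pressure; nominal
    resistances (ohms) are illustrative values, not from the paper.
    """
    return (front_r - front_r0) / front_r0 - (back_r - back_r0) / back_r0

# A 2% thermal drift on both sides cancels exactly; an extra 1%
# pressure-induced change on the front survives as the compensated signal.
print(compensated_signal(1020.0, 1020.0, 1000.0, 1000.0))  # → 0.0
```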
  5.
    This work has developed an iteratively refined understanding of participants' natural perceptions of and responses to unmanned aerial vehicle (UAV) flight paths, or gestures. This includes both what they believe the UAV is trying to communicate to them and how they expect to respond through physical action. Previous work in this area has focused on eliciting gestures from participants to communicate specific states, or on leveraging gestures observed in the world, rather than on understanding what participants believe is being communicated and how they would respond. This work investigates previous gestures either created or categorized by participants to understand the perceived content of their communication or the expected response, through categories created from participant free responses and confirmed through forced-choice testing. The human-robot interaction community can leverage this work to better understand how people perceive UAV flight paths, inform future designs for non-anthropomorphic robot communications, and apply lessons learned to elicit informative labels from people who may or may not be operating the vehicle. We found that the Negative Attitudes towards Robots Scale (NARS) can be a good indicator of how a person can be expected to react to a robot. Recommendations are also provided: use motion approaching or retreating from a person to encourage following, motion perpendicular to their field of view for blocking, and either no motion or large altitude changes to encourage viewing.