This paper proposes and evaluates the use of image classification for detailed, full-body human-robot tactile interaction. A camera positioned below a translucent robot skin captures shadows generated by human touch and infers social gestures from the captured images. This approach enables rich tactile interaction with robots without the sensor arrays used in traditional social robot tactile skins. It also supports touch interaction with non-rigid robots, achieves high-resolution sensing on robot surfaces of varied sizes and shapes, and removes the requirement of direct contact with the robot. We demonstrate the idea with an inflatable robot and a stand-alone testing device, an algorithm for recognizing touch gestures from shadows using Densely Connected Convolutional Networks, and an algorithm for tracking the positions of touch and hovering shadows. Our experiments show that the system can distinguish between six touch gestures under three lighting conditions with 87.5 - 96.0% accuracy, depending on the lighting, and can accurately track touch positions as well as infer motion activities in realistic interaction conditions. Additional applications of this method include interactive screens on inflatable robots and privacy-maintaining robots for the home.
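The abstract's tracking idea, locating touches and hovering hands from shadow darkness, can be illustrated with a minimal sketch. This is not the paper's DenseNet classifier or its actual tracking algorithm; the intensity thresholds and the touch/hover rule below are illustrative assumptions: a direct contact casts a darker shadow than a hovering hand.

```python
def locate_shadow(frame, touch_thresh=40, hover_thresh=120):
    """Locate a touch or hovering hand in a grayscale shadow image.

    frame: 2-D list of 0-255 pixel intensities (camera view from below
    the translucent skin). Pixels darker than hover_thresh are treated
    as shadow; a shadow containing pixels darker than touch_thresh is
    treated as direct contact (thresholds are illustrative, not the
    paper's calibrated values).
    Returns (label, (cx, cy)) or (None, None) if no shadow is present.
    """
    shadow = [(x, y) for y, row in enumerate(frame)
              for x, v in enumerate(row) if v < hover_thresh]
    if not shadow:
        return None, None
    # Contact if the darkest shadow pixel falls below the touch threshold.
    darkest = min(frame[y][x] for x, y in shadow)
    label = "touch" if darkest < touch_thresh else "hover"
    # Track position as the centroid of the shadow region.
    cx = sum(x for x, _ in shadow) / len(shadow)
    cy = sum(y for _, y in shadow) / len(shadow)
    return label, (cx, cy)
```

A real implementation would operate on camera frames (e.g., via OpenCV) and smooth the centroid over time, but the threshold-then-centroid structure is the core of position tracking from shadows.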
Organic Piezoresistive Robotic Skin Sensor Fabrication, Integration and Characterization
Advanced applications for human-robot interaction require the perception of physical touch in a manner that imitates human tactile perception. Feedback from tactile sensor arrays can be used to control a robot's interaction with its environment and with humans. In this paper, we present our efforts to fabricate piezoresistive organic polymer sensor arrays using PEDOT:PSS, or poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate). The sensors are realized as strain gauges on Kapton substrates with thermal and electrical response characteristics suited to human touch. We detail the fabrication processes, which combine a gold etching technique with a wet lift-off photolithographic process to implement a circular tree-patterned sensor microstructure in our cleanroom. The microstructure is tested on a load-testing apparatus supported by an integrated circuit design. Furthermore, a lamination process is employed to compensate for temperature drift while measuring pressure on double-sided sensor substrates. Experiments carried out to evaluate the performance of the fabricated structure indicate 100% sensor yield with the updated technique.
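The readout principle behind a piezoresistive strain gauge like those described above is that fractional resistance change is proportional to strain, ΔR/R = GF · ε. A minimal sketch of that conversion follows; the gauge factor of 2.0 is a generic metallic-gauge value used only for illustration (reported gauge factors for PEDOT:PSS films vary with formulation and geometry).

```python
def strain_from_resistance(r_measured, r_nominal, gauge_factor=2.0):
    """Convert a measured gauge resistance (ohms) to strain.

    Piezoresistive relation: dR / R = GF * strain, so
    strain = (R_measured - R_nominal) / (R_nominal * GF).
    gauge_factor=2.0 is an illustrative placeholder, not a
    characterized PEDOT:PSS value.
    """
    return (r_measured - r_nominal) / (r_nominal * gauge_factor)
```

For example, a 1-ohm rise on a 100-ohm gauge with GF = 2 corresponds to a strain of 0.005 (0.5%).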
- Journal Name:
- 16th International Manufacturing Science and Engineering Conference
- Sponsoring Org:
- National Science Foundation
More Like this
Pressure-sensitive robotic skins have long been investigated for applications in physical human-robot interaction (pHRI). Numerous challenges related to fabrication, sensitivity, density, and reliability remain to be addressed under various environmental and use conditions. In our previous studies, we designed novel strain gauge sensor structures for robotic skin arrays. We coated these star-shaped designs with an organic polymer piezoresistive material, poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) or PEDOT:PSS, and integrated the sensor arrays into elastomer robotic skins. In this paper, we describe a dry etching photolithographic method for creating a stable, uniform PEDOT:PSS layer on the star-shaped sensors and a lamination process for creating double-sided robotic skins that support temperature compensation. An integrated circuit and load-testing apparatus were designed to test the pressure performance of the resulting robotic skins. Experiments were conducted to measure the loading performance of the resulting sensor prototypes, and the results indicate that sensor yields of over 80% are possible with this fabrication process.
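The temperature-compensation role of a double-sided skin can be sketched as a differential (half-bridge style) measurement: both sides of the laminate drift together with temperature, but only the loaded side responds to pressure, so subtracting the two fractional resistance changes cancels the common-mode thermal drift. This is a simplified illustration of the idea, not the authors' circuit.

```python
def compensated_signal(top_r, bottom_r, r_nominal):
    """Temperature-compensated pressure signal from a double-sided skin.

    top_r:    resistance of the gauge on the loaded side (ohms)
    bottom_r: resistance of the matching gauge on the opposite side
    r_nominal: unloaded resistance of both gauges at reference temperature

    Thermal drift shifts both gauges by the same fraction, so the
    difference of fractional changes reflects pressure alone
    (idealized: assumes perfectly matched gauges).
    """
    d_top = (top_r - r_nominal) / r_nominal
    d_bottom = (bottom_r - r_nominal) / r_nominal
    return d_top - d_bottom
```

In practice the two gauges are never perfectly matched, so a real system would also calibrate out gain differences between the sides.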
There has been an increasing need for technologies to manufacture chemical and biological sensors for applications ranging from environmental monitoring to human health monitoring. Currently, the manufacturing of most chemical and biological sensors relies on standard microfabrication techniques, such as physical vapor deposition and photolithography, and on materials such as metals and semiconductors. Though functional, these sensors are hampered by high-cost materials, rigid substrates, and limited surface area. Paper-based sensors offer an intriguing alternative that is low cost and mechanically flexible, has the inherent ability to filter and separate analytes, and offers a high-surface-area, permeable framework advantageous for liquid and vapor sensing. However, a major drawback is that standard microfabrication techniques cannot be used in paper sensor fabrication. To fabricate sensors on paper, low-temperature additive techniques must be used, which will require new manufacturing processes and advanced functional materials. In this work, we focus on using aerosol jet printing as a high-resolution additive process for depositing ink materials to be used in paper-based sensors. This technique can use a wide variety of materials with different viscosities, including materials with high porosity and particles inherent to paper. One area of our efforts involves creating …
The most common sensing modalities found in a robot perception system are vision and touch, which together can provide global and highly localized data for manipulation. However, these sensing modalities often fail to adequately capture the behavior of target objects during the critical moments as they transition out of static, controlled contact with an end-effector to dynamic and uncontrolled motion. In this work, we present a novel multimodal visuotactile sensor that provides simultaneous visuotactile and proximity depth data. The sensor integrates an RGB camera and air pressure sensor to sense touch with an infrared time-of-flight (ToF) camera to sense proximity by leveraging a selectively transmissive soft membrane to enable the dual sensing modalities. We present the mechanical design, fabrication techniques, algorithm implementations, and evaluation of the sensor's tactile and proximity modalities. The sensor is demonstrated in three open-loop robotic tasks: approaching and contacting an object, catching, and throwing. The fusion of tactile and proximity data could be used to capture key information about a target object's transition behavior for sensor-based control in dynamic manipulation.
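The sensor fusion described above, proximity from the ToF camera plus contact from the pressure sensor, can be sketched as a simple phase classifier for the transition moments the abstract highlights. The thresholds and phase names below are illustrative assumptions, not the sensor's calibrated values or the authors' control logic.

```python
def interaction_phase(depth_mm, pressure_kpa,
                      near_depth_mm=5.0, touch_pressure_kpa=1.0):
    """Fuse proximity (ToF depth) and tactile (air pressure) readings
    into a coarse interaction phase for a target object.

    depth_mm:      distance to the object seen through the membrane
    pressure_kpa:  gauge pressure rise inside the soft membrane
    Thresholds are placeholders for illustration.
    """
    if pressure_kpa > touch_pressure_kpa:
        return "contact"        # membrane deformed: object is touching
    if depth_mm < near_depth_mm:
        return "imminent"       # object within proximity range, no touch yet
    return "free"               # object still far from the end-effector
```

Tracking the "imminent" to "contact" transition over successive readings is one way such fused data could capture the moment an object leaves controlled contact, e.g., for the catching and throwing tasks mentioned above.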
Principles from human-human physical interaction may be necessary to design more intuitive and seamless robotic devices to aid human movement. Previous studies have shown that light touch can aid balance and that haptic communication can improve performance of physical tasks, but the effects of touch between two humans on walking balance have not been previously characterized. This study examines physical interaction between two persons when one person aids another in performing a beam-walking task. Twelve pairs of healthy young adults held a force sensor with one hand while one person walked on a narrow balance beam (2 cm wide × 3.7 m long) and the other person walked overground by their side. We compare balance performance during partnered vs. solo beam-walking to examine the effects of haptic interaction, and we compare hand interaction mechanics during partnered beam-walking vs. overground walking to examine how the interaction aided balance. While holding the hand of a partner, participants were able to walk farther on the beam without falling, reduce lateral sway, and decrease angular momentum in the frontal plane. We measured small hand force magnitudes (means of 2.2 N laterally and 3.4 N vertically) that created opposing torque components about the beam axis, and we calculated the interaction …
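The "opposing torque components about the beam axis" mentioned above follow from basic 2-D moment arithmetic: the lateral and vertical hand forces act through the hand's offset from the beam and contribute torque terms of opposite sign. The sketch below is a worked illustration with assumed hand geometry (the 0.5 m lateral offset and 1.0 m height are hypothetical, not values from the study).

```python
def beam_axis_torque(f_lateral_n, f_vertical_n,
                     hand_lateral_m, hand_height_m):
    """Torque about the beam's long axis from a partner's hand force.

    2-D moment about the beam axis: tau = x * F_y - y * F_x, where
    x is the hand's lateral offset from the beam, y its height above
    the beam, F_y the vertical force, and F_x the lateral force.
    The two terms have opposing signs for typical geometry, which is
    how small forces can yield a modest net stabilizing torque.
    """
    return hand_lateral_m * f_vertical_n - hand_height_m * f_lateral_n
```

With the study's mean force magnitudes (2.2 N lateral, 3.4 N vertical) and the assumed geometry, the vertical and lateral contributions (1.7 N·m and 2.2 N·m) largely cancel, leaving a net torque of only about 0.5 N·m.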