Title: Multimodal Proximity and Visuotactile Sensing With a Selectively Transmissive Soft Membrane
The most common sensing modalities found in a robot perception system are vision and touch, which together can provide global and highly localized data for manipulation. However, these sensing modalities often fail to adequately capture the behavior of target objects during the critical moments when they transition from static, controlled contact with an end-effector to dynamic, uncontrolled motion. In this work, we present a novel multimodal visuotactile sensor that provides simultaneous visuotactile and proximity depth data. The sensor integrates an RGB camera and an air pressure sensor to sense touch with an infrared time-of-flight (ToF) camera to sense proximity, leveraging a selectively transmissive soft membrane to enable the dual sensing modalities. We present the mechanical design, fabrication techniques, algorithm implementations, and evaluation of the sensor's tactile and proximity modalities. The sensor is demonstrated in three open-loop robotic tasks: approaching and contacting an object, catching, and throwing. The fusion of tactile and proximity data could be used to capture key information about a target object's transition behavior for sensor-based control in dynamic manipulation.
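As a rough illustration of how the two modalities described above could be fused, the sketch below classifies a target's contact state from a minimum ToF depth reading and an internal air-pressure reading. All names, thresholds, and units here are hypothetical placeholders, not values from the paper:

```python
# Hypothetical sketch: fusing ToF proximity depth with a pressure-based
# contact signal. Thresholds and readings are illustrative only.
from enum import Enum

class ContactState(Enum):
    FREE = 0         # object beyond the ToF camera's useful range
    APPROACHING = 1  # visible to the ToF camera, no membrane contact
    CONTACT = 2      # membrane deformed, internal pressure above baseline

def classify_state(min_depth_mm: float, pressure_kpa: float,
                   depth_range_mm: float = 100.0,
                   pressure_baseline_kpa: float = 101.3,
                   pressure_margin_kpa: float = 0.5) -> ContactState:
    """Simple threshold fusion of the proximity and tactile channels."""
    if pressure_kpa > pressure_baseline_kpa + pressure_margin_kpa:
        return ContactState.CONTACT
    if min_depth_mm < depth_range_mm:
        return ContactState.APPROACHING
    return ContactState.FREE
```

A closed-loop catcher could, for instance, trigger a grasp on the FREE-to-APPROACHING transition and confirm it on CONTACT.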
Award ID(s):
1935294
NSF-PAR ID:
10379073
Journal Name:
2022 IEEE 5th International Conference on Soft Robotics (RoboSoft)
Page Range / eLocation ID:
802 to 808
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. During in-hand manipulation, robots must continuously estimate the pose of the object in order to generate appropriate control actions. The performance of pose-estimation algorithms hinges on the robot's sensors detecting discriminative geometric object features, but previous sensing modalities are unable to make such measurements robustly. The robot's fingers can occlude the view of environment- or robot-mounted image sensors, and tactile sensors can only measure at the local areas of contact. Motivated by fingertip-embedded proximity sensors' robustness to occlusion and ability to measure beyond the local areas of contact, we present the first evaluation of proximity-sensor-based pose estimation for in-hand manipulation. We develop a novel two-fingered hand with fingertip-embedded optical time-of-flight proximity sensors as a testbed for pose estimation during planar in-hand manipulation. Here, the in-hand manipulation task consists of the robot moving a cylindrical object from one end of its workspace to the other. We demonstrate, with statistical significance, that proximity-sensor-based pose estimation via particle filtering during in-hand manipulation: a) exhibits 50% lower average pose error than a tactile-sensor-based baseline; b) empowers a model predictive controller to achieve 30% lower final positioning error compared to when using tactile-sensor-based pose estimates.
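A minimal sketch of the particle-filtering idea in the abstract above, reduced to estimating the planar position of a cylinder from two fingertip range readings. The random-walk motion model, noise levels, and geometry are illustrative assumptions, not the authors' implementation:

```python
# Toy particle filter: estimate a cylinder's (x, y) center from
# fingertip time-of-flight range readings. All parameters illustrative.
import numpy as np

rng = np.random.default_rng(0)

def expected_range(pose, sensor_pos, radius=10.0):
    """Distance from a fingertip sensor to the cylinder's surface."""
    return max(np.linalg.norm(pose - sensor_pos) - radius, 0.0)

def particle_filter_step(particles, weights, ranges, sensor_positions,
                         motion_std=1.0, meas_std=2.0):
    # Predict: diffuse particles with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: weight by Gaussian likelihood of each range measurement.
    for z, s in zip(ranges, sensor_positions):
        pred = np.array([expected_range(p, s) for p in particles])
        weights = weights * np.exp(-0.5 * ((z - pred) / meas_std) ** 2)
    weights = weights / weights.sum()
    # Resample proportionally to weight (systematic resampling is the
    # more common choice in practice).
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```

Iterating this step while the fingers regrasp keeps a cloud of hypotheses alive even when one sensor's view is momentarily uninformative.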
  2. We describe a single fingertip-mounted sensing system for robot manipulation that provides proximity (pre-touch), contact detection (touch), and force sensing (post-touch). The sensor system consists of optical time-of-flight range measurement modules covered in a clear elastomer. Because the elastomer is clear, the sensor can detect and range nearby objects, as well as measure deformations caused by objects that are in contact with the sensor and thereby estimate the applied force. We examine how this sensor design can be improved with respect to invariance to object reflectivity, signal-to-noise ratio, and continuous operation when switching between the distance and force measurement regimes. By harnessing time-of-flight technology and optimizing the elastomer-air boundary to control the emitted light's path, we develop a sensor that is able to seamlessly transition between measuring distances of up to 50 mm and contact forces of up to 10 newtons. We demonstrate that our sensor improves manipulation accuracy in a block unstacking task. Thorough instructions for manufacturing the sensor from inexpensive, commercially available components are provided, as well as all relevant hardware design files and software sources. 
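The distance-to-force transition described above can be caricatured with a single rule: readings longer than the elastomer's rest thickness are ranges, shorter ones are deformations. A hedged sketch with illustrative thickness and stiffness values; a real sensor of this kind would use a calibrated deformation-to-force curve rather than a linear spring:

```python
# Hypothetical sketch of a clear-elastomer ToF fingertip's two regimes.
# Thickness and stiffness values are illustrative, not from the paper.
def interpret_tof_reading(distance_mm: float,
                          elastomer_thickness_mm: float = 5.0,
                          stiffness_n_per_mm: float = 2.0):
    """Map one ToF range reading to either a proximity distance
    (pre-touch) or a contact force (post-touch). Returns (mode, value)."""
    if distance_mm >= elastomer_thickness_mm:
        # Beam passes through the clear elastomer: object is at range.
        return ("distance_mm", distance_mm - elastomer_thickness_mm)
    # Object compresses the elastomer: deformation maps to force via a
    # linear spring model here; a calibrated curve would be used in practice.
    deformation = elastomer_thickness_mm - distance_mm
    return ("force_n", stiffness_n_per_mm * deformation)
```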
  3.
    This paper proposes and evaluates the use of image classification for detailed, full-body human-robot tactile interaction. A camera positioned below a translucent robot skin captures shadows generated by human touch and infers social gestures from the captured images. This approach enables rich tactile interaction with robots without the sensor arrays used in traditional social-robot tactile skins. It also supports touch interaction with non-rigid robots, achieves high-resolution sensing for robots with different sizes and shapes of surfaces, and removes the requirement of direct contact with the robot. We demonstrate the idea with an inflatable robot and a stand-alone testing device, an algorithm for recognizing touch gestures from shadows that uses Densely Connected Convolutional Networks, and an algorithm for tracking the positions of touch and hovering shadows. Our experiments show that the system can distinguish between six touch gestures under three lighting conditions with 87.5–96.0% accuracy, depending on the lighting, and can accurately track touch positions as well as infer motion activities in realistic interaction conditions. Additional applications for this method include interactive screens on inflatable robots and privacy-maintaining robots for the home.
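The shadow-based touch localization mentioned above (tracking the positions of touch and hovering shadows) can be sketched as a simple dark-blob centroid, leaving the learned gesture classifier aside. The fixed threshold is an illustrative assumption; the paper's system handles varying lighting conditions:

```python
# Toy touch localizer: find the centroid of a dark shadow blob in a
# grayscale frame from a camera under a translucent skin.
import numpy as np

def touch_position_from_shadow(image: np.ndarray, threshold: float = 0.3):
    """`image` holds intensities in [0, 1] (0 = dark). A touch appears
    as a dark blob; return its centroid in (row, col) pixel coordinates,
    or None if no shadow is present. Threshold is illustrative."""
    shadow = image < threshold
    if not shadow.any():
        return None
    rows, cols = np.nonzero(shadow)
    return (rows.mean(), cols.mean())
```

In practice one would segment connected components first so multiple simultaneous touches get separate centroids.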
  4. This paper introduces a vision-based tactile sensor, FingerVision, and explores its usefulness in tactile behaviors. FingerVision consists of a transparent elastic skin marked with dots and a camera; it is easy to fabricate, low cost, and physically robust. Unlike other vision-based tactile sensors, the complete transparency of the FingerVision skin provides multimodal sensation. The modalities sensed by FingerVision include distributions of force and slip, and object information such as distance, location, pose, size, shape, and texture. The slip detection is very sensitive, since it is obtained by computer vision applied directly to the output of the FingerVision camera. It provides high-resolution slip detection that does not depend on the contact force, i.e., it can sense slip of a lightweight object that generates negligible contact force. The tactile behaviors explored in this paper include manipulations that utilize this feature. For example, we demonstrate that grasp adaptation with FingerVision can grasp origami and other deformable and fragile objects such as vegetables, fruits, and raw eggs.
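A toy version of marker-based slip detection in the spirit of the dotted skin described above: if the tracked dots translate coherently between frames, the grasped object is slipping. Because this thresholds displacement rather than force, it can fire even for near-weightless objects. All values are illustrative assumptions:

```python
# Toy slip detector from tracked skin-marker positions. The pixel
# threshold is illustrative, not a value from the paper.
import numpy as np

def detect_slip(markers_prev: np.ndarray, markers_curr: np.ndarray,
                slip_threshold_px: float = 1.0) -> bool:
    """Flag slip when matched dot positions, given as (N, 2) pixel
    arrays for consecutive frames, shift coherently between frames."""
    displacement = markers_curr - markers_prev
    mean_shift = np.linalg.norm(displacement.mean(axis=0))
    return bool(mean_shift > slip_threshold_px)
```

Averaging before taking the norm makes the detector respond to coherent translation while mostly ignoring symmetric deformation of the skin under normal load.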
  5. Sensitive and flexible pressure sensors have attracted considerable interest for a broad range of applications in tactile sensing, physiological sensing, and flexible electronics. The barrier between high sensitivity and low fabrication cost needs to be addressed to commercialize such flexible pressure sensors. A low-cost, sacrificial-template-assisted method for a capacitive sensor is reported herein, utilizing a porous polydimethylsiloxane (PDMS) polymer and a multiwalled carbon nanotube (MWCNT) composite-based dielectric layer. The sensor shows a high sensitivity of 2.42 kPa⁻¹ along with a low limit of detection of 1.46 Pa. The high sensitivity originates from adding MWCNT to PDMS, increasing the composite polymer's dielectric constant. Besides this, the pressure sensor shows excellent stability over 9000 loading cycles, proving its reliability for long-lasting application in tactile and physiological sensing. The high sensitivity of the sensor is suitable for the detection of small deformations such as pulse waveforms as well as tactile pressure sensing. In addition, the paper demonstrates a simultaneous contact and non-contact sensing capability suitable for dual sensing (pressure and proximity) with a single data readout system. The dual-mode sensing capability may open opportunities for realizing compact systems in robotics, gesture control, contactless applications, and more. The practicality of the sensor was shown in applications such as tactile sensing, Morse code generation, proximity sensing, and pulse wave sensing.
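For reference, the sensitivity figure quoted above follows the standard convention for capacitive pressure sensors: the relative capacitance change per unit applied pressure. Under a parallel-plate model, adding MWCNT raises the dielectric constant while the porous PDMS lets the gap compress easily under load, both of which increase S. This is the conventional definition, not a derivation from the paper:

```latex
% Sensitivity of a capacitive pressure sensor (standard definition),
% with the parallel-plate capacitance model for reference:
S = \frac{\delta\left(\Delta C / C_0\right)}{\delta P}
\qquad\text{where}\qquad
C = \frac{\varepsilon_0 \varepsilon_r A}{d}
```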