Title: Improved Proximity, Contact, and Force Sensing via Optimization of Elastomer-Air Interface Geometry
We describe a single fingertip-mounted sensing system for robot manipulation that provides proximity (pre-touch), contact detection (touch), and force sensing (post-touch). The sensor system consists of optical time-of-flight range measurement modules covered in a clear elastomer. Because the elastomer is clear, the sensor can detect and range nearby objects, as well as measure deformations caused by objects that are in contact with the sensor and thereby estimate the applied force. We examine how this sensor design can be improved with respect to invariance to object reflectivity, signal-to-noise ratio, and continuous operation when switching between the distance and force measurement regimes. By harnessing time-of-flight technology and optimizing the elastomer-air boundary to control the emitted light's path, we develop a sensor that is able to seamlessly transition between measuring distances of up to 50 mm and contact forces of up to 10 newtons. We demonstrate that our sensor improves manipulation accuracy in a block unstacking task. Thorough instructions for manufacturing the sensor from inexpensive, commercially available components are provided, as well as all relevant hardware design files and software sources.
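As a rough illustration of the dual distance/force behavior described in the abstract, the sketch below maps a single ToF range sample to either a proximity estimate (object seen through the clear elastomer) or a force estimate (elastomer compressed by contact). The rest thickness, stiffness, and noise band are hypothetical calibration constants, and the linear spring model is an assumption; the paper's actual force mapping is obtained by calibration and need not be linear.

```python
"""Hedged sketch: interpreting one ToF range reading from a
clear-elastomer-covered fingertip sensor as either a proximity
distance (pre-touch) or a contact force (post-touch).
Constants below are illustrative assumptions, not values from the paper."""

from dataclasses import dataclass

D0_MM = 4.0        # assumed rest distance to the elastomer-air interface
K_N_PER_MM = 3.0   # assumed effective elastomer stiffness
NOISE_MM = 0.3     # assumed noise band around the rest reading
MAX_RANGE_MM = 50.0
MAX_FORCE_N = 10.0

@dataclass
class Reading:
    mode: str        # "proximity", "contact", or "out_of_range"
    distance_mm: float = float("nan")
    force_n: float = 0.0

def interpret(range_mm: float) -> Reading:
    """Map one raw ToF range sample to a proximity or force estimate."""
    if range_mm > MAX_RANGE_MM:
        return Reading(mode="out_of_range")
    if range_mm > D0_MM + NOISE_MM:
        # Object seen through the clear elastomer but not yet touching:
        # report its distance from the outer elastomer surface.
        return Reading(mode="proximity", distance_mm=range_mm - D0_MM)
    # Reading at or below the rest thickness: the elastomer is being
    # compressed, so convert the indentation into a force estimate.
    indentation = max(0.0, D0_MM - range_mm)
    force = min(K_N_PER_MM * indentation, MAX_FORCE_N)
    return Reading(mode="contact", force_n=force)

if __name__ == "__main__":
    for r in (60.0, 22.5, 4.1, 2.0):
        print(r, interpret(r))
```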
Award ID(s):
1832795
NSF-PAR ID:
10127832
Author(s) / Creator(s):
Date Published:
Journal Name:
2019 International Conference on Robotics and Automation (ICRA)
Page Range / eLocation ID:
3797 to 3803
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The most common sensing modalities found in a robot perception system are vision and touch, which together can provide global and highly localized data for manipulation. However, these sensing modalities often fail to adequately capture the behavior of target objects during the critical moments as they transition from static, controlled contact with an end-effector to dynamic, uncontrolled motion. In this work, we present a novel multimodal visuotactile sensor that provides simultaneous visuotactile and proximity depth data. The sensor integrates an RGB camera and an air pressure sensor to sense touch with an infrared time-of-flight (ToF) camera to sense proximity, leveraging a selectively transmissive soft membrane to enable the dual sensing modalities. We present the mechanical design, fabrication techniques, algorithm implementations, and evaluation of the sensor's tactile and proximity modalities. The sensor is demonstrated in three open-loop robotic tasks: approaching and contacting an object, catching, and throwing. The fusion of tactile and proximity data could be used to capture key information about a target object's transition behavior for sensor-based control in dynamic manipulation. A minimal sketch of fusing the pressure and depth channels appears after this list.
  2. We present iSoft, a single-volume soft sensor capable of sensing real-time continuous contact and unidirectional stretching. We propose a low-cost, easy way to fabricate such piezoresistive, elastomer-based soft sensors for instant interactions. We employ an electrical impedance tomography (EIT) technique to estimate changes in the resistance distribution on the sensor caused by fingertip contact. To compensate for the rebound elasticity of the elastomer and achieve real-time continuous contact sensing, we apply a dynamic baseline update for EIT. The baseline updates are triggered by fingertip contact and movement detection. Further, we support unidirectional stretch sensing using a model-based approach that works separately from continuous contact sensing. We also provide a software toolkit for users to design and deploy personalized interfaces with customized sensors. Through a series of experiments and evaluations, we validate the performance of contact and stretch sensing. Through example applications, we show the variety of interactions enabled by iSoft. A sketch of the dynamic-baseline idea appears after this list.
  3. In the future, human-robot interaction will include collaboration in close quarters where the environment geometry is partially unknown. As a means for enabling such interaction, this paper presents a multi-modal sensor array capable of contact detection and localization, force sensing, proximity sensing, and mapping. The sensor array integrates Hall effect and time-of-flight (ToF) sensors in an I²C communication network. The design, fabrication, and characterization of the sensor array for a future in-situ collaborative continuum robot are presented. Possible perception benefits of the sensor array are demonstrated for accidental contact detection, mapping of the environment, selection of admissible zones for bracing, and constrained motion control of the end effector while maintaining a bracing constraint with an admissible rolling motion.
  4. This paper presents methods for improved teleoperation in dynamic environments in which the objects to be manipulated are moving, but vision may not meet size, biocompatibility, or maneuverability requirements. In such situations, the object could be tracked through non-geometric means, such as heat, radioactivity, or other markers. In order to safely explore a region, we use an optical time-of-flight pretouch sensor to detect (and range) target objects prior to contact. Information from these sensors is presented to the user via haptic virtual fixtures. This combination of techniques allows the teleoperator to "feel" the object without an actual contact event between the robot and the target object. It thus provides the perceptual benefits of touch interaction to the operator, without incurring the negative consequences of the robot contacting unknown geometrical structures; premature contact can lead to damage or unwanted displacement of the target. The authors propose that as the geometry of the scene transitions from completely unknown to partially explored, haptic virtual fixtures can both prevent collisions and guide the user towards areas of interest, thus improving exploration speed. Experimental results show that for situations that are not amenable to vision, haptically presented pretouch sensor information allows operators to more effectively explore moving objects. A sketch of a pretouch-driven virtual fixture appears after this list.
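For item 1, the following is a minimal sketch of fusing the air-pressure tactile channel with the ToF proximity depth channel into a single contact/proximity estimate, in the spirit of the visuotactile sensor described above. The thresholds, units, and depth-image layout are illustrative assumptions, not details from that paper.

```python
"""Hedged sketch: fusing a pressure-based touch signal with a ToF
proximity depth image. Thresholds and units are assumptions."""

import numpy as np

PRESSURE_BASELINE_KPA = 101.3   # assumed ambient reading of the internal pressure sensor
CONTACT_DELTA_KPA = 0.8         # assumed pressure rise indicating membrane contact

def fuse(pressure_kpa: float, depth_mm: np.ndarray) -> dict:
    """Return a fused estimate: contact flag plus nearest-object range."""
    in_contact = (pressure_kpa - PRESSURE_BASELINE_KPA) > CONTACT_DELTA_KPA
    valid = depth_mm[np.isfinite(depth_mm) & (depth_mm > 0)]
    nearest_mm = float(valid.min()) if valid.size else float("inf")
    return {"contact": in_contact, "nearest_mm": nearest_mm}

if __name__ == "__main__":
    depth = np.full((8, 8), 35.0)
    depth[3, 4] = 12.5              # a nearby object in one pixel
    print(fuse(101.4, depth))       # approach phase: no contact
    print(fuse(102.6, depth))       # membrane pressed: contact
```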
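For item 2, the sketch below illustrates only the dynamic-baseline idea for EIT-based contact sensing: when no contact is detected, the baseline of boundary measurements is slowly blended toward the current frame to absorb the elastomer's slow rebound. The actual iSoft pipeline reconstructs a resistance-distribution image; that reconstruction step is omitted here, and the threshold and blending rate are assumptions.

```python
"""Hedged sketch: dynamic baseline update for EIT boundary measurements."""

import numpy as np

CONTACT_THRESH = 0.05   # assumed normalized deviation indicating contact
ALPHA = 0.02            # assumed baseline blending rate when idle

class EITBaseline:
    def __init__(self, first_frame: np.ndarray):
        self.baseline = first_frame.astype(float).copy()

    def update(self, frame: np.ndarray) -> tuple[bool, np.ndarray]:
        """Return (contact_detected, difference_signal) for one EIT frame."""
        diff = frame - self.baseline
        scale = np.max(np.abs(self.baseline)) + 1e-9
        contact = np.max(np.abs(diff)) / scale > CONTACT_THRESH
        if not contact:
            # No touch detected: slowly track drift from the elastomer's rebound.
            self.baseline = (1 - ALPHA) * self.baseline + ALPHA * frame
        return contact, diff

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rest = np.ones(40)                      # idealized boundary measurements at rest
    eit = EITBaseline(rest)
    print(eit.update(rest + 0.001 * rng.standard_normal(40))[0])  # False: idle drift
    print(eit.update(rest + 0.2)[0])                              # True: touch-like deviation
```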
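For item 4, the sketch below shows one simple way a pretouch range reading could drive a forbidden-region haptic virtual fixture: a spring-like force pushes the operator back along the approach direction once the sensed object comes within a standoff distance, before any physical contact occurs. The stiffness and standoff values are illustrative, not taken from that paper.

```python
"""Hedged sketch: forbidden-region virtual fixture driven by a
pretouch range reading. Constants are assumptions."""

STANDOFF_MM = 15.0   # assumed radius of the forbidden region around the surface
K_FIXTURE = 0.25     # assumed fixture stiffness, N per mm of penetration

def fixture_force(pretouch_range_mm: float) -> float:
    """Force (N) pushing the operator back along the approach direction."""
    penetration = STANDOFF_MM - pretouch_range_mm
    return K_FIXTURE * penetration if penetration > 0 else 0.0

if __name__ == "__main__":
    for r in (40.0, 15.0, 8.0, 2.0):
        print(f"range {r:5.1f} mm -> fixture force {fixture_force(r):.2f} N")
```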