Title: ShapeMap 3-D: Efficient shape mapping through dense touch and vision
Knowledge of 3-D object shape is of great importance to robot manipulation tasks, but it may not be readily available in unstructured environments. While vision is often occluded during robot-object interaction, high-resolution tactile sensors can provide a dense local view of the object. However, tactile sensors have a limited sensing area, so the shape representation must faithfully approximate non-contact regions. A further challenge is efficiently incorporating these dense tactile measurements into a 3-D mapping framework. In this work, we propose an incremental shape-mapping method using a GelSight tactile sensor and a depth camera. Local shape is recovered from tactile images via a learned model trained in simulation. Through efficient inference on a spatial factor graph informed by a Gaussian process, we build an implicit surface representation of the object. We demonstrate visuo-tactile mapping in both simulated and real-world experiments, incrementally building 3-D reconstructions of household objects.
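
A minimal sketch of the Gaussian-process implicit-surface idea behind this kind of mapping, assuming scikit-learn and a toy sphere standing in for fused tactile/depth contact points; the paper's factor-graph inference is not reproduced, and all names and numbers here are illustrative:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
# Toy stand-in for fused tactile/depth contact points: a unit sphere.
surface = rng.normal(size=(200, 3))
surface /= np.linalg.norm(surface, axis=1, keepdims=True)

# Signed-distance labels: 0 on the surface, -1 inside, +1 at far anchors.
X = np.vstack([surface, [[0.0, 0.0, 0.0]], 2 * np.eye(3), -2 * np.eye(3)])
y = np.concatenate([np.zeros(len(surface)), [-1.0], np.ones(6)])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-3)
gp.fit(X, y)

# Query along a ray from the interior: the first zero crossing of the
# implicit function marks the reconstructed surface, and the predictive
# std quantifies uncertainty over regions that were never touched.
ray = np.column_stack([np.linspace(0.0, 2.0, 100),
                       np.zeros(100), np.zeros(100)])
sdf, std = gp.predict(ray, return_std=True)
crossing = ray[np.argmax(sdf >= 0.0), 0]
print(f"estimated surface radius ~ {crossing:.2f} (ground truth 1.0)")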
Award ID(s):
2008279
PAR ID:
10388410
Author(s) / Creator(s):
Date Published:
Journal Name:
IEEE International Conference on Robotics and Automation
Page Range / eLocation ID:
7073 to 7080
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. During in-hand manipulation, robots must continuously estimate the pose of the object in order to generate appropriate control actions. The performance of pose-estimation algorithms hinges on the robot's sensors detecting discriminative geometric object features, but previous sensing modalities cannot make such measurements robustly: the robot's fingers can occlude the view of environment- or robot-mounted image sensors, and tactile sensors can only measure at the local areas of contact. Motivated by the robustness of fingertip-embedded proximity sensors to occlusion and their ability to measure beyond the local areas of contact, we present the first evaluation of proximity-sensor-based pose estimation for in-hand manipulation. We develop a novel two-fingered hand with fingertip-embedded optical time-of-flight proximity sensors as a testbed for pose estimation during planar in-hand manipulation. Here, the in-hand manipulation task consists of the robot moving a cylindrical object from one end of its workspace to the other. We demonstrate, with statistical significance, that proximity-sensor-based pose estimation via particle filtering during in-hand manipulation: a) exhibits 50% lower average pose error than a tactile-sensor-based baseline; and b) enables a model predictive controller to achieve 30% lower final positioning error than when using tactile-sensor-based pose estimates.
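
    A minimal particle-filter sketch of this idea, assuming a planar cylinder of known radius, three illustrative fingertip range-sensor positions, and a Gaussian range-noise model; none of this is the paper's implementation:

    import numpy as np

    rng = np.random.default_rng(1)
    N = 500
    # Particles over the planar (x, y) position of the cylinder's axis.
    particles = rng.uniform(-0.05, 0.05, size=(N, 2))
    weights = np.full(N, 1.0 / N)

    SENSORS = np.array([[0.03, 0.0], [-0.03, 0.0], [0.0, 0.03]])  # positions (m)
    RADIUS = 0.01    # known cylinder radius (m)
    NOISE = 0.002    # range measurement noise std (m)

    def predicted_range(pose, sensor):
        # Distance from a fingertip sensor to the cylinder surface.
        return np.linalg.norm(pose - sensor, axis=-1) - RADIUS

    def update(particles, weights, measured):
        # Importance-weight particles by the time-of-flight range readings.
        for sensor, z in zip(SENSORS, measured):
            err = predicted_range(particles, sensor) - z
            weights = weights * np.exp(-0.5 * (err / NOISE) ** 2)
        weights /= weights.sum()
        # Systematic resampling, plus small jitter as process noise.
        u = (rng.uniform() + np.arange(N)) / N
        idx = np.minimum(np.searchsorted(np.cumsum(weights), u), N - 1)
        return particles[idx] + rng.normal(0, 1e-3, (N, 2)), np.full(N, 1.0 / N)

    true_pose = np.array([0.01, 0.005])
    for _ in range(20):
        z = predicted_range(true_pose, SENSORS) + rng.normal(0, NOISE, 3)
        particles, weights = update(particles, weights, z)
    print("estimate:", particles.mean(axis=0), "truth:", true_pose)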
  2. The connection between visual input and tactile sensing is critical for object-manipulation tasks such as grasping and pushing. In this work, we introduce the challenging task of estimating a set of tactile physical properties from visual information. We aim to build a model that learns the complex mapping between visual information and tactile physical properties. We construct a first-of-its-kind image-tactile dataset with over 400 multiview image sequences and the corresponding tactile properties. A total of fifteen tactile physical properties across categories including friction, compliance, adhesion, texture, and thermal conductance are measured and then estimated by our models. We develop a cross-modal framework composed of an adversarial objective and a novel visuo-tactile joint classification loss. Additionally, we introduce a neural architecture search framework capable of selecting optimal combinations of viewing angles for estimating a given physical property.
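
    An illustrative PyTorch sketch of the cross-modal idea: a visual encoder regresses the fifteen tactile properties, and a classifier shared across visual and tactile embeddings stands in for a visuo-tactile joint classification loss. The adversarial objective is omitted, and the architecture, dimensions, and class count are assumptions, not the authors' model:

    import torch
    import torch.nn as nn

    NUM_PROPS, NUM_CLASSES, EMB = 15, 10, 128

    visual_enc = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, EMB), nn.ReLU())
    tactile_enc = nn.Sequential(nn.Linear(NUM_PROPS, EMB), nn.ReLU())
    regressor = nn.Linear(EMB, NUM_PROPS)    # visual embedding -> tactile properties
    joint_cls = nn.Linear(EMB, NUM_CLASSES)  # classifier shared by both modalities

    opt = torch.optim.Adam(
        [*visual_enc.parameters(), *tactile_enc.parameters(),
         *regressor.parameters(), *joint_cls.parameters()], lr=1e-4)

    def step(images, props, labels):
        # One training step: property regression + joint classification loss.
        zv, zt = visual_enc(images), tactile_enc(props)
        loss = nn.functional.mse_loss(regressor(zv), props)          # estimation
        loss += nn.functional.cross_entropy(joint_cls(zv), labels)   # visual branch
        loss += nn.functional.cross_entropy(joint_cls(zt), labels)   # tactile branch
        opt.zero_grad(); loss.backward(); opt.step()
        return loss.item()

    # Dummy batch: 8 images (3x64x64), 15 measured properties, class labels.
    print(step(torch.randn(8, 3, 64, 64), torch.randn(8, NUM_PROPS),
               torch.randint(0, NUM_CLASSES, (8,))))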
  3. This paper introduces FingerVision, a vision-based tactile sensor, and explores its usefulness in tactile behaviors. FingerVision consists of a transparent elastic skin marked with dots and a camera; it is easy to fabricate, low cost, and physically robust. Unlike other vision-based tactile sensors, the complete transparency of the FingerVision skin provides multimodal sensation. The modalities sensed by FingerVision include distributions of force and slip, as well as object information such as distance, location, pose, size, shape, and texture. Slip detection is highly sensitive because it is obtained by applying computer vision directly to the output of the FingerVision camera. It provides high-resolution slip detection that does not depend on contact force, i.e., it can sense the slip of a lightweight object that generates negligible contact force. The tactile behaviors explored in this paper include manipulations that exploit this feature. For example, we demonstrate that grasp adaptation with FingerVision can grasp origami and other deformable and fragile objects such as vegetables, fruits, and raw eggs.
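
    A hedged OpenCV sketch of vision-based slip detection in this style: track the dot markers between consecutive frames with sparse optical flow and flag slip when the marker field shifts coherently. The capture source, thresholds, and feature parameters are assumptions, not the FingerVision code:

    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)  # the sensor's camera (device index assumed)
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    # Detect the dots printed on the elastic skin as trackable features.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                  qualityLevel=0.1, minDistance=8)

    SLIP_PX = 1.5  # mean marker displacement (pixels) treated as slip

    while pts is not None and len(pts) > 5:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        good = status.ravel() == 1
        flow = (nxt[good] - pts[good]).reshape(-1, 2)
        # A coherent shift of many markers suggests the object is slipping,
        # even when the contact force is negligible.
        if len(flow) and np.linalg.norm(flow.mean(axis=0)) > SLIP_PX:
            print("slip detected")
        prev_gray, pts = gray, nxt[good].reshape(-1, 1, 2)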
  4. Robotic grasping is successful when a robot can sense and grasp an object without letting it slip. Beyond industrial robotic tasks, there are two main grasping methods. The first is planning-based grasping, in which the object geometry is known beforehand and stable grasps are calculated algorithmically [1]. The second uses tactile feedback; currently, capacitive sensors are placed beneath stiff pads on the front of robotic fingers [2]. With post-execution grasp-adjustment procedures to estimate grasp stability, a support vector machine classifier can distinguish stable from unstable grasps, with 81% accuracy across the classes of tested objects [1]. We propose to improve the classifier's accuracy by wrapping flexible sensors around the robotic finger to gain information from the edges and sides of the finger.
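
    A minimal scikit-learn sketch of an SVM grasp-stability classifier of the kind the abstract describes; the feature layout, dimensions, and placeholder data are assumptions for illustration only:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(2)
    # Placeholder dataset: 16 capacitive-taxel readings per grasp attempt.
    X = rng.normal(size=(400, 16))
    y = (X[:, :8].mean(axis=1) > 0).astype(int)  # synthetic stable/unstable label

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")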
  5. Madden, John D.; Anderson, Iain A.; Shea, Herbert R. (Eds.)
    Current robotic sensing is mainly visual, which is useful only up to the point of contact. To understand how an object is being gripped, tactile feedback is needed. Human grasp is gentle yet firm, with integrated tactile feedback. Ras Labs makes Synthetic Muscle™, a class of electroactive polymer (EAP) materials and actuators that sense pressure from gentle touch to high impact, controllably contract and expand at low (battery-level) voltage, and attenuate force. Developing this technology for sensing has produced fingertip-like sensors that detect very light pressures down to 0.01 N, and even 0.005 N, across a wide pressure range up to 25 N and beyond, with high linearity. These soft yet robust Tactile Fingertip™ sensors generate immediate feedback at the first point of contact, and because the elastomeric pads provide a soft, compliant interface, that first contact does not apply excessive force, allowing gentle object handling and control of the force applied to the object. The Tactile Fingertip can also detect a change in pressure location on its surface: directional glide provides real-time feedback, making it possible to detect and prevent slippage by adjusting the grip strength. Machine learning (ML) and artificial intelligence (AI) were integrated into these sensors for object identification, along with determining a good grip (position, grip force, no slip, no wobble) for pick-and-place and other applications. Synthetic Muscle™ is also being retrofitted as actuators into a biomimetic, human-hand-like gripper. The combination of EAP shape morphing and sensing promises robotic grippers with human-hand-like control and tactile sensing. This is expected to advance robotics in agriculture, medical surgery, therapeutic and personal care, and extreme environments that humans cannot enter, including those with incurable contagions, as well as collaborative robotics that allows humans and robots to work together safely, effectively, and intuitively.
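
    An illustrative sketch of the ML-for-grip idea: classify grip quality from a short fingertip pressure time series using simple hand-crafted features. The features, labels, and synthetic data are placeholders, not Ras Labs' pipeline:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(3)
    # 300 grasps x 50 pressure samples (N); label 1 marks slip/wobble.
    pressure = rng.uniform(0.005, 25.0, size=(300, 50))
    labels = (np.abs(np.diff(pressure, axis=1)).mean(axis=1) > 8.0).astype(int)

    # Simple features: mean pressure level, peak, and fluctuation magnitude.
    feats = np.column_stack([pressure.mean(axis=1), pressure.max(axis=1),
                             np.abs(np.diff(pressure, axis=1)).mean(axis=1)])
    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(feats[:200], labels[:200])
    print("held-out accuracy:", clf.score(feats[200:], labels[200:]))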