Title: GradTac: Spatio-Temporal Gradient Based Tactile Sensing
Tactile sensing for robotics is achieved through a variety of mechanisms, including magnetic, optical-tactile, and conductive-fluid sensing. Currently, fluid-based sensors strike the right balance between anthropomorphic size and shape and accuracy of tactile response measurement. However, this design suffers from a low Signal-to-Noise Ratio (SNR) because the fluid-based sensing mechanism "damps" the measurements in ways that are hard to model. To this end, we present a spatio-temporal gradient representation of the data obtained from fluid-based tactile sensors, inspired by neuromorphic principles of event-based sensing. We present a novel algorithm (GradTac) that converts discrete data points from spatial tactile sensors into spatio-temporal surfaces and tracks tactile contours across these surfaces. Processing the tactile data in the proposed spatio-temporal domain is robust, less susceptible to the inherent noise of fluid-based sensors, and allows more accurate tracking of regions of touch than using the raw data. We evaluate and demonstrate the efficacy of GradTac on many real-world experiments performed using the Shadow Dexterous Hand equipped with BioTac SP sensors. Specifically, we use it for tracking tactile input across the sensor's surface, measuring relative forces, detecting linear and rotational slip, and edge tracking. We also release an accompanying task-agnostic dataset for the BioTac SP, which we hope will provide a resource to compare and quantify various novel approaches, and motivate further research.
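The core idea above, converting raw taxel readings into spatio-temporal gradient surfaces and tracking the region of touch across them, can be sketched as follows. This is a minimal illustration on a toy taxel grid: the array layout, the function names, and the peak-tracking step are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def spatio_temporal_gradient(frames):
    """Compute temporal and spatial gradients over a stack of taxel frames.

    frames: array of shape (T, H, W) -- pressure values from a taxel grid
    over T time steps (an illustrative layout, not the BioTac SP's).
    Returns (gt, gy, gx): gradients along time and the two spatial axes.
    """
    gt, gy, gx = np.gradient(frames.astype(float))
    return gt, gy, gx

def track_contact(frames):
    """Track the taxel with the strongest spatial change in each frame --
    a crude stand-in for contour tracking on the gradient surface."""
    _, gy, gx = spatio_temporal_gradient(frames)
    mag = np.hypot(gy, gx)  # spatial gradient magnitude per frame
    return [np.unravel_index(np.argmax(m), m.shape) for m in mag]

# Toy example: a Gaussian "touch" sliding across a 5x5 grid over 10 steps
T, H, W = 10, 5, 5
ys, xs = np.mgrid[0:H, 0:W]
frames = np.stack([np.exp(-((ys - 2) ** 2 + (xs - 0.4 * t) ** 2))
                   for t in range(T)])
path = track_contact(frames)  # one (row, col) estimate per time step
```

The gradient surfaces make the moving contact stand out even when the absolute pressure values drift, which is the property the abstract attributes to the spatio-temporal domain.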
Award ID(s):
1824198 2020624
PAR ID:
10376845
Author(s) / Creator(s):
Date Published:
Journal Name:
Frontiers in Robotics and AI
Volume:
9
ISSN:
2296-9144
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Scalable, high-density electronic skins (e-skins) are a desirable goal of tactile sensing. However, realizing this goal has been elusive due to the trade-off between spatial and temporal resolution from which current tactile sensors suffer. Additionally, as tactile sensing grids become large, wiring becomes unmanageable, and there is a need for a wireless approach. In this work, a scalable, event-based, passive tactile sensing system is proposed, based on radio-frequency identification (RFID) technology. An RFID-based tactile sensing hand is developed with 19 pressure-sensing taxels. The taxels are read wirelessly using a single 'hand-shaped' RFID antenna. Each RFID tag is transformed into a pressure sensor by disconnecting the RFID chip from its antenna and embedding both in soft elastomer, with an air gap introduced between the chip and its antenna. When a pressure event occurs, the RFID chip contacts its antenna, receives power, and communicates with the RFID reader. Thus, the sensor is transformed into a biomimetic event-based sensor whose response is activated only when used. Further, this work demonstrates the feasibility of constructing event-based, passive sensing grids that can be read wirelessly. Future tactile sensing e-skins can utilize this approach to become scalable and dense while retaining high temporal resolution. Moreover, this approach can be applied beyond tactile sensing, for the development of scalable, high-density sensors of any modality.
  2. In this work we show how to build a technology platform for cognitive imaging sensors using recent advances in recurrent neural network architectures and biologically inspired training methods. We demonstrate learning and processing tasks specific to imaging sensors, including enhancement of sensitivity and signal-to-noise ratio (SNR) purely through neural filtering, beyond the fundamental limits of sensor materials, as well as the inference and spatio-temporal pattern recognition capabilities of these networks, with applications in object detection, motion tracking, and prediction. We then show designs of unit hardware cells, built using complementary metal-oxide-semiconductor (CMOS) and emerging materials technologies, for ultra-compact and energy-efficient embedded neural processors for smart cameras.
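The SNR-enhancement idea in item 2 can be illustrated with a toy recurrent filter: a single hand-tuned leaky-integrator unit standing in for the trained recurrent network the abstract describes. The dynamics, the fixed gain, and the noise model are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def leaky_recurrent_filter(x, alpha=0.2):
    """A single recurrent unit: h[t] = (1 - alpha) * h[t-1] + alpha * x[t].
    A trained recurrent network would learn such dynamics; here the
    gain `alpha` is hand-picked for illustration."""
    h = np.zeros(len(x))
    for t in range(1, len(x)):
        h[t] = (1.0 - alpha) * h[t - 1] + alpha * x[t]
    return h

def snr_db(clean, estimate):
    """SNR of `estimate` against the known clean signal, in dB."""
    err = estimate - clean
    return 10.0 * np.log10(np.sum(clean ** 2) / np.sum(err ** 2))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0 * np.pi, 500)
clean = np.sin(t)                               # slowly varying "scene"
noisy = clean + 0.5 * rng.standard_normal(500)  # additive sensor read noise
filtered = leaky_recurrent_filter(noisy)        # recurrent denoising
```

Because the recurrence averages over past samples, `filtered` recovers the slow signal with substantially less noise than the raw readout, which is the filtering effect the abstract attributes to the network.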
  3. During in-hand manipulation, robots must be able to continuously estimate the pose of the object in order to generate appropriate control actions. The performance of algorithms for pose estimation hinges on the robot's sensors being able to detect discriminative geometric object features, but previous sensing modalities are unable to make such measurements robustly. The robot's fingers can occlude the view of environment- or robot-mounted image sensors, and tactile sensors can only measure at the local areas of contact. Motivated by fingertip-embedded proximity sensors' robustness to occlusion and ability to measure beyond the local areas of contact, we present the first evaluation of proximity sensor based pose estimation for in-hand manipulation. We develop a novel two-fingered hand with fingertip-embedded optical time-of-flight proximity sensors as a testbed for pose estimation during planar in-hand manipulation. Here, the in-hand manipulation task consists of the robot moving a cylindrical object from one end of its workspace to the other. We demonstrate, with statistical significance, that proximity-sensor based pose estimation via particle filtering during in-hand manipulation: a) exhibits 50% lower average pose error than a tactile-sensor based baseline; b) empowers a model predictive controller to achieve 30% lower final positioning error compared to when using tactile-sensor based pose estimates. 
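The particle-filtering pose estimation in item 3 can be sketched in one dimension. The toy sensor model here (a proximity reading equal to the object position) and all noise levels are assumptions for illustration, not the paper's setup.

```python
import numpy as np

def pf_step(particles, z, motion, rng, sigma_m=0.05, sigma_z=0.3):
    """One predict/update/resample cycle of a 1-D pose particle filter."""
    # predict: propagate each hypothesis through the commanded motion
    particles = particles + motion + rng.normal(0.0, sigma_m, particles.size)
    # update: weight hypotheses by the likelihood of the proximity reading
    w = np.exp(-0.5 * ((z - particles) / sigma_z) ** 2)
    w /= w.sum()
    # resample: draw a new particle set proportional to the weights
    idx = rng.choice(particles.size, size=particles.size, p=w)
    return particles[idx]

rng = np.random.default_rng(0)
particles = rng.uniform(-5.0, 5.0, 1000)   # uniform prior over object pose
true_pos = 0.0
for _ in range(20):
    true_pos += 0.1                        # object pushed 0.1 per step
    z = true_pos + rng.normal(0.0, 0.3)    # noisy proximity reading
    particles = pf_step(particles, z, motion=0.1, rng=rng)
estimate = particles.mean()                # posterior mean tracks true_pos
```

The posterior mean converges toward the true pose (2.0 after 20 steps) even though each individual reading is noisy, which is why the filtered estimate can outperform any single measurement.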
  4. Abstract This paper focuses on the modeling and development of engineered ionic polymer-metal composite (eIPMC) sensors for applications such as postural and tactile measurement in mechatronics/robotics-assisted finger rehabilitation therapy. Specifically, to tailor the sensitivity of the device, eIPMCs, fabricated using a polymer-surface abrading technique, are utilized as the sensing element. An enhanced chemoelectromechanical model is developed that captures the effect of the abrading process on the multiphysics sensing behavior under different loading conditions. The fabricated sensors are characterized using scanning electron microscopy imaging and cyclic voltammetry and chronoamperometry. Results show significant improvement in the electrochemical properties, including charge storage, double layer capacitance, and surface conductance, compared to the control samples. Finally, prototype postural-tactile finger sensors composed of different eIPMC variants are created and their performance validated under postural and tactile experiments. The tailored eIPMC sensors show increased open-circuit voltage response compared to control IPMCs, with 7.7- and 4.7-times larger peak-to-peak bending response under postural changes, as well as a 3.2-times more sensitive response under compression during tactile loading, demonstrating the feasibility of eIPMC sensors. 
  5. Tactile sensing is pivotal for enhancing robot manipulation abilities by providing crucial feedback for localized information. However, existing sensors often lack the necessary resolution and bandwidth required for intricate tasks. To address this gap, we introduce VibTac, a novel multi-modal tactile sensing finger designed to offer high-resolution and high-bandwidth tactile sensing simultaneously. VibTac seamlessly integrates vision-based and vibration-based tactile sensing modes to achieve high-resolution and high-bandwidth tactile sensing respectively, leveraging a streamlined human-inspired design for versatility in tasks. This paper outlines the key design elements of VibTac and its fabrication methods, highlighting the significance of the Elastomer Gel Pad (EGP) in its sensing mechanism. The sensor’s multi-modal performance is validated through 3D reconstruction and spectral analysis to discern tactile stimuli effectively. In experimental trials, VibTac demonstrates its efficacy by achieving over 90% accuracy in insertion tasks involving objects emitting distinct sounds, such as ethernet connectors. Leveraging vision-based tactile sensing for object localization and employing a deep learning model for “click” sound classification, VibTac showcases its robustness in real-world scenarios. Video of the sensor working can be accessed at https://youtu.be/kmKIUlXGroo. 
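The spectral "click" detection mentioned in item 5 can be approximated, for illustration, with a simple spectral-energy ratio in place of the deep learning classifier. The sampling rate, cutoff frequency, and test signals are all assumed, not taken from the paper.

```python
import numpy as np

def click_score(signal, fs=8000, cutoff=1000.0):
    """Fraction of spectral energy above `cutoff` Hz -- a hand-rolled
    stand-in for a learned "click" classifier (illustrative only)."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return power[freqs >= cutoff].sum() / power.sum()

fs = 8000
t = np.arange(0, 0.05, 1.0 / fs)          # 50 ms analysis window
hum = np.sin(2 * np.pi * 60 * t)          # low-frequency background vibration
click = np.exp(-200 * t) * np.sin(2 * np.pi * 3000 * t)  # decaying burst
```

A short, sharp mechanical event like a connector snapping home concentrates energy at high frequencies, so `click_score(click)` is close to 1 while `click_score(hum)` is close to 0; a threshold on this ratio separates the two.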