Title: Contact Localization for Transparent Robots using Velocity Constraints
Robots operating in unstructured environments must localize contact to detect and recover from failure. For example, Fig. 1 shows a Minitaur robot that must localize where it has unexpectedly contacted the stair's edge so that it can properly step over it. We propose a kinematic method for proprioceptive contact localization using velocity measurements. The method is validated on two planar robots, the quadrupedal Minitaur and the DD Hand gripper, and compared to other state-of-the-art proprioceptive methods. We further show that the method can be extended to spatial robots by fusing the candidate contact points over time with a particle filter. (A minimal illustrative sketch of the velocity-constraint idea appears after the record metadata below.)
Award ID(s):
1813920
PAR ID:
10167948
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of Dynamic Walking 2020
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
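To illustrate the velocity-constraint idea from the abstract above, here is a minimal planar sketch, not the authors' exact formulation: assuming a sticking, pivot-like contact, the contact point on a link is the point whose rigid-body velocity vanishes, i.e., the link's instantaneous center of rotation computed from a measured twist. The function name, frames, and the projection onto the link segment are illustrative assumptions.

```python
import numpy as np

def candidate_contact_point(v_ref, omega, ref, link_start, link_end):
    """Hypothetical planar candidate contact point for one rigid link.

    Assumes a sticking (pivot-like) contact, so the contact point is the
    link's instantaneous center of rotation: the point whose rigid-body
    velocity is zero given the measured twist (v_ref, omega).

    v_ref : (2,) linear velocity of the link's reference point [m/s]
    omega : angular velocity of the link [rad/s]
    ref   : (2,) world-frame position of the reference point [m]
    link_start, link_end : (2,) world-frame endpoints of the link [m]
    """
    if abs(omega) < 1e-6:
        return None  # nearly pure translation: no finite zero-velocity point
    # v_p = v_ref + omega x (p - ref) = 0  =>  p = ref + [-v_y, v_x] / omega
    icr = np.asarray(ref) + np.array([-v_ref[1], v_ref[0]]) / omega
    # Keep only physically meaningful candidates: project the zero-velocity
    # point onto the link segment.
    start, end = np.asarray(link_start), np.asarray(link_end)
    d = end - start
    t = np.clip(np.dot(icr - start, d) / np.dot(d, d), 0.0, 1.0)
    return start + t * d
```

For a multi-link robot, each link yields one such candidate from its measured twist; the spatial extension mentioned in the abstract fuses these candidates over time (see the particle-filter sketch after the related-records list below).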
More Like this
  1. Localizing contacts and collisions is an important aspect of failure detection and recovery for robots and can aid perception and exploration of the environment. Contrary to state-of-the-art methods that rely on forces and torques measured on the robot, this paper proposes a kinematic method for proprioceptive contact localization on compliant robots using velocity measurements. The method is validated on two planar robots, the quadrupedal Minitaur and the two-fingered Direct Drive (DD) Hand, which are compliant due to inherent transparency from direct drive actuation. Comparisons to other state-of-the-art proprioceptive methods are shown in simulation. Preliminary results on further extensions to complex geometry (through numerical methods) and spatial robots (with a particle filter; a minimal filtering sketch appears after this list) are discussed.
  2. Detecting and localizing contacts is essential for robot manipulators to perform contact-rich tasks in unstructured environments. While robot skins can localize contacts on the surface of robot arms, these sensors are not yet robust or easily accessible. As such, prior works have explored using proprioceptive observations, such as joint velocities and torques, to perform contact localization. Many past approaches assume that the robot is static at the moment of contact, that only a single contact occurs at a time, or that accurate dynamics models and joint torque sensing are available. In this work, we relax these assumptions and propose using Domain Randomization to train a neural network to localize contacts of robot arms in motion without joint torque observations. Our method uses a novel cylindrical projection encoding of the robot arm surface, which allows the network to use convolution layers to process input features and transposed convolution layers to predict contacts (an illustrative encoder-decoder sketch appears after this list). The trained network achieves a contact detection accuracy of 91.5% and a mean contact localization error of 3.0 cm. We further demonstrate an application of the contact localization model in an obstacle mapping task, evaluated in both simulation and the real world.
  3. This paper reports on the development of DRIFT, a real-time invariant proprioceptive robot state estimation framework. A didactic introduction to invariant Kalman filtering is provided to make this cutting-edge symmetry-preserving approach accessible to a broader range of robotics applications. The work then develops a proprioceptive dead-reckoning framework that consumes only data from an onboard inertial measurement unit and the robot's kinematics, with two optional modules: a contact estimator and a gyro filter for low-cost robots. This enables a variety of robotic platforms to track the robot's state over long trajectories in the absence of perceptual data. Extensive real-world experiments using a legged robot, an indoor wheeled robot, a field robot, and a full-size vehicle, as well as simulation results with a marine robot, are provided to understand the limits of DRIFT.
  4. Collaborative localization is an essential capability for a team of robots, such as connected vehicles, to collaboratively estimate object locations from multiple perspectives through reliable cooperation. To enable collaborative localization, four key challenges must be addressed: modeling complex relationships between observed objects, fusing observations from an arbitrary number of collaborating robots, quantifying localization uncertainty, and addressing the latency of robot communications. In this paper, we introduce a novel approach that integrates uncertainty-aware spatiotemporal graph learning and model-based state estimation for a team of robots to collaboratively localize objects. Specifically, we introduce a new uncertainty-aware graph learning model that learns spatiotemporal graphs to represent the historical motions of the objects observed by each robot over time and provides uncertainties in object localization. Moreover, we propose a novel method for integrated learning and model-based state estimation, which fuses asynchronous observations obtained from an arbitrary number of robots for collaborative localization. We evaluate our approach in two collaborative object localization scenarios in simulations and on real robots. Experimental results show that our approach outperforms previous methods and achieves state-of-the-art performance on asynchronous collaborative localization.
  5. Soft robots actuate themselves and their world through induced pressure and strain, and can often sense these quantities as well. We hypothesize that coordination in a tightly coupled collective of soft robots can be achieved with purely proprioceptive sensing and no direct communication. In this paper, we target a platform of soft pneumatic modules capable of sensing strain on their perimeter, with the goal of using only the robots' own soft actuators and sensors as a medium for distributed coordination. However, methods for modelling, sensing, and controlling strain in such soft robot collectives are not well understood. To address this challenge, we introduce and validate a computationally efficient spring-based model for two-dimensional sheets of soft pneumatic robots. We then translate a classical consensus algorithm to use only proprioceptive data (a hedged single-module update sketch appears after this list), test it in simulation, and show that, due to the physical coupling between robots, consensus-like coordination can be achieved. We discuss the unique challenges of strain sensors and the next steps for bringing these findings to hardware. These findings are promising for smart materials and large-scale collectives because they eliminate the need for additional communication infrastructure to support coordination.
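For the particle-filter extension mentioned in the main abstract and in item 1, the following is a minimal, hypothetical one-dimensional sketch: particles parameterize the contact location by arc length along the robot's surface, and each new velocity-derived candidate acts as a noisy measurement. The noise scales and resampling threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

def contact_pf_step(particles, weights, candidate, rng,
                    meas_std=0.01, process_std=0.005):
    """One predict-update-resample step fusing noisy contact candidates.

    particles : (N,) arc-length positions of hypothesized contact points [m]
    weights   : (N,) normalized particle weights
    candidate : arc length of the latest velocity-derived candidate [m]
    """
    n = len(particles)
    # Predict: the true contact point drifts slowly, so diffuse the particles.
    particles = particles + rng.normal(0.0, process_std, size=n)
    # Update: weight each particle by how well it explains the new candidate.
    weights = weights * np.exp(-0.5 * ((particles - candidate) / meas_std) ** 2)
    weights = weights / (weights.sum() + 1e-12)
    # Systematic resampling when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < 0.5 * n:
        positions = (rng.random() + np.arange(n)) / n
        idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

# Example usage with made-up numbers:
rng = np.random.default_rng(0)
particles = rng.uniform(0.0, 0.3, size=200)       # 0.3 m of link surface
weights = np.full(200, 1.0 / 200)
for candidate in [0.12, 0.13, 0.11]:              # noisy per-step candidates
    particles, weights = contact_pf_step(particles, weights, candidate, rng)
estimate = np.sum(weights * particles)            # weighted-mean contact location
```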
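Item 2 describes convolution layers over a cylindrical projection of the arm surface and transposed convolution layers to predict contacts. Below is a minimal PyTorch sketch of such an encoder-decoder, not the paper's architecture; the class name, channel counts, and input layout (a B×C×H×W feature image whose spatial axes index the unrolled arm surface) are assumptions.

```python
import torch
import torch.nn as nn

class ContactMapSketch(nn.Module):
    """Hypothetical encoder-decoder over a cylindrical (unrolled) arm surface.

    Input : (B, C, H, W) proprioceptive feature image on the projected surface.
    Output: (B, 1, H, W) per-cell contact probability map.
    """
    def __init__(self, in_channels: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(  # convolutions process input features
            nn.Conv2d(in_channels, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(  # transposed convolutions predict contacts
            nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.decoder(self.encoder(x)))

# Example: a batch of 4 feature images on a 32x64 surface grid.
probs = ContactMapSketch()(torch.randn(4, 8, 32, 64))  # -> (4, 1, 32, 64)
```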
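Item 5 translates a classical consensus algorithm to use only proprioceptive data. The single-module update below is a hedged sketch of that idea under a strong assumption added purely for illustration: each perimeter strain reading is treated as roughly proportional to the difference between a neighbor's internal state and the module's own, so the usual consensus sum over neighbor differences can be evaluated from strain alone, with no communication.

```python
import numpy as np

def proprioceptive_consensus_step(setpoint, perimeter_strain,
                                  strain_gain=1.0, step_size=0.1):
    """One consensus-like update for a single soft module (illustrative).

    Assumption (not from the paper): the strain measured on the edge shared
    with neighbor j is approximately strain_gain * (x_j - x_i), so the
    classical update x_i += step_size * sum_j (x_j - x_i) can be computed
    from the module's own strain sensors.

    setpoint         : the module's current internal state / actuation set-point
    perimeter_strain : iterable of strain readings, one per neighboring edge
    """
    inferred_differences = np.asarray(perimeter_strain) / strain_gain
    return setpoint + step_size * inferred_differences.sum()

# Example: with three neighbors all slightly more strained/inflated, the
# module raises its own set-point toward theirs.
new_setpoint = proprioceptive_consensus_step(0.2, perimeter_strain=[0.05, 0.08, 0.02])
```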