Title: Data-Driven Disturbance Observers for Estimating External Forces on Soft Robots
Unlike traditional robots, soft robots can intrinsically interact with their environment in a continuous, robust, and safe manner. These abilities, and the new opportunities they open, motivate the development of algorithms that provide reliable information on the nature of environmental interactions and thereby enable soft robots to reason about and properly react to external contact events. However, directly extracting such information with integrated sensors remains an arduous task that is further complicated by the need to also sense the soft robot’s configuration. As an alternative to direct sensing, this paper addresses the challenge of estimating contact forces directly from the robot’s posture. We propose a new technique that merges a nominal disturbance observer, a model-based component, with corrections learned from data. The result is an algorithm that is accurate yet sample efficient, and one that can reliably estimate external contact events with the environment. We prove the convergence of the proposed method analytically, and we demonstrate its performance in simulations and physical experiments.
Award ID(s): 1830901
NSF-PAR ID: 10188443
Author(s) / Creator(s):
Date Published:
Journal Name: IEEE Robotics and Automation Letters
Volume: 5
Issue: 4
Page Range / eLocation ID: 5717-5724
Format(s): Medium: X
Sponsoring Org: National Science Foundation
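The abstract above describes merging a nominal, model-based disturbance observer with corrections learned from data. The following is a minimal sketch of that structure, not the authors' implementation: a discrete-time generalized-momentum observer supplies the nominal estimate, and a separately trained regressor (any model with a scikit-learn-style predict) adds a learned residual. The gain, time step, and feature vector are illustrative assumptions.

```python
# Minimal sketch of a "nominal observer + learned correction" structure of the
# kind the abstract describes; NOT the authors' implementation.  Gains, time
# step, and the feature vector are assumed.
import numpy as np

class NominalMomentumObserver:
    """Estimates a generalized external force d from configuration-level
    measurements, using the model  M(q) q_ddot + h(q, q_dot) = tau + d
    (the M_dot * q_dot term is neglected for brevity)."""

    def __init__(self, n_dof, gain=5.0, dt=0.01):
        self.L = gain * np.eye(n_dof)   # observer gain (assumed value)
        self.dt = dt
        self.d_hat = np.zeros(n_dof)    # nominal disturbance estimate
        self.p_hat = np.zeros(n_dof)    # predicted generalized momentum

    def update(self, M, h, tau, q_dot):
        """M: inertia matrix, h: Coriolis/gravity/stiffness terms,
        tau: commanded actuation, q_dot: measured velocity."""
        p_meas = M @ q_dot                               # measured momentum
        self.p_hat += self.dt * (tau - h + self.d_hat)   # model prediction
        self.d_hat = self.L @ (p_meas - self.p_hat)      # momentum residual
        return self.d_hat

def corrected_estimate(d_nominal, q, q_dot, residual_model):
    """Data-driven correction: a regressor trained offline (on data with known
    contact forces) maps the state and the nominal estimate to a residual that
    compensates model mismatch, e.g. unmodeled soft-body effects."""
    features = np.concatenate([q, q_dot, d_nominal])
    return d_nominal + residual_model.predict(features[None, :])[0]
```

One plausible reading of the abstract's "accurate yet sample efficient" claim, under this structure, is that the learned component only has to capture the model mismatch rather than the full map from posture to force.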
More Like this
  1. Soft robots have shown great potential to enable safe interactions with unknown environments due to their inherent compliance and variable stiffness. However, without knowledge of potential contacts, a soft robot could exhibit rigid behaviors in a goal-reaching task and collide with obstacles. In this paper, we introduce a Sliding Mode Augmented by Reactive Transitioning (SMART) controller to detect contact events, adjust the robot’s desired trajectory, and reject estimated disturbances in a goal-reaching task. We employ a sliding mode controller to track the desired trajectory, a nonlinear disturbance observer (NDOB) to estimate the lumped disturbance, and a switching algorithm to adjust the desired robot trajectories. The proposed controller is validated on a pneumatically driven fabric soft robot whose dynamics are described by a new extended rigid-arm model that fits the actuator design. A stability analysis of the proposed controller is also presented. Experimental results show that, despite modeling uncertainties, the robot can detect obstacles, adjust the reference trajectories to maintain compliance, and recover to track the original desired path once the obstacle is removed. Without force sensors, the proposed model-based controller can adjust the robot’s stiffness based on the estimated disturbance to achieve goal reaching and compliant interaction with unknown obstacles.
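As a rough illustration of the two ingredients named in this abstract, the sketch below shows a standard nonlinear disturbance observer update and a threshold-based reference adjustment on contact. It is assumed, not the paper's code; the gain, threshold, and dynamics terms `M` and `h` are placeholders.

```python
# Assumed sketch: standard NDOB update plus a threshold-based reference switch.
import numpy as np

def ndob_update(z, q_dot, M, h, tau, gain, dt):
    """One Euler step of an NDOB with auxiliary state z, for the model
    M(q) q_ddot + h(q, q_dot) = tau + d:
        p     = gain * M @ q_dot
        z_dot = -gain * (z + p) + gain * (h - tau)
        d_hat = z + p
    For a slowly varying disturbance, d_hat converges to d exponentially."""
    p = gain * (M @ q_dot)
    z = z + dt * (-gain * (z + p) + gain * (h - tau))
    return z + p, z      # (d_hat, updated auxiliary state)

def adjust_reference(q_des, q, d_hat, contact_threshold=2.0):
    """Reactive transition: if the estimated disturbance exceeds a threshold,
    hold the reference at the current configuration (stay compliant instead of
    pushing into the obstacle); otherwise keep tracking the original goal."""
    if np.linalg.norm(d_hat) > contact_threshold:
        return q.copy()
    return q_des
```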
  2. Soft robots can undergo large elastic deformations and adapt to complex shapes. However, they lack the structural strength to withstand external loads due to the intrinsic compliance of fabrication materials (silicone or rubber). In this paper, we present a novel stiffness modulation approach that controls the robot’s stiffness on demand without permanently affecting the intrinsic compliance of the elastomeric body. Inspired by concentric tube robots, this approach uses a Nitinol tube as the backbone, which can be slid in and out of the soft robot body to achieve robot pose or stiffness modulation. To validate the proposed idea, we fabricated a tendon-driven concentric tube (TDCT) soft robot and developed the model based on Cosserat rod theory. The model is validated in different scenarios by varying the joint-space tendon input and task-space external contact force. Experimental results indicate that the model is capable of estimating the shape of the TDCT soft robot with an average root-mean-square error (RMSE) of 0.90 mm (0.56% of total length) and an average tip error of 1.49 mm (0.93% of total length). Simulation studies demonstrate that Nitinol backbone insertion can enhance the kinematic workspace and reduce the compliance of the TDCT soft robot by 57.7%. Two case studies (object manipulation and soft laparoscopic photodynamic therapy) are presented to demonstrate the potential applications of the proposed design.
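For readers unfamiliar with the modeling approach mentioned in this abstract, here is a minimal sketch of a static Cosserat-rod integration of the kind such models build on, together with the RMSE and tip-error metrics used in the validation. It is illustrative only: the stiffness matrices are placeholders, tendon loading and boundary-value solving are omitted, and it is not the authors' implementation.

```python
# Illustrative static Cosserat-rod integration (not the authors' TDCT model).
import numpy as np

def hat(v):
    """Skew-symmetric matrix so that hat(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def integrate_rod(n0, m0, length=0.16, steps=200,
                  Kse=np.diag([1e3, 1e3, 1e4]),      # shear/extension stiffness (assumed)
                  Kbt=np.diag([1e-1, 1e-1, 5e-2])):  # bending/torsion stiffness (assumed)
    """Euler integration of the unloaded static Cosserat rod equations:
       p' = R v,  R' = R hat(u),  n' = 0,  m' = -p' x n,
       with v = e3 + Kse^-1 R^T n and u = Kbt^-1 R^T m."""
    ds = length / steps
    p, R = np.zeros(3), np.eye(3)
    n, m = np.asarray(n0, float), np.asarray(m0, float)
    backbone = [p.copy()]
    for _ in range(steps):
        v = np.array([0.0, 0.0, 1.0]) + np.linalg.solve(Kse, R.T @ n)
        u = np.linalg.solve(Kbt, R.T @ m)
        dp = R @ v
        p = p + ds * dp
        R = R @ (np.eye(3) + ds * hat(u))   # first-order rotation update
        m = m + ds * (-np.cross(dp, n))
        backbone.append(p.copy())
    return np.array(backbone)

def shape_errors(model_pts, measured_pts):
    """Backbone RMSE and tip error, the metrics quoted in the abstract."""
    err = np.linalg.norm(model_pts - measured_pts, axis=1)
    return np.sqrt(np.mean(err ** 2)), err[-1]
```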
  3.
    Soft, tip-extending, pneumatic “vine robots” that grow via eversion are well suited for navigating cluttered environments. Two key mechanisms that add to the robot’s functionality are a tip-mounted retraction device that allows the growth process to be reversed, and a tip-mounted camera that enables vision. However, previous designs used rigid, relatively heavy electromechanical retraction devices and external camera mounts, which reduce some advantages of these robots. These designs prevent the robot from squeezing through tight gaps, make it challenging to lift the robot tip against gravity, and require the robot to drag components against the environment. To address these limitations, we present a soft, pneumatically driven retraction device and an internal camera mount that are both lightweight and smaller than the diameter of the robot. The retraction device is composed of a soft, extending pneumatic actuator and a pair of soft clamping actuators that work together in an inch-worming motion. The camera mount sits inside the robot body and is kept at the tip of the robot by two low-friction interlocking components. We present characterizations of our retraction device and demonstrations that the robot can grow and retract through turns, tight gaps, and sticky environments while transmitting live video from the tip. Our designs advance the ability of everting vine robots to navigate difficult terrain while collecting data. 
  4.
    Soft isoperimetric truss robots have demonstrated an ability to grasp and manipulate objects using the members of their structure. The compliance of the members affords large contact areas with even force distribution, allowing for successful grasping even with imprecise open-loop control. In this work we present methods of analyzing and controlling isoperimetric truss robots in the context of grasping and manipulating objects. We use a direct stiffness model to characterize the structural properties of the robot and its interactions with external objects. With this approach we can estimate grasp forces and stiffnesses with limited computation compared to higher-fidelity finite element methods, which, given the many degrees of freedom of truss robots, are prohibitively expensive to run on board. In conjunction with the structural model, we build on the differential-kinematics literature for truss robots and apply it to the task of manipulating an object within the robot’s workspace.
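A minimal sketch of the direct-stiffness idea referred to in this abstract, under assumed geometry and member stiffness `EA`: each member contributes an axial stiffness block along its unit vector, the blocks are assembled into a global matrix, and contact forces can then be estimated from nodal displacements at low computational cost. Boundary conditions and supports are omitted for brevity.

```python
# Assumed sketch of truss direct-stiffness assembly and force estimation.
import numpy as np

def assemble_stiffness(nodes, members, EA=100.0):
    """nodes: (N, 3) array of node positions; members: list of (i, j) pairs."""
    K = np.zeros((3 * len(nodes), 3 * len(nodes)))
    for i, j in members:
        d = nodes[j] - nodes[i]
        L = np.linalg.norm(d)
        u = d / L
        k = (EA / L) * np.outer(u, u)        # 3x3 axial stiffness block
        for (a, b), sign in [((i, i), +1), ((j, j), +1), ((i, j), -1), ((j, i), -1)]:
            K[3*a:3*a+3, 3*b:3*b+3] += sign * k
    return K

def grasp_force_estimate(K, displacements):
    """Small-deflection force estimate from measured nodal displacements,
    far cheaper than a full finite-element model."""
    return K @ displacements.reshape(-1)
```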
  5. Physical interaction between humans and robots can help robots learn to perform complex tasks. The robot arm gains information by observing how the human kinesthetically guides it throughout the task. While prior works focus on how the robot learns, it is equally important that this learning is transparent to the human teacher. Visual displays that show the robot’s uncertainty can potentially communicate this information; however, we hypothesize that visual feedback mechanisms miss out on the physical connection between the human and robot. In this work we present a soft haptic display that wraps around and conforms to the surface of a robot arm, adding a haptic signal at an existing point of contact without significantly affecting the interaction. We demonstrate how soft actuation creates a salient haptic signal while still allowing flexibility in device mounting. Using a psychophysics experiment, we show that users can accurately distinguish inflation levels of the wrapped display with an average Weber fraction of 11.4%. When we place the wrapped display around the arm of a robotic manipulator, users are able to interpret and leverage the haptic signal in sample robot learning tasks, improving identification of areas where the robot needs more training and enabling the user to provide better demonstrations. See videos of our device and user studies here: https://youtu.be/tX-2Tqeb9Nw 
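For context on the reported 11.4% Weber fraction, the snippet below shows one common way such a value could be computed from discrimination data: interpolate the comparison level at which the proportion of correct responses crosses a chosen threshold, and divide the resulting just-noticeable difference by the reference level. The data, threshold, and units are hypothetical and not taken from the study.

```python
# Hypothetical illustration of computing a Weber fraction from discrimination data.
import numpy as np

def weber_fraction(reference, comparison_levels, p_correct, threshold=0.75):
    """comparison_levels and p_correct must be ordered so p_correct increases."""
    level_at_threshold = np.interp(threshold, p_correct, comparison_levels)
    jnd = abs(level_at_threshold - reference)    # just-noticeable difference
    return jnd / reference

# Example with made-up data (reference inflation level of 10 units):
# weber_fraction(10.0, [10.5, 11.0, 11.5, 12.0], [0.55, 0.70, 0.85, 0.95])
```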