

Title: Pose Measurement and Contact Training of a Fabric-Reinforced Inflatable Soft Robot
This paper proposes a new method to measure the pose of an inflatable soft robot and to localize its contacts with the surrounding environment using optical sensors (photocells), inertial measurement units (IMUs), and a pressure sensor. These affordable sensors reside entirely onboard the robot, making the approach effective in environments where external sensing, such as motion capture, is not feasible. The entire bore of the robot is used as a waveguide to transfer light. As the robot operates, the photocell signals vary with its current shape, and the IMUs measure the orientation of its tip. Analytical functions are developed to relate the photocell signals to the robot pose. Because the soft robot is deformable, contact at any location on its body modifies the sensor signals. This simple measurement approach generates enough information for contact events to be detected and classified with high precision using a machine learning algorithm.
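The signal-to-pose relation and contact detection described in the abstract can be illustrated with a toy model. This is a sketch only: the exponential attenuation model, its constants, and the residual threshold are invented for illustration and are not the paper's analytical functions or its machine-learning classifier.

```python
import numpy as np

def predict_photocell(bend_angle, i0=1.0, k=0.8):
    """Hypothetical attenuation model: light carried through the bore
    decays exponentially as the waveguide bends (constants assumed)."""
    return i0 * np.exp(-k * np.abs(bend_angle))

def detect_contact(measured, bend_angle, threshold=0.05):
    """Flag a contact event when the measured photocell signal departs
    from the free-space prediction for the current pose."""
    residual = measured - predict_photocell(bend_angle)
    return np.abs(residual) > threshold

# Free bending at 0.5 rad matches the model, so no contact is flagged;
# a contact deforms the body and perturbs the photocell signal.
free = predict_photocell(0.5)
touched = free - 0.2
```

In the paper, a learned classifier replaces this fixed threshold and additionally categorizes the contact, rather than only detecting it.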
Award ID(s):
1935312
NSF-PAR ID:
10480904
Author(s) / Creator(s):
Publisher / Repository:
IEEE
Date Published:
Journal Name:
Proceedings of the SICE/IEEE International Symposium on System Integration (SII2023)
ISSN:
2474-2325
Page Range / eLocation ID:
1 to 6
Format(s):
Medium: X
Location:
Atlanta, GA, USA
Sponsoring Org:
National Science Foundation
More Like This
  1. During in-hand manipulation, robots must be able to continuously estimate the pose of the object in order to generate appropriate control actions. The performance of algorithms for pose estimation hinges on the robot's sensors being able to detect discriminative geometric object features, but previous sensing modalities are unable to make such measurements robustly. The robot's fingers can occlude the view of environment- or robot-mounted image sensors, and tactile sensors can only measure at the local areas of contact. Motivated by fingertip-embedded proximity sensors' robustness to occlusion and ability to measure beyond the local areas of contact, we present the first evaluation of proximity sensor based pose estimation for in-hand manipulation. We develop a novel two-fingered hand with fingertip-embedded optical time-of-flight proximity sensors as a testbed for pose estimation during planar in-hand manipulation. Here, the in-hand manipulation task consists of the robot moving a cylindrical object from one end of its workspace to the other. We demonstrate, with statistical significance, that proximity-sensor based pose estimation via particle filtering during in-hand manipulation: a) exhibits 50% lower average pose error than a tactile-sensor based baseline; b) empowers a model predictive controller to achieve 30% lower final positioning error compared to when using tactile-sensor based pose estimates. 
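The particle-filter pose estimation described above can be sketched in one dimension. This is a minimal illustration under assumed values, not the paper's implementation: the object position, fingertip location, and noise parameters are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, measured_dist, finger_x=0.0,
                         motion_std=0.01, meas_std=0.005):
    """One predict/update cycle of a particle filter estimating a
    cylinder's 1-D position from a fingertip time-of-flight range."""
    # Predict: diffuse particles with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # Update: weight by likelihood of the measured proximity distance.
    expected = np.abs(particles - finger_x)
    weights = weights * np.exp(-0.5 * ((measured_dist - expected) / meas_std) ** 2)
    weights = weights / weights.sum()
    # Resample to avoid weight degeneracy.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Object truly sits 0.05 m from the fingertip sensor; the posterior
# mean should converge there after a few cycles.
particles = rng.uniform(0.0, 0.2, 1000)
weights = np.full(1000, 1e-3)
for _ in range(10):
    particles, weights = particle_filter_step(particles, weights, 0.05)
estimate = particles.mean()
```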
The trend toward soft wearable robotic systems creates a compelling need for new and reliable sensor systems that do not require a rigid mounting frame. Despite the growing use of inertial measurement units (IMUs) in motion tracking applications, sensor drift and IMU-to-segment misalignment still represent major problems in applications requiring high accuracy. This paper proposes a novel 2-step calibration method which takes advantage of the periodic nature of human locomotion to improve the accuracy of wearable inertial sensors in measuring lower-limb joint angles. Specifically, the method was applied to the determination of the hip joint angles during walking tasks. The accuracy and precision of the calibration method were assessed in a group of N = 8 subjects who walked with a custom-designed inertial motion capture system at 85% and 115% of their comfortable pace, using an optical motion capture system as reference. In light of its low computational complexity and good accuracy, the proposed approach shows promise for embedded applications, including closed-loop control of soft wearable robotic systems.
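The core idea of exploiting gait periodicity to fight drift can be sketched as follows. This is a simplified stand-in, not the paper's 2-step calibration: it assumes the true joint angle repeats exactly each stride, so any net per-stride change is attributed to drift and removed by a least-squares fit over stride-start samples.

```python
import numpy as np

def remove_periodic_drift(angle, stride_len):
    """Drift correction exploiting gait periodicity: over one full
    stride the true joint angle returns to its starting value, so any
    net change per stride is attributed to sensor drift and removed."""
    n_strides = len(angle) // stride_len
    starts = angle[:n_strides * stride_len:stride_len]
    # Least-squares drift rate (radians per sample) from stride starts.
    t = np.arange(n_strides) * stride_len
    rate = np.polyfit(t, starts, 1)[0]
    return angle - rate * np.arange(len(angle))

# Synthetic periodic hip angle plus a linear gyro-integration drift.
t = np.arange(1000)
true = 0.4 * np.sin(2 * np.pi * t / 100)
drifting = true + 0.001 * t
corrected = remove_periodic_drift(drifting, stride_len=100)
```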
  3.
    Due to their ability to move without sliding relative to their environment, soft growing robots are attractive for deploying distributed sensor networks in confined spaces. Sensing of the state of such robots would add to their capabilities as human-safe, adaptable manipulators. However, incorporation of distributed sensors onto soft growing robots is challenging because it requires an interface between stiff and soft materials, and the sensor network needs to undergo significant strain. In this work, we present a method for adding sensors to soft growing robots that uses flexible printed circuit boards with self-contained units of microcontrollers and sensors encased in a laminate armor that protects them from unsafe curvatures. We demonstrate the ability of this system to relay directional temperature and humidity information in hard-to-access spaces. We also demonstrate and characterize a method for sensing the growing robot shape using inertial measurement units deployed along its length, and develop a mathematical model to predict its accuracy. This work advances the capabilities of soft growing robots, as well as the field of soft robot sensing. 
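Reconstructing the growing robot's shape from IMUs deployed along its length reduces, in the planar case, to chaining segment vectors oriented by each IMU's reading. The sketch below assumes a 2-D robot and known segment lengths; it is an illustration of the geometry, not the paper's model or its accuracy prediction.

```python
import numpy as np

def shape_from_imu_angles(angles, seg_len):
    """Reconstruct a planar robot backbone from per-segment IMU pitch
    angles by chaining unit segment vectors tip to tail from the base."""
    x = np.cumsum(seg_len * np.cos(angles))
    y = np.cumsum(seg_len * np.sin(angles))
    # Prepend the base point at the origin.
    return np.concatenate([[0.0], x]), np.concatenate([[0.0], y])

# A straight horizontal robot: four IMUs all reading zero pitch,
# 0.25 m segments, so the tip should land 1 m out along x.
x, y = shape_from_imu_angles(np.zeros(4), seg_len=0.25)
```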
  4. Abstract Sensing for wearable robots is an ongoing challenge, especially given the recent trend of soft and compliant robots. Recently, a wearable origami exoshell has been designed to sense the user’s torso motion and provide mobility assistance. The materials of the exoshell contribute to a lightweight design with compliant joints, which are ideal characteristics for a wearable device. Common sensors are not ideal for the exoshell as they compromise these design characteristics. Rotary encoders are often rigid metal devices that add considerable weight and compromise the flexibility of the joints. Inertial measurement unit sensors are affected by environments with variable electromagnetic fields and therefore not ideal for wearable applications. Hall effect sensors and gyroscopes are utilized as alternative compatible sensors, which introduce their own set of challenges: noisy measurements and drift due to sensor bias. To mitigate this, we designed the Kinematically Constrained Kalman filter for sensor fusion of gyroscopes and Hall effect sensors, with the goal of estimating the human’s torso and robot joint angles. We augmented the states to consider bias related to the torso angle in order to compensate for drift. The forward kinematics of the robot is incorporated into the Kalman filter as state constraints to address the unobservability of the torso angle and its related bias. The proposed algorithm improved the estimation performance of the torso angle and its bias, compared to the individual sensors and the standard Kalman filter, as demonstrated through bench tests and experiments with a human user. 
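The sensor-fusion idea above, a Kalman filter whose state is augmented with a gyro-bias term so that Hall-effect angle measurements absorb the drift, can be sketched with a standard linear filter. This is illustrative only: the paper's Kinematically Constrained Kalman filter additionally imposes forward-kinematic state constraints, which are omitted here, and all noise parameters are assumed.

```python
import numpy as np

def kf_gyro_hall(gyro_rates, hall_angles, dt=0.01, q=1e-4, r=1e-4):
    """Minimal Kalman filter with state [angle, gyro_bias]: the gyro
    rate drives the prediction, the Hall-effect angle corrects it, and
    the augmented bias state absorbs the gyro drift."""
    A = np.array([[1.0, -dt], [0.0, 1.0]])   # angle += (rate - bias)*dt
    B = np.array([dt, 0.0])
    H = np.array([[1.0, 0.0]])               # Hall sensor observes the angle
    x = np.zeros(2)
    P = np.eye(2)
    Q = q * np.eye(2)
    estimates = []
    for rate, z in zip(gyro_rates, hall_angles):
        x = A @ x + B * rate                  # predict using the gyro input
        P = A @ P @ A.T + Q
        K = P @ H.T / (H @ P @ H.T + r)       # scalar innovation covariance
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates), x[1]

# Synthetic check: constant true rate 0.1 rad/s, constant gyro bias
# 0.05 rad/s, noiseless Hall angle for clarity.
t = np.arange(500) * 0.01
true_angle = 0.1 * t
est, bias_hat = kf_gyro_hall(np.full(500, 0.15), true_angle)
```

The bias state is observable only through the angle dynamics, which is why the filter needs several strides' worth of samples before `bias_hat` settles.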
  5. Across a plethora of social situations, we touch others in natural and intuitive ways to share thoughts and emotions, such as tapping to get one’s attention or caressing to soothe one’s anxiety. A deeper understanding of these human-to-human interactions will require, in part, the precise measurement of skin-to-skin physical contact. Among prior efforts, each measurement approach exhibits certain constraints, e.g., motion trackers do not capture the precise shape of skin surfaces, while pressure sensors impede skin-to-skin contact. In contrast, this work develops an interference-free 3D visual tracking system using a depth camera to measure the contact attributes between the bare hand of a toucher and the forearm of a receiver. The toucher’s hand is tracked as a posed and positioned mesh by fitting a hand model to detected 3D hand joints, whereas a receiver’s forearm is extracted as a 3D surface updated upon repeated skin contact. Based on a contact model involving point clouds, the spatiotemporal changes of hand-to-forearm contact are decomposed into six high-resolution time-series contact attributes, i.e., contact area, indentation depth, absolute velocity, and three orthogonal velocity components, together with contact duration. To examine the system’s capabilities and limitations, two types of experiments were performed. First, to evaluate its ability to discern human touches, one person delivered cued social messages, e.g., happiness, anger, sympathy, to another person using their preferred gestures. The results indicated that messages and gestures, as well as the identities of the touchers, were readily discerned from their contact attributes. Second, the system’s spatiotemporal accuracy was validated against measurements from independent devices, including an electromagnetic motion tracker, sensorized pressure mat, and laser displacement sensor. While validated here in the context of social communication, this system is extendable to human touch interactions such as maternal care of infants and massage therapy.
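Two of the contact attributes above, contact area and indentation depth, can be sketched from a point-cloud contact model. This toy version flattens the forearm to a plane and uses a point count as an area proxy; the thresholds, geometry, and function name are all assumptions for illustration, not the paper's mesh-based model.

```python
import numpy as np

def contact_attributes(hand_pts, surface_z=0.0, touch_tol=0.001):
    """Simplified point-cloud contact model: the forearm is treated as
    the plane z = surface_z; hand points at or below the plane are in
    contact, and the deepest point gives the indentation depth."""
    depth = surface_z - hand_pts[:, 2]      # >0 means below the skin plane
    in_contact = depth > -touch_tol
    n_contact = int(in_contact.sum())       # contact-area proxy (point count)
    indentation = float(depth.max()) if n_contact else 0.0
    return n_contact, indentation

# Three hand points (meters): one hovering above the skin, two pressing
# into it at 3 mm and 5 mm depth.
hand = np.array([[0.0, 0.0, 0.01],
                 [0.0, 0.01, -0.003],
                 [0.01, 0.0, -0.005]])
n_contact, indentation = contact_attributes(hand)
```

Velocity attributes would follow by differencing these quantities across frames; contact duration is the run length of frames with `n_contact > 0`.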