

Title: Collaborative Suturing: A Reinforcement Learning Approach to Automate Hand-off Task in Suturing for Surgical Robots
Over the past decade, Robot-Assisted Surgery (RAS) has become increasingly prevalent in facilitating successful operations. Within RAS, collaborative surgery has gained traction in medical research. Prominent examples include providing haptic feedback to sense tissue consistency, and automating surgical sub-tasks such as cutting or needle hand-off, i.e., pulling and reorienting the needle after insertion during suturing. By fragmenting suturing into automated and manual tasks, the surgeon can control the process with one hand and circumvent the workspace restrictions imposed by the surgeon-side control interface during the operation. This paper presents an exploration of a discrete reinforcement learning-based approach to automating the needle hand-off task. Users were asked to perform a simple running suture using the da Vinci Research Kit. The user trajectory was learnt by generating a sparse reward function and deriving an optimal policy using Q-learning. Trajectories obtained from three learnt policies were compared to the user-defined trajectory, yielding root-mean-square errors of [0.0044 mm, 0.0027 mm, 0.0020 mm] in ℝ³. Additional trajectories from varying initial positions were produced from a single policy to simulate repeated passes of the hand-off task.
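The Q-learning formulation summarized above can be sketched in miniature. The sketch below is purely illustrative, not the paper's implementation: states discretize a 1-D corridor standing in for the recorded hand-off trajectory, the reward is sparse (nonzero only at the final waypoint), and all sizes and hyperparameters are hypothetical.

```python
import numpy as np

# Tabular Q-learning with a sparse reward at the trajectory's final waypoint.
N_STATES, GOAL = 20, 19
ACTIONS = [-1, +1]                      # step away from / toward the goal
alpha, gamma, eps = 0.1, 0.95, 0.1
rng = np.random.default_rng(0)
Q = np.zeros((N_STATES, len(ACTIONS)))

def greedy(q):
    # Break ties randomly so the agent explores before values propagate
    return int(rng.choice(np.flatnonzero(q == q.max())))

for _ in range(2000):                   # training episodes
    s = 0
    while s != GOAL:
        a = rng.integers(len(ACTIONS)) if rng.random() < eps else greedy(Q[s])
        s_next = int(np.clip(s + ACTIONS[a], 0, N_STATES - 1))
        r = 1.0 if s_next == GOAL else 0.0          # sparse reward
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

# Greedy rollout of the learnt policy reproduces the discretized path
path = [0]
while path[-1] != GOAL and len(path) < 100:
    path.append(int(np.clip(path[-1] + ACTIONS[greedy(Q[path[-1]])], 0, N_STATES - 1)))
```

With the sparse reward, early episodes amount to exploration; once the goal reward propagates back through the table, the greedy policy walks the corridor directly, analogous to replaying the learnt hand-off trajectory from a given start state.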
Award ID(s):
1637759 1927275
NSF-PAR ID:
10207764
Journal Name:
2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)
Page Range / eLocation ID:
1380 to 1386
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    An important problem in designing human-robot systems is the integration of human intent and performance into the robotic control loop, especially in complex tasks. Bimanual coordination is a complex human behavior that is critical in many fine motor tasks, including robot-assisted surgery. To fully leverage the capabilities of the robot as an intelligent and assistive agent, online recognition of bimanual coordination could be important. Robotic assistance for a suturing task, for example, will be fundamentally different during phases when the suture is wrapped around the instrument (i.e., making a c-loop) than when the ends of the suture are pulled apart. In this study, we develop an online recognition method for bimanual coordination modes (i.e., the directions and symmetries of right- and left-hand movements) using geometric descriptors of hand motion. We (1) develop this framework based on ideal trajectories obtained during virtual 2D bimanual path-following tasks performed by human subjects operating Geomagic Touch haptic devices, (2) test the offline recognition accuracy of bimanual direction and symmetry from human subject movement trials, and (3) evaluate how the framework can be used to characterize 3D trajectories of the da Vinci Surgical System's surgeon-side manipulators during bimanual surgical training tasks. In the human subject trials, our geometric bimanual movement classification accuracy was 92.3% for movement direction (i.e., hands moving together, parallel, or away) and 86.0% for symmetry (e.g., mirror or point symmetry). We also show that this approach can be used for online classification of different bimanual coordination modes during needle transfer, c-loop making, and suture pulling gestures on the da Vinci system, with results matching the expected modes.
Finally, we discuss how these online estimates are sensitive to task environment factors and surgeon expertise, and thus inspire future work that could leverage adaptive control strategies to enhance user skill during robot-assisted surgery. 
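The abstract does not spell out its geometric descriptors, but one simple, hypothetical descriptor for the direction classes (together, parallel, away) is the rate of change of the inter-hand distance; the threshold below is illustrative, not the paper's:

```python
import numpy as np

# Classify bimanual direction from one time step of left/right hand
# positions and velocities, via d/dt of the inter-hand distance.
def bimanual_direction(p_left, p_right, v_left, v_right, thresh=0.1):
    p_left, p_right = np.asarray(p_left, float), np.asarray(p_right, float)
    v_left, v_right = np.asarray(v_left, float), np.asarray(v_right, float)
    rel = p_right - p_left
    # d/dt ||p_right - p_left|| = rel . (v_right - v_left) / ||rel||
    d_dist = float(np.dot(rel, v_right - v_left) / np.linalg.norm(rel))
    if abs(d_dist) < thresh:
        return "parallel"       # inter-hand distance roughly constant
    return "together" if d_dist < 0 else "away"

print(bimanual_direction([0, 0, 0], [1, 0, 0], [1, 0, 0], [-1, 0, 0]))  # together
```

Run per time step over a trajectory, such a descriptor yields the kind of online mode stream the study classifies.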
  2. Despite significant developments in the design of surgical robots and automated techniques for objective evaluation of surgical skills, there are still challenges in ensuring safety in robot-assisted minimally invasive surgery (RMIS). This paper presents a runtime monitoring system for the detection of executional errors during surgical tasks through the analysis of kinematic data. The proposed system incorporates dual Siamese neural networks and knowledge of surgical context, including surgical tasks and gestures, their distributional similarities, and common error modes, to learn the differences between normal and erroneous surgical trajectories from small training datasets. We evaluate the performance of error detection using Siamese networks compared to single CNN and LSTM networks trained with different levels of contextual knowledge and training data, using the dry-lab demonstrations of the Suturing and Needle Passing tasks from the JIGSAWS dataset. Our results show that gesture-specific, task-nonspecific Siamese networks obtain micro F1 scores of 0.94 (Siamese-CNN) and 0.95 (Siamese-LSTM), and perform better than single CNN (0.86) and LSTM (0.87) networks. These Siamese networks also outperform gesture-nonspecific, task-specific Siamese-CNN and Siamese-LSTM models for Suturing and Needle Passing.
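As a rough illustration of the Siamese idea, a trajectory is flagged as erroneous when its distance in an embedding space from known-normal references exceeds a threshold. The hand-crafted features below are a hypothetical stand-in for the learned CNN/LSTM twin embeddings, and the threshold is arbitrary:

```python
import numpy as np

# Fixed kinematic "embedding" standing in for a learned Siamese branch.
def embed(traj):
    traj = np.asarray(traj, float)              # (T, 3) tool-tip positions
    steps = np.diff(traj, axis=0)
    path_len = float(np.linalg.norm(steps, axis=1).sum())
    jerkiness = float(np.abs(np.diff(steps, axis=0)).mean())
    return np.array([path_len, jerkiness])

def is_erroneous(test_traj, reference_trajs, threshold=1.0):
    # Erroneous if far from *every* normal reference demonstration
    dists = [np.linalg.norm(embed(test_traj) - embed(r)) for r in reference_trajs]
    return min(dists) > threshold
```

The learned version replaces `embed` with twin networks trained so that normal-normal pairs land close and normal-erroneous pairs land far apart, which is what lets it generalize from small training datasets.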
  3. Abstract Background

    Analysing kinematic and video data can help identify potentially erroneous motions that lead to sub‐optimal surgeon performance and safety‐critical events in robot‐assisted surgery.

    Methods

    We develop a rubric for identifying task and gesture‐specific executional and procedural errors and evaluate dry‐lab demonstrations of suturing and needle passing tasks from the JIGSAWS dataset. We characterise erroneous parts of demonstrations by labelling video data, and use distribution similarity analysis and trajectory averaging on kinematic data to identify parameters that distinguish erroneous gestures.

    Results

    Executional error frequency varies by task and gesture, and correlates with skill level. Some predominant error modes in each gesture are distinguishable by analysing error‐specific kinematic parameters. Procedural errors could lead to lower performance scores and increased demonstration times but also depend on surgical style.

    Conclusions

    This study provides insights into context‐dependent errors that can be used to design automated error detection mechanisms and improve training and skill assessment.

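The distribution similarity analysis mentioned in the Methods can be illustrated with one kinematic parameter. The data and parameter choice below are synthetic, and the paper's exact similarity measure may differ; the sketch compares empirical distributions of instrument speed between normal and erroneous gesture demonstrations:

```python
import numpy as np

def wasserstein_1d(a, b):
    # With equal sample counts, the 1-D W1 (earth mover's) distance is the
    # mean absolute gap between sorted samples (a quantile-matching argument).
    a, b = np.sort(np.asarray(a, float)), np.sort(np.asarray(b, float))
    assert len(a) == len(b)
    return float(np.mean(np.abs(a - b)))

rng = np.random.default_rng(1)
speed_normal = rng.normal(1.0, 0.1, 500)   # synthetic speeds, normal gestures
speed_error = rng.normal(1.6, 0.3, 500)    # erroneous: faster and noisier
w = wasserstein_1d(speed_normal, speed_error)
```

A parameter whose normal and erroneous distributions are far apart under such a measure is a candidate for distinguishing error modes in that gesture.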
  4. Madden, John D.; Anderson, Iain A.; Shea, Herbert R. (Eds.)
    Ras Labs makes Synthetic Muscle™, a class of electroactive polymer (EAP) based materials and actuators that sense pressure (from gentle touch to high impact), controllably contract and expand at low voltage (1.5 V to 50 V, including operation from batteries), and attenuate force. We are in the robotics era, but robots still face challenges. Currently, robotic sensing is mainly visual, which is useful only up to the point of contact. To understand how an object is being gripped, tactile feedback is needed. When handling fragile objects, if the grip is too tight, breakage occurs; if the grip is too loose, the object slips out of the grasp, also leading to breakage. Rigid robotic grippers using a visual feedback loop can struggle to determine the exact point and quality of contact, and can also develop a stuttering effect in that loop. By using soft Synthetic Muscle™ based EAP pads as the sensors, immediate feedback was generated at the first point of contact. Because these pads provided a soft, compliant interface, the first point of contact did not apply excessive force, allowing the force applied to the object to be controlled. The EAP sensor could also detect a change in pressure location on its surface, making it possible to detect and prevent slippage by adjusting the grip strength. In other words, directional glide provided feedback on possible slippage, enabling a slightly tighter grip, without stutter, owing to both the feedback and the soft gentleness of the fingertip-like EAP pads themselves. The soft nature of the EAP fingertip pad also naturally held the gripped object, improving gripping quality over rigid grippers without an increase in applied force. Analogous to finger-like tactile touch, the EAPs with appropriate coatings and electronics were positioned as pressure sensors in the fingertip or end-effector regions of robotic grippers.
    This development of Synthetic Muscle™ based EAPs as soft sensors produced sensors that feel like the pads of human fingertips. Basic pressure position and magnitude tests have been successful, with pressure sensitivity down to 0.05 N. Most automation and robots are very strong and very fast, and usually need to be partitioned away from humans for safety reasons. For many repetitive tasks that humans perform with delicate or fragile objects, it would be beneficial to use robotics, whether for agriculture, medical surgery, therapeutic or personal care, or extreme environments that humans cannot enter, including those with contagions that have no cure. Synthetic Muscle™ was also retrofitted as actuator systems into off-the-shelf robotic grippers and is being considered in novel biomimetic gripper designs, operating at low voltages (less than 50 V). This offers biomimetic movement by contracting like human muscle, but also exceeds natural biological capabilities by expanding under reversed electric polarity. Human grasp is gentle yet firm, with tactile touch feedback. In conjunction with their shape-morphing abilities, these EAPs are also being explored to intrinsically sense pressure, due to the correlation between the mechanical force applied to the EAP and its electronic signature. The robotic field is experiencing phenomenal growth in this fourth phase of the industrial revolution, the robotics era. The combination of Ras Labs' EAP shape-morphing and sensing features promises robotic grippers with human hand-like control and tactile sensing. This work is expected to advance both robotics and prosthetics, particularly collaborative robotics, allowing humans and robots to work together intuitively, safely, and effectively.
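The slip-prevention loop described above (pressure-centroid drift on the pad indicating glide, answered by a slightly tighter grip) can be sketched as a simple proportional rule. All names, thresholds, and gains below are hypothetical, not Ras Labs' implementation:

```python
# Tighten the grip when the pressure centroid on the EAP pad drifts,
# which indicates the object is starting to glide/slip.
def update_grip(force, centroid_prev, centroid_now,
                slip_thresh=0.5, gain=0.2, force_max=10.0):
    drift = abs(centroid_now - centroid_prev)   # centroid motion, mm
    if drift > slip_thresh:                     # glide detected
        force = min(force + gain * drift, force_max)  # proportional tighten
    return force

f = 2.0
f = update_grip(f, 10.0, 10.1)   # negligible drift: force unchanged
f = update_grip(f, 10.1, 12.1)   # 2 mm glide: grip tightened
```

Because the correction is proportional and capped, the grip tightens only as much as the glide demands, matching the "slightly tighter grip, without stutter" behavior described in the abstract.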
  5. Advancements in robot-assisted surgery have been growing rapidly for over two decades. More recently, the automation of robotic surgical tasks has become a focus of research. In this area, the detection and tracking of a surgical tool are crucial for an autonomous system to plan and perform a procedure. For example, knowing the position and posture of a needle is a prerequisite for an automatic suturing system to grasp it and perform suturing tasks. In this paper, we propose a novel method, based on deep learning and point-to-point registration, to track the 6-degree-of-freedom (DOF) pose of a metal suture needle from a robotic endoscope (an Endoscopic Camera Manipulator from the da Vinci Robotic Surgical Systems), without the help of any marker. The proposed approach was implemented and evaluated in a standard simulated surgical environment provided by the 2021-2022 AccelNet Surgical Robotics Challenge, demonstrating the potential to be translated into a real-world scenario. A customized dataset containing 836 images collected from the simulated scene, with ground-truth pose and key-point information, was constructed to train the neural network model. The best pipeline achieved an average position error of 1.76 mm and an average orientation error of 8.55 degrees, and it can run at up to 10 Hz on a PC.
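The point-to-point registration step in a pipeline like this is commonly solved with the Kabsch/SVD method: given key points on the needle model matched to their detected 3-D positions, recover the rigid transform. The sketch below is a hedged illustration of that generic step, not the paper's detection network or exact pipeline:

```python
import numpy as np

# Kabsch algorithm: least-squares rigid transform between matched point sets,
# so that detected ≈ R @ model + t.
def kabsch(model_pts, detected_pts):
    P, Q = np.asarray(model_pts, float), np.asarray(detected_pts, float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

With exact correspondences the recovery is closed-form and noise-free; in practice the detected key points come from the learned detector, so the registration averages out per-point error.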