

Title: An Open-Source Framework for Rapid Development of Interactive Soft-Body Simulations for Real-Time Training
We present an open-source framework that provides a low barrier to entry for real-time simulation, visualization, and interactive manipulation of user-specifiable soft bodies, environments, and robots (via a human-readable front-end interface). The simulated soft bodies can be interacted with through a variety of input devices, including commercially available haptic devices, game controllers, and the Master Tele-Manipulators (MTMs) of the da Vinci Research Kit (dVRK), with real-time haptic feedback. We propose this framework for carrying out multi-user training, user studies, and improving control strategies for manipulation problems. In this paper, we present the challenges associated with the development of such a framework and our proposed solutions. We also demonstrate the performance of the framework with examples of soft-body manipulation and interaction with various input devices.
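As a rough illustration of what such a human-readable front-end description could look like, the Python sketch below assembles a soft-body and robot specification as a plain dictionary and serializes it to YAML. Every field name in it (soft_bodies, stiffness, input_device, and so on) is a hypothetical placeholder for illustration, not the framework's actual schema.

```python
# A minimal, hypothetical scene description: soft body + robot + input device.
# Field names below are illustrative placeholders, not the framework's schema.
import yaml  # pip install pyyaml

scene = {
    "soft_bodies": [
        {
            "name": "tissue_flap",
            "mesh": "meshes/tissue_flap.obj",  # hypothetical mesh path
            "mass": 0.05,                      # kg
            "stiffness": 0.2,                  # normalized material stiffness
            "damping": 0.01,
            "fixed_nodes": [0, 1, 2, 3],       # anchored mesh vertices
        }
    ],
    "robots": [
        {
            "name": "psm_tool",
            "base_pose": {"position": [0.0, 0.0, 0.1], "rpy": [0.0, 0.0, 0.0]},
            "input_device": "dvrk_mtm_right",  # could also be a haptic device or game controller
        }
    ],
}

# Write the description to a human-readable YAML file and echo it back.
with open("scene.yaml", "w") as f:
    yaml.safe_dump(scene, f, sort_keys=False)
print(open("scene.yaml").read())
```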
Award ID(s):
1637759 1927275
NSF-PAR ID:
10356550
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
2020 IEEE International Conference on Robotics and Automation (ICRA)
Page Range / eLocation ID:
6544 to 6550
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Surgical robots for laparoscopy consist of several patient-side slave manipulators that are controlled via surgeon-operated master telemanipulators. Commercial surgical robots do not autonomously perform any sub-tasks, even those of a repetitive or noninvasive nature, nor do they provide intelligent assistance. While this is primarily due to safety and regulatory reasons, the state of such automation intelligence also lacks the reliability and robustness required for high-risk applications. Recent developments in continuous control using Artificial Intelligence and Reinforcement Learning have prompted growing research interest in automating mundane sub-tasks. To build on this, we present an Asynchronous Framework that incorporates real-time dynamic simulation (manipulable with the masters of a surgical robot and various other input devices) and interfaces with learning agents to train and potentially allow for the execution of shared sub-tasks. The scope of this framework is generic enough to cater to various surgical (as well as non-surgical) training and control applications. This scope is demonstrated by examples of multi-user and multi-manual applications that allow for realistic interactions by incorporating distributed control, shared task allocation, and a well-defined communication pipeline for learning agents. These examples are discussed in conjunction with the design philosophy, specifications, system architecture, and metrics of the Asynchronous Framework and the accompanying Simulator. We show that the Simulator remains stable while achieving real-time dynamic simulation and interfacing with several haptic input devices and a training agent at the same time.
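    To make the learning-agent interface concrete, the sketch below shows one way an agent could talk to an asynchronously running simulator through a gym-style reset/step wrapper. The SimClient and SoftBodyTaskEnv classes and all of their methods are hypothetical stand-ins, not the framework's actual API.

```python
# Hypothetical gym-style wrapper around an asynchronously running simulator.
# SimClient stands in for the framework's communication pipeline; the real
# interface and message types would be defined by the framework itself.
import time
import numpy as np


class SimClient:
    """Stand-in for the simulator's communication pipeline (assumed interface)."""

    def send_command(self, joint_cmd):
        pass  # would asynchronously publish the command to the simulator

    def read_state(self):
        # would return the latest state published by the simulator
        return np.zeros(7), np.zeros(7)  # joint positions, joint velocities


class SoftBodyTaskEnv:
    """Gym-like environment exposing reset/step to a learning agent."""

    def __init__(self, client, control_hz=50):
        self.client = client
        self.dt = 1.0 / control_hz

    def reset(self):
        q, dq = self.client.read_state()
        return np.concatenate([q, dq])

    def step(self, action):
        self.client.send_command(action)
        time.sleep(self.dt)  # the simulator keeps running asynchronously
        q, dq = self.client.read_state()
        obs = np.concatenate([q, dq])
        reward = -float(np.linalg.norm(q))  # placeholder task reward
        return obs, reward, False, {}


if __name__ == "__main__":
    env = SoftBodyTaskEnv(SimClient())
    obs = env.reset()
    for _ in range(10):
        obs, reward, done, info = env.step(np.zeros(7))
```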
  2. Vibration is ubiquitous as a mode of haptic communication and is used widely in handheld devices to convey events and notifications. The miniaturization of the electromechanical actuators used to generate these vibrations has enabled designers to embed such actuators in wearable devices, conveying vibration at the wrist and other locations on the body. However, the rigid housings of these actuators mean that such wearables cannot be fully soft and compliant at the interface with the user. Fluidic textile-based wearables offer an alternative mechanism for haptic feedback in a fabric-like form factor. To our knowledge, fluidically driven vibrotactile feedback has not been demonstrated in a wearable device without the use of valves, which can only enable low-frequency vibration cues and detract from wearability due to their rigid structure. We introduce a soft vibrotactile wearable, made of textile and elastomer, capable of rendering high-frequency vibration. We describe our design and fabrication methods and the mechanism of vibration, which is realized by controlling inlet pressure and harnessing a mechanical hysteresis. We demonstrate that the frequency and amplitude of vibration produced by our device can be varied based on changes in the input pressure, with 0.3 to 1.4 bar producing vibrations that range between 160 and 260 Hz at 13 to 38 g (where g is the acceleration due to gravity). Our design allows for controllable vibrotactile feedback that is comparable in frequency to, and greater in amplitude than, that of electromechanical actuators, yet has the compliance and conformity of fully soft wearable devices.
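    A minimal sketch of the reported pressure-to-vibration relationship follows, linearly interpolating frequency and amplitude across the 0.3 to 1.4 bar range quoted above; the linear form is an assumption made here for illustration, since the actual device response is characterized experimentally.

```python
# Illustrative pressure-to-vibration mapping over the ranges reported in the
# abstract: 0.3-1.4 bar inlet pressure -> 160-260 Hz at 13-38 g.
# The linear interpolation is an assumption made here for illustration only.
P_MIN, P_MAX = 0.3, 1.4      # inlet pressure range [bar]
F_MIN, F_MAX = 160.0, 260.0  # vibration frequency range [Hz]
A_MIN, A_MAX = 13.0, 38.0    # acceleration amplitude range [g]


def vibration_from_pressure(p_bar):
    """Return (frequency_hz, amplitude_g) for a given inlet pressure in bar."""
    p = min(max(p_bar, P_MIN), P_MAX)   # clamp to the reported operating range
    t = (p - P_MIN) / (P_MAX - P_MIN)   # normalized pressure in [0, 1]
    return F_MIN + t * (F_MAX - F_MIN), A_MIN + t * (A_MAX - A_MIN)


for p in (0.3, 0.8, 1.4):
    freq, amp = vibration_from_pressure(p)
    print(f"{p:.1f} bar -> {freq:.0f} Hz, {amp:.0f} g")
```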
  3. Haptic interfaces can be used to add sensations of touch to virtual and augmented reality experiences. Soft, flexible devices that deliver spatiotemporal patterns of touch across the body, potentially with full-body coverage, are of particular interest for a range of applications in medicine, sports and gaming. Here we report a wireless haptic interface of this type, with the ability to display vibro-tactile patterns across large areas of the skin in single units or through a wirelessly coordinated collection of them. The lightweight and flexible designs of these systems incorporate arrays of vibro-haptic actuators at a density of 0.73 actuators per square centimetre, which exceeds the two-point discrimination threshold for mechanical sensation on the skin across nearly all the regions of the body except the hands and face. A range of vibrant sensations and information content can be passed to mechanoreceptors in the skin via time-dependent patterns and amplitudes of actuation controlled through the pressure-sensitive touchscreens of smart devices, in real-time with negligible latency. We show that this technology can be used to convey navigation instructions, to translate musical tracks into tactile patterns and to support sensory replacement feedback for the control of robotic prosthetics. 
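    As a quick sanity check on the density claim, the sketch below converts 0.73 actuators per square centimetre into an inter-actuator spacing and compares it against assumed, textbook-style two-point discrimination distances (which are not taken from the abstract).

```python
# Back-of-the-envelope check of the actuator-density claim: 0.73 actuators
# per square centimetre on a square grid implies a spacing of 1/sqrt(0.73) cm.
# The two-point discrimination distances below are typical textbook values,
# assumed here purely for illustration (they are not from the abstract).
import math

density = 0.73                           # actuators per cm^2
spacing_cm = 1.0 / math.sqrt(density)    # ~1.17 cm between actuators
print(f"inter-actuator spacing ~ {spacing_cm:.2f} cm")

assumed_two_point_cm = {"fingertip": 0.3, "palm": 1.0, "forearm": 3.5, "back": 4.0}
for region, threshold in assumed_two_point_cm.items():
    print(f"{region:9s}: threshold {threshold:.1f} cm -> grid finer? {spacing_cm < threshold}")
```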
  4. Handheld kinesthetic haptic interfaces can provide greater mobility and richer tactile information as compared to traditional grounded devices. In this paper, we introduce a new handheld haptic interface which takes input using bidirectional coupled finger flexion. We present the device design motivation and design details and experimentally evaluate its performance in terms of transparency and rendering bandwidth using a handheld prototype device. In addition, we assess the device's functional performance through a user study comparing the proposed device to a commonly used grounded input device in a set of targeting and tracking tasks. 
  5. Since the modern concepts for virtual and augmented reality were first introduced in the 1960s, the field has strived to develop technologies for an immersive user experience in a fully or partially virtual environment. Despite great progress in visual and auditory technologies, haptics has seen much slower technological advances. The challenge arises because the skin has densely packed mechanoreceptors distributed over a very large area with complex topography; devising an apparatus as targeted as an audio speaker or a television for the localized sensory input of an ear canal or iris is more difficult. Furthermore, the soft and sensitive nature of the skin makes it difficult to apply solid-state electronic solutions that can address large areas without causing discomfort. The maturing field of soft robotics offers potential solutions to this challenge. In this article, the definition and history of virtual reality (VR) and augmented reality (AR) are first reviewed. Then an overview of haptic output and input technologies is presented, opportunities for soft robotics are identified, and mechanisms of intrinsically soft actuators and sensors are introduced. Finally, soft haptic output and input devices are reviewed, categorized by device form, and examples of soft haptic devices in VR/AR environments are presented.

     