
Award ID contains: 2102250


  1. Free, publicly-accessible full text available April 19, 2024
  2. Training for robotic surgery can be challenging due to the complexity of the technology, as well as the high demand for robotic systems, which must be used primarily for clinical care. While robotic surgical skills are traditionally trained using the robotic hardware coupled with physical simulated tissue models and test-beds, there has been increasing interest in using virtual reality simulators. Use of virtual reality (VR) comes with some advantages, such as the ability to record and track metrics associated with learning. However, skill transfer from virtual environments to physical robotic tasks has yet to be fully demonstrated. In this work, we evaluate the effect of virtual reality pre-training on performance during a standardized robotic dry-lab training curriculum, in which trainees perform a set of tasks and are evaluated with a score based on completion time and errors made during the task. Results show that VR pre-training has a weakly significant effect ([Formula: see text]) in reducing the number of repetitions required to achieve proficiency on the robotic task; however, it does not significantly improve performance on any individual robotic task. This suggests that important skills are learned during physical training with the surgical robotic system that cannot yet be replaced with VR training.
  3. Visualization of hand movement is an important part of many haptic experiences. While some existing simulators allow for hand kinematic visualization using a generic hand model, they target robotic grasp planning rather than haptic applications. We aim to fill this gap with cHand, an extension of the haptics software library CHAI3D, which augments it with built-in hand kinematic visualization capabilities. A representation of the hand can be achieved with elementary geometric elements, or with custom geometries loaded from STL files. A live data visualization demo is included, which can be used as a template for other applications. We release cHand as an open-source contribution, in keeping with the open-source nature of CHAI3D.