Title: Evaluation of Interventional Planning Software Features for MR-guided Transrectal Prostate Biopsies
This work presents interventional planning software to be used in conjunction with a robotic manipulator to perform transrectal MR-guided prostate biopsies. The software was designed taking into consideration a generic manipulator used under two modes of operation: side-firing and end-firing of the biopsy needle. Studies were conducted with urologists using the software to plan virtual biopsies. The results identify software features relevant to operating efficiently under the two modes of operation.
Award ID(s):
1646566
NSF-PAR ID:
10223621
Author(s) / Creator(s):
Date Published:
Journal Name:
IEEE 20th International Conference on Bioinformatics and Bioengineering (BIBE)
Page Range / eLocation ID:
951 to 954
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract Background

    This study presents user evaluation studies assessing the effect of information rendered by interventional planning software on the operator's ability to plan transrectal magnetic resonance (MR)‐guided prostate biopsies using actuated robotic manipulators.

    Methods

    Intervention planning software was developed based on the clinical workflow followed for MR‐guided transrectal prostate biopsies. The software was designed to interface with a generic virtual manipulator and to simulate an intervention environment using 2D and 3D scenes. User studies were conducted with urologists who used the developed software to plan virtual biopsies.

    Results

    User studies demonstrated that urologists with prior experience using 3D software completed the planning in less time. 3D scenes were required to control all degrees‐of‐freedom of the manipulator, whereas 2D scenes were sufficient for planar motion of the manipulator.

    Conclusions

    The study provides insights, from a urologist's perspective, into using 2D versus 3D environments for the different operational modes of MR‐guided prostate biopsy systems.

     
  2. Prostate biopsy is considered a definitive way of diagnosing prostate malignancies. Urologists are currently moving from conventional transrectal ultrasound-guided biopsies toward MR-guided prostate biopsies for prostate cancer detection. Recently, robotic systems have started to emerge as assistance tools for urologists performing MR-guided prostate biopsies. However, these robotic assistance systems are designed for a specific clinical environment and cannot be adapted to changes in the clinical setting and/or workflow. This work presents the preliminary design of a cable-driven manipulator developed to be used in both MR scanners and MR-ultrasound fusion systems. The proposed manipulator design and functionality are evaluated in a simulated virtual environment, created with in-house interventional planning software, to assess ergonomics and usability. The results show that urologists can benefit from the proposed manipulator design and planning software to accurately perform biopsies of targeted areas in the prostate.
  3. Abstract Background

    User interfaces play a vital role in the planning and execution of an interventional procedure. The objective of this study is to investigate the effect of using different user interfaces for planning transrectal robot‐assisted MR‐guided prostate biopsy (MRgPBx) in an augmented reality (AR) environment.

    Method

    End‐user studies were conducted by simulating an MRgPBx system with end‐ and side‐firing modes. The information from the system to the operator was rendered on HoloLens as an output interface. Joystick, mouse/keyboard, and holographic menus were used as input interfaces to the system.

    Results

    The studies indicated that using a joystick improved the interactive capacity and enabled the operator to plan MRgPBx in less time. It efficiently captured the operator's commands to manipulate the augmented environment representing the state of the MRgPBx system.

    Conclusions

    The study demonstrates an alternative to conventional input interfaces to interact and manipulate an AR environment within the context of MRgPBx planning.

     
  4. The emerging potential of augmented reality (AR) to improve 3D medical image visualization for diagnosis, by immersing the user in 3D morphology, is further enhanced by the advent of wireless head-mounted displays (HMDs). Such information-immersive capabilities may also enhance the planning and visualization of interventional procedures. To this end, we introduce a computational platform to generate an augmented reality holographic scene that fuses pre-operative magnetic resonance imaging (MRI) sets, segmented anatomical structures, and an actuated model of an interventional robot for performing MRI-guided and robot-assisted interventions. The interface enables the operator to manipulate the presented images and rendered structures using voice and gestures, as well as to control the robot. The software uses forbidden-region virtual fixtures that alert the operator to collisions with vital structures. The platform was tested in silico with a HoloLens HMD. To address the limited computational power of the HMD, we deployed the platform on a desktop PC with two-way communication to the HMD. Operation studies demonstrated the functionality and underscored the importance of customizing the interface to a particular operator and/or procedure, as well as the need for on-site studies to assess its merit in the clinical realm.
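The core of a forbidden-region virtual fixture is a proximity test between the planned tool position and protected anatomy. The abstract does not describe the authors' implementation; the following is a minimal sketch under the assumption that vital structures are approximated by bounding spheres, with all names (`violates_forbidden_region`, the `regions` layout) being hypothetical illustrations rather than the platform's actual API.

```python
# Hypothetical sketch of a forbidden-region virtual fixture check.
# Assumption (not from the paper): each vital structure is approximated
# by a bounding sphere in scanner coordinates.
import math

def violates_forbidden_region(tip, regions, margin=0.0):
    """Return the name of the first forbidden region the needle tip
    intrudes on, or None if the planned position is clear.

    tip     -- (x, y, z) planned needle-tip position
    regions -- list of dicts: {"name": str, "center": (x, y, z), "radius": float}
    margin  -- extra safety clearance added to each region's radius
    """
    for region in regions:
        # Euclidean distance from the tip to the structure's center.
        dist = math.dist(tip, region["center"])
        if dist < region["radius"] + margin:
            return region["name"]
    return None

# Example: alert when the planned tip enters a 3 mm safety sphere.
regions = [{"name": "urethra", "center": (0.0, 0.0, 0.0), "radius": 3.0}]
print(violates_forbidden_region((1.0, 1.0, 1.0), regions))   # inside -> "urethra"
print(violates_forbidden_region((10.0, 0.0, 0.0), regions))  # clear -> None
```

In an interactive scene, a check like this would run on every pose update so the alert (visual or haptic) fires before the plan is committed; richer implementations would use meshes or signed distance fields rather than spheres.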