Title: "Interactive and Immersive Image-guided Control of Interventional Manipulators with a Prototype Holographic Interface", IEEE International Conference on Bioinformatics and Bioengineering (BIBE), October 28-30, 2019, Athens, Greece
The emerging potential of augmented reality (AR) to improve 3D medical image visualization for diagnosis, by immersing the user into the 3D morphology, is further enhanced by the advent of wireless head-mounted displays (HMDs). Such information-immersive capabilities may also enhance the planning and visualization of interventional procedures. To this end, we introduce a computational platform that generates an augmented reality holographic scene fusing pre-operative magnetic resonance imaging (MRI) sets, segmented anatomical structures, and an actuated model of an interventional robot, for performing MRI-guided and robot-assisted interventions. The interface enables the operator to manipulate the presented images and rendered structures using voice and gestures, as well as to control the robot. The software uses forbidden-region virtual fixtures that alert the operator to collisions with vital structures. The platform was tested in silico with a HoloLens HMD. To address the limited computational power of the HMD, we deployed the platform on a desktop PC with two-way communication to the HMD. Operation studies demonstrated the platform's functionality and underscored the importance of customizing the interface to a particular operator and/or procedure, as well as the need for on-site studies to assess its merit in the clinical realm.

Index Terms: augmented reality, robot assistance, image-guided interventions
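To make the forbidden-region virtual fixture idea concrete, here is a minimal Unity C# sketch that checks the rendered robot tip against vital structures approximated by bounding spheres and warns the operator on a violation. This is an illustration only: the class, field, and region names (ForbiddenRegionGuard, RobotTip, etc.) are hypothetical, not the platform's actual API.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of a forbidden-region virtual fixture check.
public class ForbiddenRegionGuard : MonoBehaviour
{
    // A vital structure approximated by a bounding sphere derived
    // from the segmented anatomy (assumed representation).
    public struct ForbiddenRegion
    {
        public Vector3 Center;  // center in holographic scene coordinates
        public float Radius;    // radius including a safety margin
        public string Label;    // e.g. a structure name
    }

    public List<ForbiddenRegion> Regions = new List<ForbiddenRegion>();
    public Transform RobotTip;  // pose of the actuated robot model's tip

    void Update()
    {
        // Alert whenever the tip enters any forbidden region.
        foreach (var region in Regions)
        {
            if (Vector3.Distance(RobotTip.position, region.Center) < region.Radius)
            {
                Debug.LogWarning($"Virtual fixture violated: {region.Label}");
            }
        }
    }
}
```

In the actual platform such a check would presumably run on the desktop PC and push the alert to the HMD over the two-way link; the sketch keeps everything in one Unity component for brevity.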
Award ID(s):
1646566
NSF-PAR ID:
10130226
Journal Name:
IEEE International Conference on Bioinformatics and Bioengineering (BIBE)
Sponsoring Org:
National Science Foundation
More Like this
  1. Puig Puig, Anna, et al. (Eds.)
    Motivated by the potential of holographic augmented reality (AR) to offer an immersive 3D appreciation of morphology and anatomy, the purpose of this work is to develop and assess an interface for image-based planning of prostate interventions with a head-mounted display (HMD). The computational system is a data and command pipeline linking a magnetic resonance imaging (MRI) scanner/data and the operator, and it includes modules dedicated to image processing and segmentation, structure rendering, trajectory planning, and spatial co-registration. The interface was developed with the Unity3D Engine (C#) and deployed and tested on a HoloLens HMD. For ergonomics in the surgical suite, the system was endowed with hands-free interactive manipulation of images and the holographic scene via hand gestures and voice commands. The system was tested in silico using MRI and ultrasound datasets of prostate phantoms. The holographic AR scene rendered by the HoloLens HMD was subjectively found superior to desktop-based volume or 3D rendering with regard to structure detection, appreciation of spatial relationships, planning of access paths, and manual co-registration of MRI and ultrasound. By inspecting the virtual trajectory superimposed on the rendered structures and MR images, the operator observes collisions of the needle path with vital structures (e.g., the urethra) and adjusts accordingly. Holographic AR interfacing with a wireless HMD endowed with hands-free gesture and voice control is a promising technology; studies are needed to systematically assess the clinical merit of such systems and the functionalities they require.
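    Given that this interface was built with the Unity3D Engine (C#) and deployed on a HoloLens, hands-free voice commands of the kind described can be registered with Unity's KeywordRecognizer. The sketch below is a minimal assumed example; the three-command vocabulary and the VoiceCommandController name are illustrative, not the paper's actual command set.

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

// Minimal sketch of hands-free voice control on HoloLens
// (hypothetical vocabulary; not the paper's command set).
public class VoiceCommandController : MonoBehaviour
{
    public Transform HolographicScene;  // root of the rendered structures
    private KeywordRecognizer recognizer;

    void Start()
    {
        recognizer = new KeywordRecognizer(new[] { "rotate", "zoom in", "zoom out" });
        recognizer.OnPhraseRecognized += OnPhrase;
        recognizer.Start();
    }

    private void OnPhrase(PhraseRecognizedEventArgs args)
    {
        switch (args.text)
        {
            case "rotate":   HolographicScene.Rotate(Vector3.up, 15f); break;  // yaw the scene
            case "zoom in":  HolographicScene.localScale *= 1.1f;      break;  // enlarge hologram
            case "zoom out": HolographicScene.localScale *= 0.9f;      break;  // shrink hologram
        }
    }

    void OnDestroy()
    {
        if (recognizer != null)
        {
            if (recognizer.IsRunning) recognizer.Stop();
            recognizer.Dispose();
        }
    }
}
```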
  2. This work presents a platform that integrates a customized MRI data acquisition scheme with reconstruction and three-dimensional (3D) visualization modules, along with a module for controlling an MRI-compatible robotic device, to facilitate robot-assisted, MRI-guided interventional procedures. Using dynamically acquired MRI data, the computational framework of the platform generates and updates a 3D model representing the area of the procedure (AoP). To image structures of interest in the AoP that do not reside in the same or parallel slices, the MRI acquisition scheme was modified to collect a multi-slice set of mutually oblique slices, termed composing slices. Moreover, this approach interleaves the collection of the composing slices so that the same k-space segments of all slices are collected at similar time instances. This time matching of the k-space segments results in spatial matching of the imaged objects across the individual composing slices. The composing slices were used to generate and update the 3D model of the AoP. The MRI acquisition scheme was evaluated with computer simulations and experimental studies. Computer simulations demonstrated that k-space segmentation and time-matched interleaved acquisition of these segments provide spatial matching of the structures imaged with composing slices. Experimental studies used the platform to image the maneuvering of an MRI-compatible manipulator that carried tubing filled with MRI contrast agent. In vivo studies imaging the abdomen and contrast-enhanced heart of free-breathing subjects, without cardiac triggering, demonstrated spatial matching of the imaged anatomies across the composing planes. The described interventional MRI framework could assist in performing real-time MRI-guided interventions.
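    The time-matched interleaving can be illustrated with a short scheduling sketch: instead of acquiring all k-space segments of one slice before moving to the next, the same segment index of every composing slice is acquired in adjacent time slots. The C# below simply prints such an ordering; the slice and segment counts are arbitrary examples, not the study's parameters.

```csharp
using System;
using System.Collections.Generic;

// Prints a segment-major (time-matched, interleaved) acquisition order:
// segment s of every composing slice is collected before segment s + 1.
public static class InterleavedAcquisition
{
    public static IEnumerable<(int Slice, int Segment)> Order(int slices, int segments)
    {
        for (int s = 0; s < segments; s++)
            for (int slice = 0; slice < slices; slice++)
                yield return (slice, s);
    }

    public static void Main()
    {
        int t = 0;  // acquisition time slot
        // Example: three mutually oblique composing slices, four k-space segments each.
        foreach (var (slice, segment) in Order(slices: 3, segments: 4))
            Console.WriteLine($"t={t++}: slice {slice}, segment {segment}");
    }
}
```

    Because the same segment of all slices lands in neighboring time slots, a moving structure appears at nearly the same position in every composing slice, which is the spatial matching the abstract describes.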
  3. Abstract

    Background

    User interfaces play a vital role in the planning and execution of an interventional procedure. The objective of this study is to investigate the effect of using different user interfaces for planning transrectal robot‐assisted MR‐guided prostate biopsy (MRgPBx) in an augmented reality (AR) environment.

    Method

    End-user studies were conducted by simulating an MRgPBx system with end- and side-firing modes. Information from the system to the operator was rendered on a HoloLens as the output interface. A joystick, mouse/keyboard, and holographic menus were used as input interfaces to the system.

    Results

    The studies indicated that using a joystick improved the interactive capacity and enabled the operator to plan the MRgPBx in less time. It efficiently captured the operator's commands to manipulate the augmented environment representing the state of the MRgPBx system.

    Conclusions

    The study demonstrates an alternative to conventional input interfaces for interacting with and manipulating an AR environment in the context of MRgPBx planning.
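    As a rough illustration of the joystick input interface that this study found fastest, the Unity C# sketch below maps the default joystick axes to adjustments of a planned trajectory object. The axis names follow Unity's default Input Manager; the JoystickTrajectoryInput class and PlannedTrajectory object are stand-ins, not the study's implementation.

```csharp
using UnityEngine;

// Hypothetical joystick mapping for adjusting a planned biopsy trajectory.
public class JoystickTrajectoryInput : MonoBehaviour
{
    public Transform PlannedTrajectory;  // holographic needle-path object
    public float AngularSpeed = 30f;     // degrees per second

    void Update()
    {
        // The stick tilts the planned needle path about two axes.
        float pitch = Input.GetAxis("Vertical");
        float yaw   = Input.GetAxis("Horizontal");

        PlannedTrajectory.Rotate(
            pitch * AngularSpeed * Time.deltaTime,
            yaw   * AngularSpeed * Time.deltaTime,
            0f,
            Space.Self);
    }
}
```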

     
  4.
    Ani-Bot is a modular robotics system that allows users to control their DIY robots using Mixed-Reality Interaction (MRI). The system takes advantage of MRI to let users visually program the robot through the augmented view of a Head-Mounted Display (HMD). In this paper, we first explain the design of the Mixed-Reality (MR) ready modular robotics system, which allows users to perform MRI as soon as they finish assembling the robot. Then, we elaborate on the augmentations provided by the MR system in the three primary phases of a construction kit's lifecycle: Creation, Tweaking, and Usage. Finally, we demonstrate Ani-Bot with four application examples and evaluate the system with a two-session user study. The results of our evaluation indicate that Ani-Bot successfully embeds MRI into the lifecycle (Creation, Tweaking, Usage) of DIY robotics and shows strong potential for delivering an enhanced user experience.