

Title: A Prototype Holographic Augmented Reality Interface for Image-Guided Prostate Cancer Interventions
Motivated by the potential of holographic augmented reality (AR) to offer an immersive 3D appreciation of morphology and anatomy, the purpose of this work is to develop and assess an interface for image-based planning of prostate interventions with a head-mounted display (HMD). The computational system is a data and command pipeline linking a magnetic resonance imaging (MRI) scanner and its data to the operator; it includes modules dedicated to image processing and segmentation, structure rendering, trajectory planning, and spatial co-registration. The interface was developed with the Unity3D engine (C#) and deployed and tested on a HoloLens HMD. For ergonomics in the surgical suite, the system supports hands-free interactive manipulation of images and the holographic scene via hand gestures and voice commands. The system was tested in silico using MRI and ultrasound datasets of prostate phantoms. The holographic AR scene rendered by the HoloLens HMD was subjectively found superior to desktop-based volume or 3D rendering with regard to structure detection, appreciation of spatial relationships, planning of access paths, and manual co-registration of MRI and ultrasound. By inspecting the virtual trajectory superimposed on the rendered structures and MR images, the operator can observe collisions of the needle path with vital structures (e.g., the urethra) and adjust the plan accordingly. Holographic AR interfacing with a wireless HMD endowed with hands-free gesture and voice control is a promising technology; studies are needed to systematically assess the clinical merit of such systems and the functionalities they require.
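For a concrete sense of the trajectory collision check described above, the following is a minimal Unity (C#) sketch: it sweeps a thin capsule along the planned needle path and flags any segmented vital structure it would cross. The class, field, and layer names are illustrative assumptions, not the authors' implementation.

```csharp
using UnityEngine;

// Minimal sketch (illustrative, not the authors' code): sweep a thin capsule
// along the planned needle path and report any segmented vital structure
// (e.g. urethra) whose collider it intersects.
public class TrajectoryCollisionCheck : MonoBehaviour
{
    public Transform entryPoint;        // planned perineal/skin entry pose
    public Transform targetPoint;       // planned target inside the prostate
    public float needleRadius = 0.001f; // ~1 mm needle, in scene units (m)
    public LayerMask vitalStructures;   // layer holding urethra, vessels, ...

    // Returns true and reports the first hit if the path crosses a vital structure.
    public bool PathCollides(out RaycastHit hit)
    {
        Vector3 path = targetPoint.position - entryPoint.position;
        return Physics.SphereCast(entryPoint.position, needleRadius,
                                  path.normalized, out hit, path.magnitude,
                                  vitalStructures);
    }

    void Update()
    {
        if (PathCollides(out RaycastHit hit))
            Debug.LogWarning($"Planned path intersects {hit.collider.name}; adjust trajectory.");
    }
}
```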
Award ID(s):
1646566
NSF-PAR ID:
10223690
Author(s) / Creator(s):
Editor(s):
Puig Puig, Anna, et al.
Date Published:
Journal Name:
Eurographics Workshop on Visual Computing for Biomedicine
ISSN:
2070-5786
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The emerging potential of augmented reality (AR) to improve 3D medical image visualization for diagnosis, by immersing the user in the 3D morphology, is further enhanced by the advent of wireless head-mounted displays (HMD). Such information-immersive capabilities may also enhance the planning and visualization of interventional procedures. To this end, we introduce a computational platform that generates an augmented reality holographic scene fusing pre-operative magnetic resonance imaging (MRI) sets, segmented anatomical structures, and an actuated model of an interventional robot for performing MRI-guided and robot-assisted interventions. The interface enables the operator to manipulate the presented images and rendered structures, as well as to control the robot, using voice and gestures. The software uses forbidden-region virtual fixtures that alert the operator to collisions with vital structures. The platform was tested in silico with a HoloLens HMD. To address the limited computational power of the HMD, we deployed the platform on a desktop PC with two-way communication to the HMD. Operation studies demonstrated the functionality and underscored the importance of customizing the interface to a particular operator and/or procedure, as well as the need for on-site studies to assess its merit in the clinical realm.
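The desktop-offloading arrangement mentioned above can be sketched as a simple two-way link in which the PC pushes compact scene-state updates to the HMD and reads back operator commands. The host, port, and message format below are assumptions for illustration; the paper's actual communication protocol is not specified in this abstract.

```csharp
using System;
using System.Net.Sockets;
using System.Text;

// Illustrative sketch of the PC-side link: send compact scene-state updates
// (here a JSON string) to the HMD and read back operator commands.
// Address, port, and message layout are assumptions, not the paper's protocol.
public class HmdLink : IDisposable
{
    private readonly TcpClient client;
    private readonly NetworkStream stream;

    public HmdLink(string hmdAddress, int port = 9050)
    {
        client = new TcpClient(hmdAddress, port);
        stream = client.GetStream();
    }

    // Send one newline-terminated state update to the HoloLens.
    public void SendState(string jsonState)
    {
        byte[] payload = Encoding.UTF8.GetBytes(jsonState + "\n");
        stream.Write(payload, 0, payload.Length);
    }

    // Blocking read of one operator command (gesture/voice event) returned by the HMD.
    public string ReceiveCommand()
    {
        byte[] buffer = new byte[1024];
        int n = stream.Read(buffer, 0, buffer.Length);
        return Encoding.UTF8.GetString(buffer, 0, n);
    }

    public void Dispose()
    {
        stream.Dispose();
        client.Close();
    }
}
```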
  2. Background

    User interfaces play a vital role in the planning and execution of an interventional procedure. The objective of this study is to investigate the effect of using different user interfaces for planning transrectal robot‐assisted MR‐guided prostate biopsy (MRgPBx) in an augmented reality (AR) environment.

    Method

    End-user studies were conducted by simulating an MRgPBx system with end- and side-firing modes. Information from the system was rendered to the operator on a HoloLens as the output interface. A joystick, mouse/keyboard, and holographic menus were used as input interfaces to the system.

    Results

    The studies indicated that using a joystick improved the interactive capacity and enabled the operator to plan MRgPBx in less time. The joystick efficiently captured the operator's commands for manipulating the augmented environment representing the state of the MRgPBx system.

    Conclusions

    The study demonstrates an alternative to conventional input interfaces for interacting with and manipulating an AR environment in the context of MRgPBx planning.
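As an illustration of how a joystick could drive such planning, the following Unity (C#) sketch maps stick axes to in-plane translation and insertion depth of a holographic target marker. The axis names and gains are assumptions (the depth axis must be defined in Unity's Input Manager), not the study's implementation.

```csharp
using UnityEngine;

// Illustrative joystick mapping: translate the planned biopsy target in-plane
// with the left stick and adjust insertion depth with an additional axis.
// "Horizontal"/"Vertical" are Unity's default axes; "DepthAxis" is an assumed
// custom axis that would need to be defined in the Input Manager.
public class JoystickTargetPlanner : MonoBehaviour
{
    public Transform biopsyTarget;   // holographic target marker in the AR scene
    public float panSpeed = 0.01f;   // metres per second at full stick deflection
    public float depthSpeed = 0.005f;

    void Update()
    {
        float x = Input.GetAxis("Horizontal"); // left stick, left/right
        float y = Input.GetAxis("Vertical");   // left stick, up/down
        float d = Input.GetAxis("DepthAxis");  // assumed custom axis for depth

        Vector3 delta = new Vector3(x * panSpeed, y * panSpeed, d * depthSpeed)
                        * Time.deltaTime;
        biopsyTarget.Translate(delta, Space.Self);
    }
}
```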

     
  3. Simultaneous visualization of the teeth and periodontium is of significant clinical interest for image-based monitoring of periodontal health. We recently reported the application of a dual-modality photoacoustic-ultrasound (PA-US) imaging system for resolving periodontal anatomy and periodontal pocket depths in humans. This work utilized a linear array transducer attached to a stepper motor to generate 3D images via maximum intensity projection. This prior work also used a medical head immobilizer to reduce artifacts during volume rendering caused by motion from the subject (e.g., breathing, minor head movements). However, this solution does not completely eliminate motion artifacts, while also complicating the imaging procedure and causing patient discomfort. To address this issue, we report the implementation of an image registration technique to correctly align B-mode PA-US images and generate artifact-free 2D cross-sections. Application of the deshaking technique to PA phantoms revealed 80% similarity to the ground truth when shaking was intentionally applied during stepper motor scans. Images from handheld sweeps could also be deshaken using an LED PA-US scanner. In ex vivo porcine mandibles, pigmentation of the enamel was well-estimated to within 0.1 mm error. The pocket depth measured in a healthy human subject was also in good agreement with our prior study. This report demonstrates that a modality-independent registration technique can be applied to clinically relevant PA-US scans of the periodontium to reduce both the skill burden on the operator and discomfort for the subject, while showing potential for handheld clinical periodontal imaging.
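The 80% figure above is an image-similarity score between the deshaken result and the ground-truth scan; the exact metric is not given in this abstract, so the sketch below assumes normalized cross-correlation purely for illustration.

```csharp
using System;

// Illustrative normalized cross-correlation between two co-registered images,
// given as flat intensity arrays. The actual similarity metric used in the
// cited study is not specified in the abstract; NCC is assumed for this sketch.
public static class ImageSimilarity
{
    public static double NormalizedCrossCorrelation(float[] a, float[] b)
    {
        if (a.Length != b.Length)
            throw new ArgumentException("Images must have the same number of pixels.");

        double meanA = 0, meanB = 0;
        for (int i = 0; i < a.Length; i++) { meanA += a[i]; meanB += b[i]; }
        meanA /= a.Length;
        meanB /= b.Length;

        double num = 0, varA = 0, varB = 0;
        for (int i = 0; i < a.Length; i++)
        {
            double da = a[i] - meanA, db = b[i] - meanB;
            num  += da * db;
            varA += da * da;
            varB += db * db;
        }
        return num / Math.Sqrt(varA * varB); // 1.0 = identical, 0 = uncorrelated
    }
}
```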

     
  4. In many complex tasks, a remote expert may need to assist a local user or guide their actions in the local user's environment. Existing solutions allow multiple users to collaborate remotely using high-end augmented reality (AR) and virtual reality (VR) head-mounted displays (HMD). In this paper, we propose a portable remote collaboration approach that integrates AR and VR devices, both running on mobile platforms, to tackle the challenges of existing approaches. The AR mobile platform processes the live video and measures the 3D geometry of the local user's environment. The 3D scene is then transmitted and rendered on the remote side on a mobile VR device, along with a simple and effective user interface that allows a remote expert to easily manipulate the 3D scene on the VR platform and guide the local user to complete tasks in the local environment.
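As a hedged C# sketch of the kind of payload such a system exchanges, the snippet below packs a reconstructed surface mesh into a flat byte buffer on the AR side and rebuilds it on the VR side. The wire format (vertex count, xyz floats, triangle indices) is an assumption for illustration, not the paper's protocol.

```csharp
using System.IO;
using UnityEngine;

// Illustrative packing of a reconstructed mesh for transmission from the AR
// device to the VR client. The byte layout is assumed for this sketch only.
public static class MeshPacket
{
    public static byte[] Pack(Mesh mesh)
    {
        Vector3[] verts = mesh.vertices;
        int[] tris = mesh.triangles;

        using (var ms = new MemoryStream())
        using (var w = new BinaryWriter(ms))
        {
            w.Write(verts.Length);
            foreach (Vector3 v in verts) { w.Write(v.x); w.Write(v.y); w.Write(v.z); }

            w.Write(tris.Length);
            foreach (int t in tris) w.Write(t);

            return ms.ToArray();
        }
    }

    public static Mesh Unpack(byte[] data)
    {
        using (var r = new BinaryReader(new MemoryStream(data)))
        {
            var verts = new Vector3[r.ReadInt32()];
            for (int i = 0; i < verts.Length; i++)
                verts[i] = new Vector3(r.ReadSingle(), r.ReadSingle(), r.ReadSingle());

            var tris = new int[r.ReadInt32()];
            for (int i = 0; i < tris.Length; i++) tris[i] = r.ReadInt32();

            var mesh = new Mesh { vertices = verts, triangles = tris };
            mesh.RecalculateNormals();
            return mesh;
        }
    }
}
```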