%A Morales Mojica, Cristina
%A Tsekos, Nikolaos
%A Velazco-Garcia, Jose
%A Zhao, Haoran
%A Seimenis, Ioannis
%A Leiss, Ernst
%A Shah, Dipan
%A Webb, Andrew
%A Becker, Aaron
%A Tsiamyrtzis, Panagiotis
%D 2019
%M OSTI ID: 10163101
%P Medium: X
%T Interactive and Immersive Image-Guided Control of Interventional Manipulators with a Prototype Holographic Interface
%U https://doi.org/10.1109/BIBE.2019.00186
%X The emerging potential of augmented reality (AR) to improve 3D medical image visualization for diagnosis by immersing the user in 3D morphology is further enhanced by the advent of wireless head-mounted displays (HMDs). Such information-immersive capabilities may also enhance the planning and visualization of interventional procedures. To this end, we introduce a computational platform that generates an augmented reality holographic scene fusing pre-operative magnetic resonance imaging (MRI) sets, segmented anatomical structures, and an actuated model of an interventional robot for performing MRI-guided and robot-assisted interventions. The interface enables the operator to manipulate the presented images and rendered structures using voice and gestures, as well as to control the robot. The software uses forbidden-region virtual fixtures that alert the operator to collisions with vital structures. The platform was tested in silico with a HoloLens HMD. To address the limited computational power of the HMD, we deployed the platform on a desktop PC with two-way communication to the HMD. Operational studies demonstrated the functionality and underscored the importance of customizing the interface to fit a particular operator and/or procedure, as well as the need for on-site studies to assess its merit in the clinical realm.
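As a rough illustration of the forbidden-region virtual fixture mentioned in the abstract, the sketch below checks whether a planned robot tool-tip position comes within a safety margin of a segmented vital structure and raises an alert if so. This is a minimal sketch under assumed names and data (forbidden_region_alert, the synthetic vessel surface, and the margin value are all hypothetical), not the platform's published implementation.

    import numpy as np

    def forbidden_region_alert(tool_tip, structure_points, margin_mm=5.0):
        """Return True if the tool tip lies within margin_mm of any point
        sampled from a segmented vital structure (the forbidden region).
        All names and the margin are illustrative assumptions."""
        # Distance from the tool tip to every surface sample of the structure.
        dists = np.linalg.norm(structure_points - tool_tip, axis=1)
        return bool(dists.min() <= margin_mm)

    # Hypothetical usage: a sphere of surface samples standing in for a
    # segmented vessel, and a planned tool-tip position from the robot model.
    rng = np.random.default_rng(0)
    sphere = rng.normal(size=(2000, 3))
    sphere = 10.0 * sphere / np.linalg.norm(sphere, axis=1, keepdims=True)  # radius 10 mm
    vessel_surface = sphere + np.array([0.0, 0.0, 40.0])                    # centered 40 mm deep

    tool_tip = np.array([0.0, 0.0, 28.0])  # planned tip position near the structure
    if forbidden_region_alert(tool_tip, vessel_surface, margin_mm=5.0):
        print("ALERT: planned tool position violates a forbidden-region fixture")

In a real deployment such a check would run against the segmented anatomical meshes in the holographic scene each time the operator commands a robot motion, which is what allows the interface to warn of collisions with vital structures before they occur.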