- NSF-PAR ID:
- 10315356
- Date Published:
- Journal Name:
- IEEE XR for Healthcare and Wellbeing Workshop
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
External ventricular drain (EVD) placement is a common yet challenging neurosurgical procedure in which a catheter is placed into the brain's ventricular system, and surgeons require prolonged training to improve catheter placement accuracy. In this paper, we introduce NeuroLens, an Augmented Reality (AR) system that provides neurosurgeons with guidance that aids them in completing an EVD catheter placement. NeuroLens builds on prior work in AR-assisted EVD to present a registered hologram of a patient's ventricles to the surgeon, and uniquely incorporates guidance on the EVD catheter's trajectory, angle of insertion, and distance to the target. The guidance is enabled by tracking the EVD catheter. We evaluate NeuroLens via a study with 33 medical students, in which we analyzed the students' EVD catheter insertion accuracy and completion time, eye gaze patterns, and qualitative responses. Our study, in which NeuroLens was used to aid students in inserting an EVD catheter into a realistic phantom model of a human head, demonstrated the potential of NeuroLens as a tool to aid and educate novice neurosurgeons. On average, the use of NeuroLens improved the EVD placement accuracy of year 1 students by 39.4% and of year 2-4 students by 45.7%. Furthermore, students who focused more on the contextual guidance provided by NeuroLens achieved better results.
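The abstract above does not detail how NeuroLens computes its contextual guidance, but the cues it names (trajectory, angle of insertion, and distance to target) follow from basic geometry once the catheter pose is tracked. The sketch below is a minimal illustration of that geometry, assuming the AR system already supplies a tracked tip position and shaft direction; the function name, inputs, and example values are hypothetical and not taken from the paper.

```python
import numpy as np

def guidance_metrics(tip, direction, target):
    """Illustrative guidance cues for a tracked catheter (hypothetical helper).

    tip:       3D position of the tracked catheter tip (mm)
    direction: vector along the catheter shaft, pointing toward insertion
    target:    3D position of the planned target (mm)

    Returns the remaining distance to the target and the angular deviation
    between the catheter axis and the tip-to-target line.
    """
    to_target = np.asarray(target, float) - np.asarray(tip, float)
    distance = np.linalg.norm(to_target)

    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)

    # Angle between the shaft axis and the ideal line toward the target.
    cos_angle = np.clip(np.dot(d, to_target / distance), -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_angle))
    return distance, angle_deg

# Example with made-up coordinates: tip about 62 mm from the target,
# shaft tilted a couple of degrees off the ideal line.
dist, ang = guidance_metrics(tip=[10, 20, 30],
                             direction=[0.0, 0.0, 1.0],
                             target=[12, 22, 92])
print(f"distance to target: {dist:.1f} mm, angular deviation: {ang:.1f} deg")
```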
-
Cardiac interventional procedures are often performed under fluoroscopic guidance, exposing both the patient and operators to ionizing radiation. To reduce this risk of radiation exposure, we are exploring the use of photoacoustic imaging paired with robotic visual servoing for cardiac catheter visualization and surgical guidance. A cardiac catheterization procedure was performed on two in vivo swine after inserting an optical fiber into the cardiac catheter to produce photoacoustic signals from the tip of the fiber-catheter pair. A combination of photoacoustic imaging and robotic visual servoing was employed to visualize and maintain constant sight of the catheter tip in order to guide the catheter through the femoral or jugular vein, toward the heart. Fluoroscopy provided initial ground truth estimates for 1D validation of the catheter tip positions, and these estimates were refined using a 3D electromagnetic-based cardiac mapping system as the ground truth. The 1D and 3D root mean square errors ranged from 0.25 to 2.28 mm and from 1.24 to 1.54 mm, respectively. The catheter tip was additionally visualized at three locations within the heart: (1) inside the right atrium, (2) in contact with the right ventricular outflow tract, and (3) inside the right ventricle. Lasered regions of cardiac tissue were resected for histopathological analysis, which revealed no laser-related tissue damage, despite the use of 2.98 mJ per pulse at the fiber tip (379.2 mJ/cm² fluence). In addition, there was a 19 dB difference in photoacoustic signal contrast when visualizing the catheter tip pre- and post-endocardial tissue contact, which is promising for contact confirmation during cardiac interventional procedures (e.g., cardiac radiofrequency ablation). These results are additionally promising for the use of photoacoustic imaging to guide cardiac interventions by providing depth information and enhanced visualization of catheter tip locations within blood vessels and within the beating heart.
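As a quick sanity check on the reported laser parameters, fluence is pulse energy divided by the illuminated area. The sketch below is an illustration rather than the authors' calculation: it assumes the beam area equals the core area of a 1 mm diameter fiber, a value chosen only because it reproduces a number close to the quoted 379.2 mJ/cm² and not one stated in the abstract.

```python
import math

def fluence_mj_per_cm2(pulse_energy_mj: float, core_diameter_mm: float) -> float:
    """Fluence at the fiber tip, approximating the beam area by the fiber core area."""
    radius_cm = (core_diameter_mm / 10.0) / 2.0   # mm -> cm, diameter -> radius
    area_cm2 = math.pi * radius_cm ** 2
    return pulse_energy_mj / area_cm2

# With the reported 2.98 mJ per pulse and an assumed 1 mm core fiber,
# this yields roughly 379 mJ/cm^2, consistent with the quoted fluence.
print(f"{fluence_mj_per_cm2(2.98, 1.0):.1f} mJ/cm^2")
```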
-
Abstract Successful surgical operations are characterized by preplanning routines to be executed during the actual operation. To achieve this, surgeons rely on experience acquired from cadavers, enabling technologies like virtual reality (VR), and clinical years of practice. However, cadavers offer limited dynamism and realism, as they lack blood and can exhibit tissue degradation and shrinkage, while current VR systems do not provide amplified haptic feedback. This can impair surgical training and increase the likelihood of medical errors. This work proposes a novel Mixed Reality Combination System (MRCS) that pairs Augmented Reality (AR) technology and an inertial measurement unit (IMU) sensor with 3D printed, collagen-based specimens to enhance task performance in planning and execution. To achieve this, the MRCS charts out a path prior to task execution, based on the visual, physical, and dynamic state of a target object, by utilizing surgeon-created virtual imagery that, when projected onto a 3D printed biospecimen as AR, reacts visually to user input on the object's actual physical state. This allows the MRCS to react to the user in real time, displaying new multi-sensory virtual states of an object before the user acts on the actual physical state of that same object, enabling effective task planning. User actions tracked with an integrated 9-degree-of-freedom IMU demonstrate task execution, showing that a user with limited knowledge of the specific anatomy can, under guidance, execute a preplanned task. In addition to surgical planning, this system can be applied more generally in areas such as construction, maintenance, and education.
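The abstract states that user actions are tracked with a 9-degree-of-freedom IMU but does not describe the sensor-fusion step. As one plausible, generic illustration (not the MRCS implementation), the sketch below blends gyroscope integration with an accelerometer-derived tilt estimate using a simple complementary filter; the axis conventions, the omitted magnetometer-based yaw correction, and all parameter values are assumptions.

```python
import math

def complementary_filter(roll, pitch, gyro, accel, dt, alpha=0.98):
    """One update step of a basic complementary filter for tilt estimation.

    roll, pitch: current orientation estimates in radians
    gyro:        (gx, gy, gz) angular rates in rad/s (axis convention assumed)
    accel:       (ax, ay, az) accelerations in m/s^2
    dt:          time step in seconds
    alpha:       weight given to the integrated gyroscope estimate (0..1)
    """
    gx, gy, _ = gyro
    ax, ay, az = accel

    # Short-term estimate: integrate gyroscope rates (smooth, but drifts).
    roll_gyro = roll + gx * dt
    pitch_gyro = pitch + gy * dt

    # Long-term reference: tilt of the gravity vector (noisy, but drift-free).
    roll_acc = math.atan2(ay, az)
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))

    # Blend the two estimates.
    roll_new = alpha * roll_gyro + (1.0 - alpha) * roll_acc
    pitch_new = alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
    return roll_new, pitch_new

# Example: a stationary sensor reading only gravity keeps its estimate near zero.
print(complementary_filter(0.0, 0.0, (0.0, 0.0, 0.0), (0.0, 0.0, 9.81), dt=0.01))
```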
-
Abstract Background In childhood and adolescence, cardiac arrhythmias are often benign in the absence of congenital heart defects. Nevertheless, life-threatening inherited arrhythmogenic syndromes can become clinically manifest in early childhood. As early symptoms may be similar in both conditions, thorough workup is fundamental to avoid delayed diagnosis and misdiagnosis.
Case summary We present the case of a 26-year-old Caucasian female patient who presented with recurrent non-sustained polymorphic wide complex tachycardia. Structural heart disease was excluded by echocardiography as well as cardiac magnetic resonance imaging. Because wide complex extrasystoles and couplets with alternating QRS axis occurred at low levels of physical exertion, catecholaminergic polymorphic ventricular tachycardia (CPVT) was suspected and further investigated. Epinephrine testing, combined with an electrophysiological (EP) study including placement of a coronary sinus catheter and subsequent programmed stimulation, ruled out CPVT and unmasked the wide complex tachycardia as varying aberrant conduction of a focal atrial tachycardia (FAT). 3D-navigated mapping of the FAT revealed a direct parahisian origin. Because of the significantly increased risk of atrioventricular (AV) block during ablation, the patient declined ablation and preferred medical antiarrhythmic therapy.
Discussion Given the consequences of both delayed diagnosis and misdiagnosis of CPVT, a thorough workup is fundamental. In cases of doubt regarding potential aberrant AV conduction in the context of wide complex tachycardia, an invasive EP study may easily and safely prove or rule out aberrancy.
-
Photoacoustic imaging, the combination of optics and acoustics to visualize differences in optical absorption, has recently demonstrated strong viability as a method to provide critical guidance for multiple surgeries and procedures. Benefits include its potential to assist with tumor resection, identify hemorrhaged and ablated tissue, visualize metal implants (e.g., needle tips, tool tips, brachytherapy seeds), track catheter tips, and avoid accidental injury to critical subsurface anatomy (e.g., major vessels and nerves hidden by tissue during surgery). These benefits are significant because they reduce surgical error, associated surgery-related complications (e.g., cancer recurrence, paralysis, excessive bleeding), and accidental patient death in the operating room. This invited review covers multiple aspects of the use of photoacoustic imaging to guide both surgical and related non-surgical interventions. Applicable organ systems span structures within the head to the contents of the toes, with an eye toward surgical and interventional translation for the benefit of patients and for use in operating rooms and interventional suites worldwide. We additionally include a critical discussion of the complete systems and tools needed to maximize the success of surgical and interventional applications of photoacoustic-based technology, spanning light delivery, acoustic detection, and robotic methods. Multiple enabling hardware and software integration components are also discussed, concluding with a summary and future outlook based on the current state of technological developments, recent achievements, and possible new directions.