We have developed a series of course-based undergraduate research experiences, integrated into the course curriculum, centered on the use of 3D visualization and virtual reality for science visualization. One project involves the creation and use of a volumetric renderer for hyperstack images, paired with a biology project in confocal microscopy. Students have worked to develop and test VR-enabled tools for confocal microscopy visualization across headset-based and CAVE-based VR platforms. Two applications of the tool are presented: a rendering of Drosophila primordial germ cells coupled with automated detection and counting, and a database, currently in development, of 3D renderings of pollen grains. Another project involves the development and testing of point cloud renderers. Student work has focused on performance testing and enhancement across a range of 2D and 3D hardware, including native Quest apps. Through the process of developing these tools, students are introduced to scientific visualization concepts while gaining practical experience with programming, software engineering, graphics, shader programming, and cross-platform design.
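The confocal hyperstacks described above are 4D arrays (time, z-slice, height, width). As context for the volumetric rendering work, a common first step before full ray marching is a maximum-intensity projection along the z axis. The sketch below is illustrative only, using hypothetical array shapes rather than the course's actual tool:

```python
import numpy as np

def max_intensity_projection(hyperstack, axis=1):
    """Collapse the z (slice) axis of a (t, z, y, x) confocal
    hyperstack into one 2D image per time point by keeping the
    brightest voxel along each viewing ray."""
    return hyperstack.max(axis=axis)

# Hypothetical 4D stack: 2 time points, 5 z-slices, 64x64 pixels.
stack = np.zeros((2, 5, 64, 64), dtype=np.uint16)
stack[0, 2, 10, 10] = 900  # a bright "cell" in slice 2 of frame 0
mip = max_intensity_projection(stack)
```

Bright features such as labeled germ cells survive the projection, which is why simple thresholding on the projected image can seed automated cell counting.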
Hyperion: A 3D Visualization Platform for Optical Design of Folded Systems
Hyperion is a 3D visualization platform for optical design. It provides a fully immersive, intuitive, and interactive 3D user experience by leveraging existing AR/VR technologies, and it enables the visualization of models of folded freeform optical systems in a dynamic 3D environment. The frontend user experience is supported by the computational ray-tracing engine of Eikonal+, an optical design research software currently under development. We have built a cross-platform, lightweight version of Eikonal+ that can communicate with any user interface or other scientific software. We have also demonstrated a prototype of the Hyperion 3D user experience using a HoloLens AR display.
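The core operation of any sequential ray-tracing engine of this kind is refraction at each optical surface. As a hedged illustration (not Eikonal+'s actual code), the vector form of Snell's law can be sketched as:

```python
import numpy as np

def refract(d, n, n1, n2):
    """Vector form of Snell's law: refract unit direction d at a
    surface with unit normal n, passing from index n1 into n2.
    Returns None on total internal reflection."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    mu = n1 / n2
    cos_i = -np.dot(n, d)          # cosine of the incidence angle
    sin2_t = mu**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        return None                 # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return mu * d + (mu * cos_i - cos_t) * n

# Ray entering glass (n = 1.5) from air at 45 degrees incidence.
d = np.array([np.sin(np.radians(45)), 0.0, np.cos(np.radians(45))])
n = np.array([0.0, 0.0, -1.0])      # surface normal facing the ray
t = refract(d, n, 1.0, 1.5)
```

Tracing a folded freeform system amounts to repeating this refraction (or reflection) at each surface intersection, which is what the 3D frontend then renders as ray bundles.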
- Award ID(s): 1922591
- PAR ID: 10188273
- Date Published:
- Journal Name: Frameless Symposium
- Volume: 2
- Issue: 1
- Page Range / eLocation ID: Article 21
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- The emerging potential of augmented reality (AR) to improve 3D medical image visualization for diagnosis, by immersing the user in the 3D morphology, is further enhanced by the advent of wireless head-mounted displays (HMDs). Such information-immersive capabilities may also enhance the planning and visualization of interventional procedures. To this end, we introduce a computational platform that generates an augmented reality holographic scene fusing pre-operative magnetic resonance imaging (MRI) sets, segmented anatomical structures, and an actuated model of an interventional robot for performing MRI-guided and robot-assisted interventions. The interface enables the operator to manipulate the presented images and rendered structures, as well as to control the robot, using voice and gestures. The software uses forbidden-region virtual fixtures that alert the operator to collisions with vital structures. The platform was tested in silico with a HoloLens HMD. To address the limited computational power of the HMD, we deployed the platform on a desktop PC with two-way communication to the HMD. Operation studies demonstrated the functionality and underscored the importance of customizing the interface to a particular operator and/or procedure, as well as the need for on-site studies to assess its merit in the clinical realm.
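A forbidden-region virtual fixture of the kind this related work describes reduces, in its simplest form, to a proximity test between the instrument tip and a protected structure. The sketch below is a hypothetical simplification (bounding-sphere geometry, made-up names and margins), not the platform's implementation:

```python
import numpy as np

def fixture_alert(tip, region_center, region_radius, margin=2.0):
    """Hypothetical forbidden-region check: the vital structure is
    approximated by a bounding sphere, and an alert fires when the
    instrument tip comes within `margin` (mm) of its surface."""
    dist = np.linalg.norm(np.asarray(tip) - np.asarray(region_center))
    return dist < region_radius + margin

# Tip 10 mm from the center of a 5 mm-radius structure: outside
# the 2 mm safety margin, so no alert.
alarm = fixture_alert([10.0, 0.0, 0.0], [0.0, 0.0, 0.0], 5.0)
```

Real systems would test against segmented anatomical meshes rather than spheres, but the margin-based alert logic is the same idea.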
- We present a mixed-methods user study evaluating augmented reality (AR) as a visualization technique for use in astronomy journal publications. This work is motivated by the highly spatial nature of scientific visualizations employed in astronomy, including spatial reasoning tasks for hypothesis generation and scientific communication. In this 52-person user study, we evaluate two AR approaches (one a traditional tabletop projection, the other with a 'tangible' aid) as spatial 3D visualization techniques, compared against a baseline 3D rendering on a phone. We identify a significant difference in mental and physical workload between the two AR conditions in men and women. Qualitatively, through thematic coding of interviews, we identify notable observed differences ranging from device-specific physical challenges to subdomain-specific utility within astronomy. The confluence of quantitative and qualitative results suggests a tension between workload and engagement when comparing non-AR and AR technologies. We summarize these findings and contribute them for reference in data visualization research furthering novel scientific communications in astronomy journal publications.