Title: A Virtual Reality Interface for Interactions with Spatiotemporal 3D Data
Traditional interfaces for interacting with 3D models in virtual environments lack support for spatiotemporal 3D models, such as the point clouds and meshes generated by markerless capture systems. We present a virtual reality (VR) interface that enables the user to perform spatial and temporal interactions with spatiotemporal 3D models. To accommodate the high volume of spatiotemporal data, we provide a data format for spatiotemporal 3D models that achieves an average speedup of 3.84 and a space reduction of 43.9% over traditional model file formats. We enable the user to manipulate spatiotemporal 3D data using gestures that are intuitive from real-world experience, or through a VR user interface similar to traditional 2D visual interactions.
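The abstract does not describe the data format's internals, but the core idea of a spatiotemporal model format can be sketched: store all timestamped frames in one indexed binary container so a player can seek to a point in time without re-parsing per-frame text files. The code below is a minimal illustration under that assumption, not the paper's actual format; the function names and layout (`pack_frames`, a timestamp/offset index followed by packed XYZ floats) are hypothetical.

```python
import struct

def pack_frames(frames):
    """frames: list of (timestamp, [(x, y, z), ...]) -> one binary blob.

    Layout (illustrative only): a frame count, then a (timestamp, byte
    offset, point count) index entry per frame, then the packed points.
    """
    body = bytearray()
    index = []  # (timestamp, offset into body, point count)
    for t, points in frames:
        index.append((t, len(body), len(points)))
        for x, y, z in points:
            body += struct.pack("<fff", x, y, z)
    header = struct.pack("<I", len(index))
    for t, off, n in index:
        header += struct.pack("<fII", t, off, n)
    return bytes(header) + bytes(body)

def frame_at(blob, t):
    """Return the points of the latest frame with timestamp <= t."""
    (count,), pos = struct.unpack_from("<I", blob), 4
    index = [struct.unpack_from("<fII", blob, pos + i * 12) for i in range(count)]
    base = pos + count * 12
    ts, off, n = max((e for e in index if e[0] <= t), key=lambda e: e[0])
    return [struct.unpack_from("<fff", blob, base + off + i * 12) for i in range(n)]
```

Compared with one OBJ/PLY file per frame, a single indexed container avoids repeated file opens and text parsing during playback, which is the kind of overhead a speedup over "traditional model file formats" would come from.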
Award ID(s):
1730183
PAR ID:
10056163
Author(s) / Creator(s):
Date Published:
Journal Name:
Multimedia Modeling (MMM) 2018
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Recent innovations in virtual and mixed-reality (VR/MR) technologies have enabled innovative hands-on training applications in high-risk/high-value fields such as medicine, flight, and worker safety. Here, we present a detailed description of a novel VR/MR tactile user interaction/interface (TUI) hardware and software development framework that enables the rapid and cost-effective no-code development, optimization, and distribution of fully authentic hands-on VR/MR laboratory training experiences in the physical and life sciences. We applied our framework to the development and optimization of an introductory pipette calibration activity that is often carried out in real chemistry and biochemistry labs. Our approach provides users with nuanced real-time feedback on both their psychomotor skills during data acquisition and their attention to detail when conducting data analysis procedures. The cost-effectiveness of our approach relative to traditional face-to-face science labs improves access to quality hands-on science lab experiences. Importantly, the no-code nature of this Hands-On Virtual-Reality (HOVR) Lab platform enables faculty to iteratively optimize VR/MR experiences to meet their students' targeted needs without costly software development cycles. Our platform also accommodates TUIs using either standard virtual-reality controllers (VR TUI mode) or fully functional hand-held physical lab tools (MR TUI mode). In the latter case, physical lab tools are strategically retrofitted with optical tracking markers to enable tactile, experimental, and analytical authenticity in scientific experimentation. Preliminary user study data highlights the strengths and weaknesses of our generalized approach with regard to students' affective and cognitive learning outcomes.
  2. In many complex tasks, a remote expert may need to assist a local user or to guide their actions in the local user's environment. Existing solutions allow multiple users to collaborate remotely using high-end Augmented Reality (AR) and Virtual Reality (VR) head-mounted displays (HMDs). In this paper, we propose a portable remote collaboration approach, with the integration of AR and VR devices, both running on mobile platforms, to tackle the challenges of existing approaches. The AR mobile platform processes the live video and measures the 3D geometry of the local user's environment. The 3D scene is then transmitted and rendered on the remote side on a mobile VR device, along with a simple and effective user interface, which allows a remote expert to easily manipulate the 3D scene on the VR platform and to guide the local user to complete tasks in the local environment.
  3. Hyperion is a 3D visualization platform for optical design. It provides a fully immersive, intuitive, and interactive 3D user experience by leveraging existing AR/VR technologies. It enables the visualization of models of folded freeform optical systems in a dynamic 3D environment. The frontend user experience is supported by the computational ray-tracing engine of Eikonal+, an optical design research software currently being developed. We have built a cross-platform lightweight version of Eikonal+ that can communicate with any user interface or other scientific software. We have also demonstrated a prototype of the Hyperion 3D user experience using a HoloLens AR display.
  4. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR. 
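The abstract above describes decoding finger-wise forces from forearm electromyography (EMG) with a neural-network model. As a rough illustration of the decode-forces-from-muscle-signals idea only, the sketch below substitutes a simple ridge regression over per-channel RMS features for the paper's actual model; the channel counts, feature choice, and function names are all assumptions, and the training data is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def rms_features(emg_window):
    """emg_window: (samples, channels) array -> per-channel RMS feature vector."""
    return np.sqrt((emg_window ** 2).mean(axis=0))

# Synthetic stand-in data: 8 EMG channels, 5 finger forces, linear ground truth.
true_W = rng.normal(size=(8, 5))
X = np.abs(rng.normal(size=(200, 8)))            # RMS features are non-negative
Y = X @ true_W + 0.01 * rng.normal(size=(200, 5))

# Ridge regression: W = (X^T X + lam I)^-1 X^T Y
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(8), X.T @ Y)

def decode_forces(features, W=W):
    """Map an RMS feature vector to estimated per-finger forces."""
    return features @ W
```

A real system would replace the linear map with the paper's neural network and add per-user calibration, but the pipeline shape (windowed EMG, features, learned regressor, per-finger force output) is the same.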
  5. Virtual Reality (VR) headsets with embedded eye trackers are appearing as consumer devices (e.g. HTC Vive Eye, FOVE). These devices could be used in VR-based education (e.g., a virtual lab, a virtual field trip) in which a live teacher guides a group of students. The eye tracking could enable better insights into students’ activities and behavior patterns. For real-time insight, a teacher’s VR environment can display student eye gaze. These visualizations would help identify students who are confused/distracted, and the teacher could better guide them to focus on important objects. We present six gaze visualization techniques for a VR-embedded teacher’s view, and we present a user study to compare these techniques. The results suggest that a short particle trail representing eye trajectory is promising. In contrast, 3D heatmaps (an adaptation of traditional 2D heatmaps) for visualizing gaze over a short time span are problematic. 
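The "short particle trail" finding above can be sketched concretely: keep a small ring buffer of recent 3D gaze samples and fade each particle with age. This is a minimal illustration of the general technique, not the study's implementation; the class name, buffer size, and lifetime values are hypothetical.

```python
from collections import deque

class GazeTrail:
    """Ring buffer of recent gaze points, rendered as age-faded particles."""

    def __init__(self, max_particles=30, lifetime=0.5):
        self.particles = deque(maxlen=max_particles)  # (timestamp, (x, y, z))
        self.lifetime = lifetime                      # seconds until fully faded

    def add_sample(self, t, gaze_point):
        """Record one eye-tracker sample; old entries drop off automatically."""
        self.particles.append((t, gaze_point))

    def render_list(self, now):
        """Return (position, alpha) pairs, alpha fading linearly with age."""
        out = []
        for t, p in self.particles:
            age = now - t
            if 0.0 <= age <= self.lifetime:
                out.append((p, 1.0 - age / self.lifetime))
        return out
```

Because only the last fraction of a second of gaze is shown, the trail conveys trajectory direction without the accumulation-over-time problem the study reports for short-time-span 3D heatmaps.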