- Journal Name: ICAT-EGVE 2019 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments
- Sponsoring Org: National Science Foundation
More Like this
The objective of this research is to compare the effectiveness of different tracking devices underwater. There has been little work in aquatic virtual reality (VR) - i.e., VR systems that can be used in a real underwater environment - and the work that has been done has noted limitations in tracking accuracy. Our initial test results suggest that inertial measurement units (IMUs) work well underwater for orientation tracking, but a different approach is needed for position tracking. Toward this goal, we have waterproofed and evaluated several consumer tracking systems intended for gaming to determine the most effective approaches. First, we informally tested infrared systems and fiducial-marker-based systems, which demonstrated significant limitations of optical approaches. Next, we quantitatively compared an IMU and a magnetic tracking system both above water (as a baseline) and underwater. By comparing the devices' rotation data, we found that the magnetic tracking system implemented by the Razer Hydra is more accurate underwater than a phone-based IMU. This suggests that magnetic tracking systems should be further explored for underwater VR applications.
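The abstract does not specify the metric used to compare the trackers' rotation data, but a standard choice is the angular difference between two orientation quaternions. The following is a minimal sketch of such a comparison; the function name and the sample readings are hypothetical, not taken from the study.

```python
import math

def quat_angle_deg(q1, q2):
    """Angular difference in degrees between two unit quaternions (w, x, y, z).

    The absolute value of the dot product handles the double-cover
    ambiguity (q and -q represent the same rotation).
    """
    dot = abs(sum(a * b for a, b in zip(q1, q2)))
    dot = min(1.0, dot)  # guard against floating-point overshoot
    return math.degrees(2.0 * math.acos(dot))

# Hypothetical readings: a reference orientation vs. a tracker estimate
# that is off by roughly 10 degrees about the x axis.
reference = (1.0, 0.0, 0.0, 0.0)
estimate = (0.9962, 0.0872, 0.0, 0.0)
print(round(quat_angle_deg(reference, estimate), 1))  # → 10.0
```

Averaging this per-sample angular error over a recorded motion trace gives a single accuracy figure per device, which is one way the above-water baseline and underwater conditions could be compared.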
Underwater robots, including Remotely Operated Vehicles (ROVs) and Autonomous Underwater Vehicles (AUVs), are currently used to support underwater missions that are either impossible or too risky for manned systems. In recent years, academia and the robotics industry have paved paths toward tackling the technical challenges of ROV/AUV operations. The level of intelligence of ROVs/AUVs has increased dramatically because of recent advances in low-power embedded computing devices and machine intelligence (e.g., AI). Nonetheless, minimizing human intervention in precise underwater operation remains extremely challenging due to the inherent difficulties and uncertainties of underwater environments. Proximity operations, especially those requiring precise manipulation, are still carried out by ROV systems fully controlled by a human pilot. A workplace-ready and worker-friendly ROV interface that properly simplifies operator control and increases remote-operation confidence is the central challenge for the wide adoption of ROVs.
This paper examines recent advances in virtual telepresence technologies as a solution for lowering the barriers to human-in-the-loop ROV teleoperation. Virtual telepresence refers to Virtual Reality (VR) related technologies that help a user feel present in a hazardous situation without being at the actual location. We present a …
A mixed reality system combining augmented reality, 3D bio-printed physical environments and inertial measurement unit sensors for task planning
Successful surgical operations are characterized by preplanning routines to be executed during actual surgical operations. To achieve this, surgeons rely on experience acquired from the use of cadavers, on enabling technologies like virtual reality (VR), and on clinical years of practice. However, cadavers lack dynamism and realism (they have no blood) and exhibit tissue degradation and shrinkage, while current VR systems do not provide amplified haptic feedback. This can impact surgical training and increase the likelihood of medical errors. This work proposes a novel Mixed Reality Combination System (MRCS) that pairs Augmented Reality (AR) technology and an inertial measurement unit (IMU) sensor with 3D printed, collagen-based specimens to enhance task performance in planning and execution. To achieve this, the MRCS charts out a path prior to user task execution based on a visual, physical, and dynamic environment reflecting the state of a target object. It uses surgeon-created virtual imagery that, when projected onto a 3D printed biospecimen as AR, reacts visually to user input on its actual physical state. This allows real-time user interaction with the MRCS, which displays new multi-sensory virtual states of an object prior to performing on the actual physical state of that …
Adaptive bias and attitude observer on the special orthogonal group for true-north gyrocompass systems: Theory and preliminary results
This article reports an adaptive sensor bias observer and attitude observer operating directly on SO(3) for true-north gyrocompass systems that utilize six-degree-of-freedom inertial measurement units (IMUs) with three-axis accelerometers and three-axis angular rate gyroscopes (without magnetometers). Most present-day low-cost robotic vehicles employ attitude estimation systems that use microelectromechanical system (MEMS) magnetometers, angular rate gyros, and accelerometers to estimate magnetic attitude (roll, pitch, and magnetic heading) with limited heading accuracy. Present-day MEMS gyros are not sensitive enough to dynamically detect the Earth's rotation and thus cannot be used to estimate true-north geodetic heading. Relying on magnetic compasses can be problematic for vehicles that operate in environments with magnetic anomalies and for those requiring high-accuracy navigation, as the limited accuracy of magnetic compasses is typically the largest error source in underwater vehicle navigation systems. Moreover, magnetic compasses must undergo time-consuming recalibration for hard-iron and soft-iron errors every time a vehicle is reconfigured with a new instrument or other payload, as very frequently occurs on oceanographic marine vehicles. In contrast, the gyrocompass system reported herein utilizes fiber optic gyroscope (FOG) IMU angular rate and MEMS accelerometer measurements (without magnetometers) to dynamically estimate the instrument's time-varying true-north …
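The principle behind gyrocompassing can be sketched simply: a stationary, level FOG senses only the Earth's rotation rate (about 7.29e-5 rad/s), and the horizontal component of that rate vector points true north, so heading follows from the x and y gyro channels. The sketch below assumes a level x-forward/y-right/z-down body frame with gyro biases already removed; it is an illustration of the idea, not the article's SO(3) observer, which additionally estimates attitude and biases adaptively.

```python
import math

EARTH_RATE = 7.292115e-5  # rad/s, Earth's sidereal rotation rate

def heading_from_gyro(w_x, w_y):
    """True-north heading (deg) from a level, stationary gyro triad.

    Assumes x-forward / y-right / z-down body axes and that the only
    sensed rotation is the Earth's (vehicle stationary, biases removed).
    In NED coordinates the Earth rate is (R*cos(lat), 0, -R*sin(lat)),
    so yaw rotates the horizontal component into the x/y channels.
    """
    return math.degrees(math.atan2(-w_y, w_x)) % 360.0

# Synthetic gyro reading at latitude 41.5 deg N with true heading 60 deg
lat = math.radians(41.5)
psi = math.radians(60.0)
w_x = EARTH_RATE * math.cos(lat) * math.cos(psi)
w_y = -EARTH_RATE * math.cos(lat) * math.sin(psi)
print(round(heading_from_gyro(w_x, w_y), 1))  # recovers 60.0
```

The horizontal signal here is only ~5.5e-5 rad/s at this latitude, which illustrates the article's point: a FOG can resolve it, but present-day MEMS gyro noise floors are far too high.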
The objective of this research was to evaluate and compare the perceived fatigue and usability of 3D user interfaces in and out of the water. VR in the water has several potential applications, such as aquatic physical rehabilitation, where patients typically stand waist- or shoulder-deep in a pool and perform exercises in the water. However, few works have developed waterproof VR/AR systems, and none have assessed fatigue, which has previously been shown to be a drawback of many 3D user interfaces above water. This research presents a novel prototype system for developing waterproof VR experiences and investigates the effect of submersion in water on fatigue as compared to above water. Using a classic selection and docking task, results suggest that being underwater had no significant effect on performance but did reduce perceived fatigue, which is important for aquatic rehabilitation. Previous 3D interaction methods once thought too fatiguing might still be viable in water.