-
Mixed reality (MR) interactions feature users interacting with a combination of virtual and physical components. Inspired by research investigating aspects associated with near-field interactions in augmented and virtual reality (AR and VR), we investigated how avatarization, the physicality of the interacting components, and the interaction technique used to manipulate a virtual object affected performance and perceptions of user experience in a mixed reality fundamentals-of-laparoscopy peg-transfer task, wherein users transferred a virtual ring from one peg to another over a number of trials. We employed a 3 (physicality of pegs) × 3 (augmented avatar representation) × 2 (interaction technique) multi-factorial design, manipulating the physicality of the pegs as a between-subjects factor and the type of augmented self-avatar representation and the type of interaction technique used for object manipulation as within-subjects factors. Results indicated that users were significantly more accurate when the pegs were virtual rather than physical because of the increased salience of the task-relevant visual information. From an avatar perspective, providing users with a reach-envelope-extending representation, though useful, was found to worsen performance, while co-located avatarization significantly improved performance. The choice of interaction technique for manipulating objects depends on whether accuracy or efficiency is the priority. Finally, the relationship between the avatar representation and the interaction technique dictates how usable mixed reality interactions are deemed to be.
Free, publicly accessible full text available May 1, 2025.
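As an illustrative sketch only, the 3 × 3 × 2 multi-factorial design described above can be enumerated by crossing the factor levels. The level names below are placeholders, not the authors' exact labels, except for the counts per factor, which come from the abstract.

```python
from itertools import product

# Factor levels (names are illustrative placeholders; only the counts
# 3 x 3 x 2 are taken from the study description).
peg_physicality = ["level_1", "level_2", "level_3"]            # between-subjects
avatar_representation = ["none", "co-located", "reach-extending"]  # within-subjects
interaction_technique = ["technique_a", "technique_b"]         # within-subjects

# Each participant experiences one peg-physicality level crossed with
# every avatar x technique combination.
within_cells = list(product(avatar_representation, interaction_technique))
total_cells = len(peg_physicality) * len(within_cells)
print(total_cells)  # 3 x 3 x 2 = 18 design cells
```

Because peg physicality is between-subjects, each participant covers only 6 of the 18 cells; the full 18-cell space is covered across the three participant groups.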
-
Active exploration in virtual reality (VR) involves users navigating immersive virtual environments, going from one place to another. While navigating, users often engage in secondary tasks that require attentional resources, as in the case of distracted driving. Inspired by research generally studying the effects of task demands on cybersickness (CS), we investigated how the attentional demands specifically associated with secondary tasks performed during exploration affect CS. Downstream of this, we studied how increased attentional demands from secondary tasks affect spatial memory and navigational performance. We discuss the results of a multi-factorial between-subjects study, manipulating a secondary task's demand across two levels and studying its effects on CS at two different sickness-inducing levels of an exploration experience. The secondary task's demand was manipulated by parametrically varying n in an aural n-back working memory task, and the provocativeness of the experience was manipulated by varying how frequently users experienced a yaw-rotational reorientation effect during the exploration. Results revealed that increases in the secondary task's demand increased sickness levels and produced a higher temporal onset rate, especially when the experience was not already highly sickening. Increased attentional demand from the secondary task also degraded navigational performance and spatial memory. Overall, increased demands from secondary tasks performed during navigation produce deleterious effects on the VR experience.
Free, publicly accessible full text available May 1, 2025.
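For readers unfamiliar with the n-back paradigm used here as the secondary task: a stimulus stream is presented and the participant must respond whenever the current stimulus matches the one presented n steps earlier, so demand scales with n. A minimal sketch (the letter stream and function below are illustrative, not the study's stimuli):

```python
def n_back_targets(stimuli, n):
    """Return the indices at which an n-back response is correct,
    i.e., where the current stimulus matches the one n steps earlier."""
    return [i for i in range(n, len(stimuli)) if stimuli[i] == stimuli[i - n]]

# Example: a 2-back task over an aural letter stream.
stream = list("ABABCAAC")
print(n_back_targets(stream, 2))  # -> [2, 3]
```

Raising n from 1 to 2 (the study varied n parametrically) forces the listener to hold more items in working memory, which is how the task's attentional demand was manipulated.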
-
A primary goal of the virtual reality (VR) community is to build fully immersive and presence-inducing environments with seamless and natural interactions. To reach this goal, researchers are investigating how to best use our hands directly to interact with a virtual environment via hand tracking. Most studies in this field require participants to perform repetitive tasks. In this article, we investigate whether the results of such studies translate into a real application and game-like experience. We designed a virtual escape room in which participants interact with various objects to gather clues and complete puzzles. In a between-subjects study, we examine the effects of two input modalities (controllers vs. hand tracking) and two grasping visualizations (continuously tracked hands vs. virtual hands that disappear when grasping) on ownership, realism, efficiency, enjoyment, and presence. Our results show that ownership, realism, enjoyment, and presence increased when using hand tracking compared to controllers. Visualizing the tracked hands during grasps led to higher ratings on one of our ownership questions and one of our enjoyment questions compared to having the virtual hands disappear during grasps, as is common in many applications. We also confirm some of the main results of two studies with a repetitive design in a more realistic gaming scenario that may be closer to a typical user experience.
-
In this work, we investigate the influence that audio and visual feedback have on a manipulation task in virtual reality (VR). Without the tactile feedback of a controller, grasping virtual objects using one's hands can result in slower interactions because it may be unclear to the user that a grasp has occurred. Providing alternative feedback, such as visual or audio cues, may lead to faster and more precise interactions, but might also affect user preference and the perceived ownership of the virtual hands. In this study, we test four feedback conditions for virtual grasping. Three of the conditions provide feedback when a grasp or release occurs, either visual, audio, or both, and one provides no feedback for these occurrences. We analyze the effect each feedback condition has on interaction performance, measure its effect on the perceived ownership of the virtual hands, and gauge user preference. In an experiment, users perform a pick-and-place task with each feedback condition. We found that audio feedback for grasping is preferred over visual feedback even though it seems to decrease grasping performance, and we found little to no difference in ownership between our conditions.
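The four feedback conditions amount to a mapping from condition to the set of cues fired on each grasp or release event. The sketch below is a hypothetical illustration of that structure; the condition names and the cue-dispatch callbacks are assumptions, not the study's implementation.

```python
# Hypothetical sketch of the four feedback conditions for virtual grasping.
# Each condition names the cues that fire on a grasp or release event.
CONDITIONS = {
    "none":   set(),
    "visual": {"visual"},
    "audio":  {"audio"},
    "both":   {"visual", "audio"},
}

def on_grasp_event(condition, event, play_sound, flash_hand):
    """Dispatch feedback cues when a grasp/release event occurs.

    play_sound and flash_hand are caller-supplied callbacks (illustrative);
    returns the list of cues that fired, for inspection.
    """
    cues = CONDITIONS[condition]
    fired = []
    if "audio" in cues:
        play_sound(event)
        fired.append("audio")
    if "visual" in cues:
        flash_hand(event)
        fired.append("visual")
    return fired

# Usage: in the "both" condition, a grasp fires both cues.
fired = on_grasp_event("both", "grasp", lambda e: None, lambda e: None)
print(sorted(fired))  # -> ['audio', 'visual']
```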
-
In this work, we investigate the influence of different visualizations on a manipulation task in virtual reality (VR). Without the haptic feedback of the real world, grasping in VR may result in intersections with virtual objects. Because people are highly sensitive to perceived collisions, it may look more appealing to avoid intersections and visualize non-colliding hand motions. However, correcting the position of the hand or fingers introduces a visual-proprioceptive discrepancy and must be used with caution. Furthermore, the lack of haptic feedback in the virtual world may result in slower actions, as a user might not know exactly when a grasp has occurred. This reduced performance could be remedied with adequate visual feedback. In this study, we analyze the performance, level of ownership, and user preference of eight different visual feedback techniques for virtual grasping. Three techniques show the tracked hand (with or without grasping feedback), even if it intersects with the grasped object. Another three techniques display a hand without intersections with the object, called the outer hand, simulating the look of a real-world interaction. One visualization is a compromise between the two groups, showing both a primary outer hand and a secondary tracked hand. Finally, in the last visualization the hand disappears during the grasping activity. In an experiment, users perform a pick-and-place task with each feedback technique. We use high-fidelity marker-based hand tracking to control the virtual hands in real time. We found that the tracked-hand visualizations result in better performance; however, the outer-hand visualizations were preferred. We also found indications that ownership is higher with the outer-hand visualizations.
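The core of an "outer hand" visualization is a correction that keeps the displayed hand outside the grasped object while the tracked hand may penetrate it. A minimal sketch under simplifying assumptions (the object is modeled as a sphere, and the projection math is illustrative, not the authors' method):

```python
import math

def outer_hand_position(tracked, center, radius):
    """Displayed position for an 'outer hand' visualization.

    If the tracked hand point is inside a sphere (center, radius) modeling
    the grasped object, push the displayed point out to the surface; this
    avoids visible interpenetration at the cost of a small
    visual-proprioceptive offset. Otherwise show the tracked point as-is.
    """
    dx = [t - c for t, c in zip(tracked, center)]
    dist = math.sqrt(sum(d * d for d in dx))
    if dist >= radius or dist == 0.0:
        return list(tracked)            # no intersection: show tracked hand
    scale = radius / dist               # project onto the sphere surface
    return [c + d * scale for c, d in zip(center, dx)]

# Usage: a hand point halfway inside a unit sphere is pushed to the surface.
print(outer_hand_position((0.5, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0))  # -> [1.0, 0.0, 0.0]
```

The tracked-hand visualizations in the study skip this correction entirely, which is one way to read the performance-versus-preference trade-off reported above: no offset (better performance) versus no interpenetration (preferred look).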