There is a lack of datasets for visual-inertial odometry applications in Extended Reality (XR). To the best of our knowledge, no available dataset is captured from an XR headset with a human as the carrier. To bridge this gap, we present HoloSet, a novel pose estimation dataset collected using the Microsoft HoloLens 2, a state-of-the-art head-mounted device for XR. Potential applications for HoloSet include visual-inertial odometry, simultaneous localization and mapping (SLAM), and other XR applications that leverage visual-inertial data. HoloSet captures both macro and micro movements. For macro movements, the dataset consists of more than 66,000 samples of visual, inertial, and depth camera data in a variety of environments (indoor, outdoor) and scene setups (trails, suburbs, downtown) under multiple user action scenarios (walk, jog). For micro movements, the dataset consists of more than 12,000 additional samples of articulated hand depth camera images captured while a user plays games that exercise fine motor skills and hand-eye coordination. We present basic visualizations and high-level statistics of the data and outline potential research use cases for HoloSet.
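The abstract does not specify HoloSet's on-disk layout, so the following is only a minimal sketch of how visual-inertial samples from such a dataset might be loaded for an odometry pipeline. The CSV column names (timestamp, ax..gz) and the timestamp-named image files are assumptions for illustration, not the dataset's documented schema.

```python
import csv
from pathlib import Path

def load_imu_samples(csv_path):
    """Read IMU samples from a CSV file with hypothetical columns
    timestamp, ax, ay, az, gx, gy, gz (accelerometer + gyroscope)."""
    samples = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            samples.append({
                "t": float(row["timestamp"]),
                "accel": (float(row["ax"]), float(row["ay"]), float(row["az"])),
                "gyro": (float(row["gx"]), float(row["gy"]), float(row["gz"])),
            })
    return samples

def list_camera_frames(image_dir):
    """Return camera frames sorted by a timestamp assumed to be encoded
    in each filename, e.g. 1633024800123.png."""
    return sorted(Path(image_dir).glob("*.png"), key=lambda p: int(p.stem))

# Hypothetical usage: pair each frame with the nearest-in-time IMU sample.
# imu = load_imu_samples("imu.csv")
# for frame in list_camera_frames("frames/"):
#     t = int(frame.stem) / 1e3  # assumed milliseconds
#     nearest = min(imu, key=lambda s: abs(s["t"] - t))
```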
Quantifying the Impact of XR Visual Guidance on User Performance Using a Large-Scale Virtual Assembly Experiment
The combination of Visual Guidance and Extended Reality (XR) technology holds the potential to greatly improve the performance of human workforces in numerous areas, particularly industrial environments. Focusing on virtual assembly tasks and making use of different forms of supportive visualisations, this study investigates the potential of XR Visual Guidance. Set in a web-based immersive environment, our results draw from a heterogeneous pool of 199 participants. This research is designed to differ significantly from previous exploratory studies, which yielded conflicting results on user performance and associated human factors. Our results clearly show the advantages of XR Visual Guidance, with reductions of over 50% in task completion times and mistakes made; these gains may be further enhanced and refined using specific frameworks and other forms of visualisations/Visual Guidance. Discussing the role of other factors, such as cognitive load, motivation, and usability, this paper also seeks to provide concrete avenues for future research and practical takeaways for practitioners.
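The paper's own statistical analysis is not reproduced here; the sketch below only illustrates how a between-group comparison of task completion times (control vs. XR Visual Guidance) and the relative reduction could be computed. Welch's t-test is an assumption chosen for illustration, not necessarily the test used in the study.

```python
import numpy as np
from scipy import stats

def compare_completion_times(control_times, guidance_times):
    """Relative reduction in mean completion time plus a Welch's t-test
    for the difference between two independent groups."""
    control = np.asarray(control_times, dtype=float)
    guidance = np.asarray(guidance_times, dtype=float)
    reduction = 1.0 - guidance.mean() / control.mean()
    t_stat, p_value = stats.ttest_ind(guidance, control, equal_var=False)
    return reduction, t_stat, p_value

# Hypothetical usage with measured times in seconds:
# reduction, t, p = compare_completion_times(control_times, guidance_times)
# print(f"Completion time reduced by {reduction:.1%} (t = {t:.2f}, p = {p:.4f})")
```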
- Award ID(s): 2107328
- PAR ID: 10435040
- Date Published:
- Journal Name: IEEE Transactions on Visualization and Computer Graphics
- ISSN: 1941-0506
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
This alternative format session provides a forum for human factors scholars and practitioners to showcase how state-of-the-art extended reality (XR) applications are being used in academia, defense, and industry to address human factors research. The session will begin with short introductions from each presenter to describe their XR application. Afterward, session attendees will engage with the presenters and their demonstrations, which will be set up around the demonstration floor room. This year's showcase features XR applications in STEM education, medical and aviation training, agricultural data visualization, homeland security, training design, and visitor engagement in informal learning settings. Our goal is for attendees to experience how human factors professionals use XR to support human factors-oriented research and to learn about the exciting work being conducted with these emerging technologies.
Virtual reality (VR) simulations have been adopted to provide controllable environments for running augmented reality (AR) experiments in diverse scenarios. However, insufficient research has explored the impact of AR applications on users, especially their attention patterns, and whether VR simulations accurately replicate these effects. In this work, we propose to analyze user attention patterns via eye tracking during XR usage. To represent applications that provide both helpful guidance and irrelevant information, we built a Sudoku Helper app that includes visual hints and potential distractions during the puzzle-solving period. We conducted two user studies with 19 different users each in AR and VR, in which we collected eye tracking data, performed gaze-based analysis, and trained machine learning (ML) models to predict user attentional states and attention control ability. Our results show that the AR app had a statistically significant impact on enhancing attention by increasing the fixated proportion of time, while the VR app reduced fixated time and made the users less focused. Results indicate that there is a discrepancy between VR simulations and the AR experience. Our ML models achieve 99.3% and 96.3% accuracy in predicting user attention control ability in AR and VR, respectively. A noticeable performance drop when transferring models trained on one medium to the other further highlights the gap between the AR experience and the VR simulation of it.
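The abstract reports an increased "fixated proportion of time" in AR. As a minimal sketch under stated assumptions, the helper below computes that proportion from per-sample fixation labels; the labeling scheme (e.g. a velocity- or dispersion-threshold gaze-event detector) is an assumption, since the abstract does not describe the detection method.

```python
import numpy as np

def fixation_proportion(timestamps, is_fixation):
    """Fraction of session time spent in fixations.

    timestamps: per-sample gaze timestamps in seconds.
    is_fixation: boolean label per sample from some gaze-event detector
    (the detector itself is an assumption, not described in the abstract).
    """
    t = np.asarray(timestamps, dtype=float)
    fix = np.asarray(is_fixation, dtype=bool)
    dt = np.diff(t, append=t[-1])  # time attributed to each sample; the last sample gets 0
    return float(dt[fix].sum() / dt.sum())
```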
The Human Factors Extended Reality (XR) Showcase is an annual, interactive, hands-on demonstration session of XR technologies and applications. Attendees can walk to different stations to experience the applications while the presenter explains. The 12 interactive demonstration stations highlight the integration of XR technologies with other technologies, including haptic devices and artificial intelligence, to enable human factors research and applications that span training, learning, research assessment, and simulations. Aligned with the mission of HFES, the purpose of the XR Showcase is to enable individuals to acquire knowledge about XR applications through interactive demonstrations, increase exposure of XR to the HFES community, support content visualization of interdisciplinary research, and create an exchange forum to support communication and collaboration.
With innovations in gaze and eye tracking, a new concentration of research on gaze-tracked systems and user interfaces has formed in the field of Extended Reality (XR). Eye trackers are being used to explore novel forms of spatial human–computer interaction, to understand human attention and behavior, and to test expectations and human responses. In this article, we review gaze interaction and eye tracking research related to XR published since 1985, a total of 215 publications. We outline efforts to apply eye gaze for direct interaction with virtual content and for the design of attentive interfaces that adapt the presented content based on gaze behavior, and we discuss how eye gaze has been utilized to improve collaboration in XR. We outline trends and novel directions and discuss representative high-impact papers in detail.