There is a lack of datasets for visual-inertial odometry applications in Extended Reality (XR). To the best of our knowledge, no available dataset is captured from an XR headset with a human as the carrier. To bridge this gap, we present HoloSet, a novel pose estimation dataset collected using the Microsoft HoloLens 2, a state-of-the-art head-mounted device for XR. Potential applications for HoloSet include visual-inertial odometry, simultaneous localization and mapping (SLAM), and other XR applications that leverage visual-inertial data. HoloSet captures both macro and micro movements. For macro movements, the dataset consists of more than 66,000 samples of visual, inertial, and depth camera data across a variety of environments (indoor, outdoor) and scene setups (trails, suburbs, downtown) under multiple user action scenarios (walk, jog). For micro movements, the dataset consists of more than 12,000 additional samples of articulated hand depth camera images captured while a user plays games that exercise fine motor skills and hand-eye coordination. We present basic visualizations and high-level statistics of the data and outline potential research use cases for HoloSet.
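As an illustration of how synchronized visual-inertial samples of this kind might be handled, the sketch below defines a minimal record type and a helper that orders samples chronologically before they are fed to a VIO pipeline. The field names and layout are assumptions for illustration only, not HoloSet's actual schema.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical record layout for one synchronized visual-inertial sample.
# Field names (timestamp_ns, accel, gyro, ...) are illustrative assumptions,
# not the dataset's real on-disk format.
@dataclass
class VioSample:
    timestamp_ns: int                      # capture time in nanoseconds
    accel: Tuple[float, float, float]      # accelerometer reading (m/s^2)
    gyro: Tuple[float, float, float]       # gyroscope reading (rad/s)
    rgb_path: str                          # path to the visual (RGB) frame
    depth_path: str                        # path to the depth frame

def sort_by_time(samples: List[VioSample]) -> List[VioSample]:
    """Order samples chronologically, as a VIO pipeline would expect."""
    return sorted(samples, key=lambda s: s.timestamp_ns)
```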
This content will become publicly available on October 21, 2025
An XR GUI for Visualizing Messages in ECS Architectures
Entity–Component–System (ECS) architectures are fundamental to many systems for developing extended reality (XR) applications. These applications often contain complex scenes and intricately connected application logic, making debugging and analysis difficult. Graph-based tools have been created to show actions in ECS-based scene hierarchies, but few address interactions that go beyond traditional hierarchical communication. To address this, we present an XR GUI for Mercury (a toolkit for cross-component ECS communication) that allows developers to view and edit relationships and interactions between scene entities in Mercury.
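To make the ECS pattern concrete, here is a minimal sketch: entities are bare IDs, components are data stored per component type, and a publish/subscribe bus carries messages between entities outside the scene hierarchy, in the spirit of cross-component communication. The class and method names are illustrative assumptions, not Mercury's actual API.

```python
from collections import defaultdict

class World:
    """Toy ECS world: entities are integer IDs, components are plain dicts."""

    def __init__(self):
        self.next_id = 0
        self.components = defaultdict(dict)   # component type -> {entity: data}
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def create_entity(self) -> int:
        self.next_id += 1
        return self.next_id

    def add_component(self, entity: int, ctype: str, data: dict) -> None:
        self.components[ctype][entity] = data

    def subscribe(self, topic: str, callback) -> None:
        self.subscribers[topic].append(callback)

    def publish(self, topic: str, message: dict) -> None:
        # Deliver the message to every subscriber, regardless of where the
        # sender and receivers sit in the scene hierarchy.
        for cb in self.subscribers[topic]:
            cb(message)
```

A debugging GUI like the one described above would visualize exactly this kind of topic-to-subscriber wiring, which is invisible in a purely hierarchical scene view.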
- Award ID(s): 2037101
- PAR ID: 10563417
- Publisher / Repository: IEEE International Symposium on Mixed and Augmented Reality Adjunct 2024
- Date Published:
- ISBN: 979-8-3315-0691-9
- Page Range / eLocation ID: 640 to 641
- Subject(s) / Keyword(s): Visualization → Visualization systems and tools → Visualization toolkits; Human–computer interaction (HCI) → Interaction paradigms → Virtual reality
- Format(s): Medium: X
- Location: Bellevue, WA, USA
- Sponsoring Org: National Science Foundation
More Like this
-
Many have predicted that the future of the Web will be the integration of Web content with the real world through technologies such as Augmented Reality (AR). This has led to the rise of Extended Reality (XR) Web browsers, which shorten the long AR application development and deployment cycle of native applications, especially across different platforms. As XR browsers mature, we face new challenges related to collaborative and multi-user applications that span users, devices, and machines. These collaborative XR applications require: (1) networking support for scaling to many users, (2) mechanisms for content access control and application isolation, and (3) the ability to host application logic near clients or data sources to reduce application latency. In this paper, we present the design and evaluation of the AR Edge Networking Architecture (ARENA), a platform that simplifies building and hosting collaborative XR applications on WebXR-capable browsers. ARENA provides a number of critical components, including: a hierarchical geospatial directory service that connects users to nearby servers and content, a token-based authentication system for controlling user access to content, and an application/service runtime supervisor that can dispatch programs across any network-connected device. All content within ARENA exists as endpoints in a PubSub scene graph model that is synchronized across all users. We evaluate ARENA in terms of client performance and benchmark end-to-end response time as load on the system scales. We show that the system scales horizontally to Internet scale, with scenes containing hundreds of users and latencies on the order of tens of milliseconds. Finally, we highlight projects built using ARENA and showcase how our approach dramatically simplifies collaborative multi-user XR development compared to monolithic approaches.
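The idea of a PubSub-synchronized scene graph can be sketched in a few lines: each scene object's state lives under a topic, and every connected client mirrors the updates it receives so all replicas converge. This is a minimal in-memory sketch under assumed names; topic strings and message shapes here are illustrative, not ARENA's actual wire format.

```python
class SceneBroker:
    """Toy broker that fans scene updates out to all connected clients."""

    def __init__(self):
        self.clients = []

    def connect(self, client) -> None:
        self.clients.append(client)

    def publish(self, topic: str, update: dict) -> None:
        for client in self.clients:
            client.on_update(topic, update)

class SceneClient:
    """One user's local replica of the shared scene graph."""

    def __init__(self, broker: SceneBroker):
        self.scene = {}        # topic -> object state
        broker.connect(self)

    def on_update(self, topic: str, update: dict) -> None:
        # Merge the update so every replica converges to the same state.
        self.scene.setdefault(topic, {}).update(update)
```

In a real deployment the broker would be a networked PubSub service rather than an in-process list, but the convergence logic per client is the same shape.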
-
In recent years, aiming to enhance and extend user experiences beyond the real world, Extended Reality (XR) has emerged as a new paradigm that enables a plethora of applications [1], e.g., online gaming, online conferencing, and social media. XR refers to human-machine interactions that combine real and virtual environments with the support of computing/communications technologies and wearable devices. XR content is generated by providers or other users and includes audio, video, and other metadata. In general, the generated XR content is transmitted to XR devices and rendered into XR scenes (i.e., an image is generated from a 2D or 3D model by means of a computer program), where users experience a hybrid of the real and virtual worlds.
-
This alternative format session provides a forum for human factors scholars and practitioners to showcase how state-of-the-art extended reality (XR) applications are being used in academia, defense, and industry to address human factors research. The session will begin with short introductions from each presenter to describe their XR application. Afterward, session attendees will engage with the presenters and their demonstrations, which will be set up around the demonstration floor room. This year’s showcase features XR applications in STEM education, medical and aviation training, agricultural data visualization, homeland security, training design, and visitor engagement in informal learning settings. Our goal is for attendees to experience how human factors professionals use XR to support human factors-oriented research and to learn about the exciting work being conducted with these emerging technologies.
-
COVID underscores the potential of VR meeting tools to compensate for the lack of embodied communication in applications like Zoom. But both research and commercial VR meeting environments typically seek to approximate physical meetings, instead of exploring new capacities of communication and coordination. We argue the most transformative features of VR (and XR more broadly) may look and feel very different from familiar social rituals of physical meetings. Embracing “weird” forms of sociality and embodiment, we incorporate inspiration from a range of sources, including: (1) emerging rituals in commercial social VR, (2) existing research on social augmentation systems for meetings, (3) novel examples of embodied VR communication, and (4) a fictionalized vignette envisioning a future with aspects of “Weird Social XR” folded into everyday life. We call upon the research community to approach these speculative forms of alien sociality as opportunities to explore new kinds of social superpowers.