This content will become publicly available on April 25, 2026

Title: A design toolkit for task support with mixed reality and artificial intelligence
Efficient performance and acquisition of physical skills, from sports techniques to surgical procedures, require instruction and feedback. In the absence of a human expert, Mixed Reality Intelligent Task Support (MixITS) can offer a promising alternative. These systems integrate Artificial Intelligence (AI) and Mixed Reality (MR) to provide real-time feedback and instruction as users practice and learn skills with physical tools and objects. However, designing MixITS systems presents challenges beyond engineering complexities: the complex interactions among users, AI, MR interfaces, and the physical environment create unique design obstacles. To address these challenges, we present MixITS-Kit, an interaction design toolkit derived from our analysis of MixITS prototypes developed by eight student teams during a 10-week graduate course. The toolkit comprises design considerations, design patterns, and an interaction canvas. Our evaluation suggests that the toolkit can serve as a valuable resource both for novice practitioners designing MixITS systems and for researchers developing new tools for human-AI interaction design.
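The abstract sketches the general architecture of a MixITS system: an AI component perceives the state of a physical task while an MR interface delivers real-time instruction and feedback. As a purely illustrative outline (not code from the paper; every name here is a hypothetical placeholder), the following Python sketch shows one way such a perceive-check-render loop could be organized.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class TaskStep:
    """One step of a physical procedure the learner should perform."""
    name: str
    instruction: str                  # overlay text shown in the MR display
    is_done: Callable[[dict], bool]   # judges completion from the perceived state

def run_task_support(steps: list[TaskStep],
                     perceive: Callable[[], dict],
                     render: Callable[[str], None]) -> None:
    """Minimal feedback loop: show the current instruction, poll the AI's
    perception of the physical scene, and advance when the step is complete."""
    for step in steps:
        render(step.instruction)
        state = perceive()            # e.g., hand pose or tool position from a vision model
        while not step.is_done(state):
            # A fuller system would also derive corrective feedback from `state` here.
            state = perceive()
        render(f"Step '{step.name}' complete.")
```

In a real MixITS prototype, `perceive` would wrap a sensing or vision model and `render` would drive the MR headset's UI; both are stand-ins here.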
Award ID(s): 2240133
PAR ID: 10617345
Author(s) / Creator(s): ; ;
Publisher / Repository: Frontiers in Virtual Reality
Date Published:
Journal Name: Frontiers in Virtual Reality
Volume: 6
ISSN: 2673-4192
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Research has identified applications of handheld-based VR, which uses handheld displays or mobile devices, for building systems that involve users in mixed reality (MR) without the need for head-worn displays (HWDs). Such systems can potentially accommodate large groups of users participating in MR. However, we lack an understanding of how group size and interaction method affect the user experience. In this paper, we aim to advance our understanding of handheld-based MR in the context of multiplayer, co-located games. We conducted a study (N = 38) to understand how user experiences vary by group size (2, 4, and 8) and interaction method (proximity-based or pointing-based). For our experiment, we implemented a multiuser experience for up to ten users. We found that proximity-based interaction that encouraged dynamic movement positively affected social presence and physical/temporal workload. In larger group settings, participants felt less challenged and less positive. Individuals had varying preferences for group size and interaction type. The findings of this study advance our understanding of the design space for handheld-based MR in terms of group sizes and interaction schemes. To make our contributions explicit, we conclude the paper with design implications that can inform user experience design in handheld-based mixed reality contexts.
  2. Recent innovations in virtual and mixed reality (VR/MR) technologies have enabled novel hands-on training applications in high-risk, high-value fields such as medicine, flight, and worker safety. Here, we present a detailed description of a novel VR/MR tactile user interface (TUI) hardware and software development framework that enables the rapid, cost-effective, no-code development, optimization, and distribution of fully authentic hands-on VR/MR laboratory training experiences in the physical and life sciences. We applied our framework to the development and optimization of an introductory pipette calibration activity that is often carried out in real chemistry and biochemistry labs. Our approach provides users with nuanced real-time feedback on both their psychomotor skills during data acquisition and their attention to detail when conducting data analysis procedures. The cost-effectiveness of our approach relative to traditional face-to-face science labs improves access to quality hands-on science lab experiences. Importantly, the no-code nature of this Hands-On Virtual-Reality (HOVR) Lab platform enables faculty to iteratively optimize VR/MR experiences to meet their students' targeted needs without costly software development cycles. Our platform also accommodates TUIs using either standard virtual-reality controllers (VR TUI mode) or fully functional hand-held physical lab tools (MR TUI mode). In the latter case, physical lab tools are strategically retrofitted with optical tracking markers to enable tactile, experimental, and analytical authenticity in scientific experimentation. Preliminary user-study data highlight the strengths and weaknesses of our generalized approach with respect to affective and cognitive student learning outcomes.
  3. Fluid power systems can be expensive and difficult to access, making it challenging to provide hands-on training. This work discusses the incorporation of Mixed Reality (MR) technology in fluid power applications to provide a virtual training environment that simulates the behavior of fluid power systems, allowing users to receive immediate feedback on system performance. Mixed reality is a digital technology that integrates a virtual environment with the real world by using real-world sensor data and computer models. This technology allows running simulations that examine the complexity of highly coupled systems, producing new digital environments where physical and digital elements interact in real time. With these features, MR technology can be a practical training tool for running virtual simulations that mimic real-life industry settings. It can provide the user with a virtual training environment, thus preparing the next generation of fluid power engineers and specialists. Throughout this work, we present the development and capabilities of a digitized virtual copy of a hydraulic excavator's arm in an MR environment as a proof of concept. The MR arm module is developed and deployed using Microsoft's Mixed Reality Toolkit (MRTK) for Unity through the HoloLens 2 MR headset. The MR development involves generating virtual copies of the mechanical and hydraulic subsystems, conducting the virtual assembly, and creating a user interface in the MR environment to visualize and interact with the model. The developed MR module enables visualizing the excavator's internal structure, conducting the virtual assembly, and running virtual simulations, all of which assist in training future fluid power operators. It is an effective training tool that helps train junior engineers and technicians while cutting down on cost and time.
  4. Digital hydraulics is a discrete technology that integrates advanced dynamic system controls, digital electronics, and machine learning to enhance fluid power systems' performance, overall efficiency, and controllability. A mechanically actuated, inline, three-piston, variable-displacement digital pump was previously proposed and designed. The inline three-piston pump incorporates complex mechanical and hydraulic subsystems and highly coupled mechanisms. The complexity of these subsystems poses challenges when assessing the viability of the conceptual design. Therefore, this work focuses on designing, developing, and implementing a collaborative virtual platform with a digitized module showcasing the internal mechanical structure of the digital pump using mixed reality (MR) technology. MR technology is regarded as the forthcoming evolution of the human–machine interface between real and virtual environments, using computers and wearables. This technology permits running simulations that examine the complexity of highly coupled systems, like the digital pump, whose physical behavior is otherwise too intricate to grasp. The developed MR platform permits multiple users to collaborate in a synchronized, immersive MR environment to study and analyze the applicability of the pump's design and the adequacy of the operated mechanisms. The collaborative MR platform was designed and developed in the Unity game engine, employing Microsoft Azure and Photon Unity Networking to set up the synchronized MR environment. The platform involves a fully interactive virtual module on the digital pump design, developed in multiple stages using Microsoft's Mixed Reality Toolkit (MRTK) for Unity and deployed in the synchronized MR environment through a HoloLens 2 MR headset. A research study involving 71 participants was carried out at Purdue University. The study's objective was to explore the impact of the collaborative MR environment on understanding the complexity and operation of the digital pump. It also sought to assess the effectiveness of MR in facilitating collaboration among fluid power stakeholders in a synchronized digital-reality setting to study, diagnose, and control their complex systems. Surveys were designed and completed by all 71 participants after experiencing the MR platform. The results indicate that approximately 75% of the participants expressed positive attitudes toward their overall MR platform experience, with particular appreciation for its immersive nature and the synchronized collaborative environment it provided. More than 70% of the participants agreed that the collaborative MR platform was essential for studying and understanding the complexity and intricacy of the digital pump's mechanical structure. Overall, the results demonstrate that the MR platform effectively facilitates visualizing the pump's complex internal structure, inspecting the assembly of each of the involved subsystems, and testing the applicability of the complicated mechanisms.
  5. Recent immersive mixed reality (MR) and virtual reality (VR) displays enable users to use their hands to interact with both veridical and virtual environments simultaneously. It is therefore important to understand the performance of human hand-reaching movements in MR. Studies have shown that different virtual-environment visualization modalities can affect point-to-point reaching performance with a stylus, but it is not yet known whether these effects translate to direct human-hand interactions in mixed reality. This paper evaluates human point-to-point motor performance in MR and VR for both finger-pointing and cup-placement tasks. Six performance measures relevant to haptic interface design were recorded for both tasks under several visualization conditions (“MR with indicator,” “MR without indicator,” and “VR”) to determine which factors contribute to hand-reaching performance (an illustrative sketch of how such measures are commonly computed follows this list). A key finding was evidence of a trade-off between reaching “motion confidence” measures (indicated by throughput, number of corrective movements, and peak velocity) and “accuracy” measures (indicated by end-point error and initial movement error). Specifically, we observed that participants tended to be more confident in the “MR without indicator” condition for finger-pointing tasks. These results contribute critical knowledge to inform the design of VR/MR interfaces based on an application's user performance requirements.
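The measures named in item 5 are standard in reaching and target-acquisition studies, but the abstract does not give formulas. The sketch below therefore uses common textbook definitions, including a Fitts-style throughput (index of difficulty divided by movement time) and a rough peak-count proxy for corrective movements; the function name, arguments, and exact definitions are our assumptions, not necessarily those of the cited study.

```python
import numpy as np

def reaching_measures(t, pos, target, target_width):
    """Illustrative point-to-point reaching measures for one trial.

    t            : (N,) sample times in seconds
    pos          : (N, 3) hand or fingertip positions in meters
    target       : (3,) target position in meters
    target_width : target diameter in meters
    """
    t = np.asarray(t, dtype=float)
    pos = np.asarray(pos, dtype=float)
    target = np.asarray(target, dtype=float)

    movement_time = t[-1] - t[0]                                    # s
    speed = np.linalg.norm(np.diff(pos, axis=0), axis=1) / np.diff(t)
    peak_velocity = speed.max()                                     # m/s
    end_point_error = np.linalg.norm(pos[-1] - target)              # m

    # Fitts-style throughput: index of difficulty over movement time.
    distance = np.linalg.norm(target - pos[0])
    index_of_difficulty = np.log2(distance / target_width + 1.0)    # bits
    throughput = index_of_difficulty / movement_time                # bits/s

    # Rough proxy for corrective movements: extra local peaks in the speed
    # profile beyond the main one suggest corrective submovements.
    peaks = np.flatnonzero((speed[1:-1] > speed[:-2]) & (speed[1:-1] > speed[2:]))
    corrective_movements = max(len(peaks) - 1, 0)

    return {
        "movement_time_s": movement_time,
        "peak_velocity_m_per_s": peak_velocity,
        "end_point_error_m": end_point_error,
        "throughput_bits_per_s": throughput,
        "corrective_movements": corrective_movements,
    }
```

The study also reports an initial movement error, typically the deviation of the early movement direction from the straight line to the target; it is omitted here to keep the sketch short.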