Title: Ani-Bot: A Modular Robotics System Supporting Creation, Tweaking, and Usage with Mixed-Reality Interactions
Ani-Bot is a modular robotics system that allows users to control their DIY robots using Mixed-Reality Interaction (MRI). This system takes advantage of MRI to enable users to visually program the robot through the augmented view of a Head-Mounted Display (HMD). In this paper, we first explain the design of the Mixed-Reality (MR) ready modular robotics system, which allows users to instantly perform MRI once they finish assembling the robot. Then, we elaborate on the augmentations provided by the MR system in the three primary phases of a construction kit's lifecycle: Creation, Tweaking, and Usage. Finally, we demonstrate Ani-Bot with four application examples and evaluate the system with a two-session user study. The results of our evaluation indicate that Ani-Bot successfully embeds MRI into the lifecycle (Creation, Tweaking, Usage) of DIY robotics and shows strong potential for delivering an enhanced user experience.
Award ID(s):
1632154
PAR ID:
10201789
Journal Name:
TEI '18: Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18)
Page Range / eLocation ID:
419 to 428
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We present Ani-Bot, a modular robotics system that allows users to construct Do-It-Yourself (DIY) robots and use a mixed-reality approach to interact with them. Ani-Bot enables a novel user experience by embedding Mixed-Reality Interaction (MRI) in the three phases of interacting with a modular construction kit, namely, Creation, Tweaking, and Usage. In this paper, we first present the system design that allows users to instantly perform MRI once they finish assembling the robot. Further, we discuss the augmentations offered by MRI in each of the three phases.
  2. Abstract: Fluid power systems can be expensive and difficult to access, making it challenging to provide hands-on training. This work discusses the incorporation of Mixed Reality (MR) technology in fluid power applications to provide a virtual training environment that simulates the behavior of fluid power systems, allowing users to receive immediate feedback on the system's performance. Mixed reality is a digital technology that integrates a virtual environment with the real world by utilizing real-world sensor data and computer models. This technology allows running simulations that examine the complexity of highly coupled systems, producing new digital environments where physical and digital elements can interact in real time. With all these features, MR technology can be a practical training tool for running virtual simulations that mimic real-life industry settings. It can provide users with a virtual training environment, thus preparing the next generation of fluid power engineers and specialists. Throughout this work, we present the development and capabilities of a digitized virtual copy of a hydraulic excavator's arm in an MR environment as a proof of concept. The MR arm module is developed and deployed using Microsoft's Mixed Reality Toolkit (MRTK) for Unity through the HoloLens 2 MR headset. The MR development involves generating virtual copies of the mechanical and hydraulic subsystems, conducting the virtual assembly, and creating a user interface in the MR environment to visualize and interact with the model. The developed MR module enables visualizing the excavator's internal structure, conducting the virtual assembly, and running virtual simulations, all of which assist in training future fluid power operators. It is an effective training tool that helps train junior engineers and technicians, cutting down on cost and time.
  3. Research has identified applications of handheld-based VR, which utilizes handheld displays or mobile devices, for developing systems that involve users in mixed reality (MR) without the need for head-worn displays (HWDs). Such systems can potentially accommodate large groups of users participating in MR. However, we lack an understanding of how group sizes and interaction methods affect the user experience. In this paper, we aim to advance our understanding of handheld-based MR in the context of multiplayer, co-located games. We conducted a study (N = 38) to understand how user experiences vary by group size (2, 4, and 8) and interaction method (proximity-based or pointing-based). For our experiment, we implemented a multiuser experience for up to ten users. We found that proximity-based interaction that encouraged dynamic movement positively affected social presence and physical/temporal workload. In bigger group settings, participants felt less challenged and less positive. Individuals had varying preferences for group size and interaction type. The findings of the study will advance our understanding of the design space for handheld-based MR in terms of group sizes and interaction schemes. To make our contributions explicit, we conclude our paper with design implications that can inform user experience design in handheld-based mixed reality contexts. 
  4. We investigate the effectiveness of robot-generated mixed reality gestures. Our findings demonstrate how these gestures increase user effectiveness by decreasing user response time, and that robots can pair long referring expressions with mixed reality gestures without cognitively overloading users. 
  5. We present the design of a mixed reality (MR) telehealth training system that aims to close the gap between in-person and distance training and re-training for medical procedures. Our system uses real-time volumetric capture as a means for communicating and relating spatial information between the non-colocated trainee and instructor. The system's design is based on a requirements elicitation study performed in situ, at a medical school simulation training center. The focus is on the lightweight real-time transmission of volumetric data - meaning the use of consumer hardware, easy and quick deployment, and low-demand computations. We evaluate the MR system design by analyzing the workload for the users during medical training. We compare in-person, video, and MR training workloads. The results indicate that the overall workload for central line placement training with MR does not increase significantly compared to video communication. Our work shows that, when designed strategically together with domain experts, an MR communication system can be used effectively for complex medical procedural training without increasing the overall workload for users significantly. Moreover, MR systems offer new opportunities for teaching due to spatial information, hand tracking, and augmented communication. 