Title: A Taxonomy for Selecting Wearable Input Devices for Mixed Reality
Composite wearable computers consist of multiple wearable devices connected and working as a cohesive whole. These composite wearable computers are promising for augmenting our interaction with physical, virtual, and mixed play spaces (e.g., mixed reality games). Yet little research has directly addressed how mixed reality system designers can select wearable input devices and how these devices can be assembled into a cohesive wearable computer. We present an initial taxonomy of wearable input devices to aid designers in deciding which devices to select and assemble to support different mixed reality systems. We undertook a grounded theory analysis of 84 different wearable input devices, resulting in a design taxonomy for composite wearable computers. The taxonomy consists of two axes: TYPE OF INTERACTIVITY and BODY LOCATION. These axes enable designers to identify which devices fill particular needs in the system development process and how these devices can be assembled into a cohesive wearable computer.
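To make the taxonomy concrete, the sketch below shows one way a designer might encode devices along the two axes and query a catalog for candidates. The axis values, device names, and catalog entries are illustrative assumptions, not the categories or the 84-device data set analyzed in the paper.

```python
# Illustrative sketch only: the axis values and device entries below are
# assumptions for demonstration, not the paper's actual coding of devices.
from dataclasses import dataclass
from enum import Enum, auto


class Interactivity(Enum):
    """Hypothetical values along the TYPE OF INTERACTIVITY axis."""
    TOUCH = auto()
    GESTURE = auto()
    VOICE = auto()


class BodyLocation(Enum):
    """Hypothetical values along the BODY LOCATION axis."""
    HEAD = auto()
    WRIST = auto()
    HAND = auto()


@dataclass(frozen=True)
class WearableDevice:
    name: str
    interactivity: Interactivity
    location: BodyLocation


# A toy catalog standing in for a larger survey of wearable input devices.
CATALOG = [
    WearableDevice("smartwatch", Interactivity.TOUCH, BodyLocation.WRIST),
    WearableDevice("data glove", Interactivity.GESTURE, BodyLocation.HAND),
    WearableDevice("headset microphone", Interactivity.VOICE, BodyLocation.HEAD),
]


def candidates(interactivity: Interactivity, location: BodyLocation):
    """Return devices that occupy the requested cell of the two-axis taxonomy."""
    return [d for d in CATALOG
            if d.interactivity is interactivity and d.location is location]


if __name__ == "__main__":
    for device in candidates(Interactivity.GESTURE, BodyLocation.HAND):
        print(device.name)
```

Keeping each axis as an explicit field makes it easy to spot empty cells, i.e., combinations of interactivity and body location for which no device is available.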
Award ID(s):
1651532 1619273
NSF-PAR ID:
10174278
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the 2019 ACM International Conference on Interactive Surfaces and Spaces
Page Range / eLocation ID:
403 - 408
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Composite wearable computers combine multiple wearable devices to form a cohesive whole. Designing these complex systems and integrating devices to effectively leverage their affordances is nontrivial. To inform the design of composite wearable computers, we undertook a grounded theory analysis of 84 wearable input devices drawing from 197 data sources, including technical specifications, research papers, and instructional videos. The resulting prescriptive design framework consists of four axes: type of interactivity, associated output modalities, mobility, and body location. This framework informs a composition-based approach to the design of wearable computers, enabling designers to identify which devices fill particular user needs and satisfy design constraints. Using this framework, designers can understand the relationship between the wearable, the user, and the environment, identify limitations in available wearable devices, and gain insight into how to address design challenges developers are likely to encounter. A minimal sketch of such a composition-based selection appears after this list.
  2. A primary component of disaster response is training. These educational exercises provide responders with the knowledge and skills needed to be prepared when disasters happen. However, traditional training methods, such as high-fidelity simulations (e.g., real-life drills) and classroom courses, may fall short of providing the effective and cost-efficient training needed for today's challenges. Advances in technology open a wide range of opportunities for training using computer-mediated simulations and exercises, including mixed reality games and wearable computers. Existing studies report on the usefulness of these technologies for training purposes. This review paper synthesizes prior research and development of disaster response simulations and identifies challenges, opportunities, and lessons learned. Through this review, we provide researchers and designers with an overview of current practices in designing training simulations and contribute practical insights into the design of future disaster response training.
  3. Autonomous robotic vehicles (i.e., drones) are potentially transformative for search and rescue (SAR). This paper works toward wearable interfaces through which humans team with multiple drones. We introduce the Virtual Drone Search Game as a first step in creating a mixed reality simulation for humans to practice drone teaming and SAR techniques. Our goals are to (1) evaluate input modalities for the drones, derived from an iterative narrowing of the design space, (2) improve our mixed reality system for designing input modalities and training operators, and (3) collect data on how participants socially experience the virtual drones with which they work. In our study, 17 participants played the game with two input modalities (Gesture condition, Tap condition) in counterbalanced order. Results indicated that participants performed best with the Gesture condition. Participants found the multiple controls challenging, and future studies might include more training with the devices and the game. Participants felt like a team with the drones and found them moderately agentic. In our future work, we will extend this testing to a more externally valid mixed reality game.
  4. Researchers, educators, and multimedia designers need to better understand how mixing physical tangible objects with virtual experiences affects learning and science identity. In this novel study, a 3D-printed tangible that is an accurate facsimile of the sort of expensive glassware that chemists use in real laboratories is tethered to a laptop with a digitized lesson. As interactive educational content is increasingly placed online, it is important to understand the educational boundary conditions associated with passive haptics and 3D-printed manipulables. Cost-effective printed objects would be particularly welcome in rural and low socioeconomic status (SES) classrooms. A Mixed Reality (MR) experience was created that used a physical 3D-printed haptic burette to control a computer-based chemistry titration experiment. This randomized control trial with 136 college students had two conditions: 1) low-embodied control (using keyboard arrows), and 2) high-embodied experimental (physically turning a valve/stopcock on the 3D-printed burette). Although both groups displayed similar significant gains on the declarative knowledge test, deeper analyses revealed nuanced Aptitude by Treatment Interactions (ATIs). These interactions favored the high-embodied experimental group that used the MR device, both for titration-specific posttest knowledge questions and for science efficacy and science identity. Students with higher prior science knowledge displayed higher titration knowledge scores after using the experimental 3D-printed haptic device. A multi-modal linguistic and gesture analysis revealed that during recall the experimental participants used the stopcock-turning gesture significantly more often, and their recalls produced a significantly different Epistemic Network Analysis (ENA). ENA is a type of 2D projection of the recall data; stronger connections were seen in the high-embodied group, mainly centering on the key hand-turning gesture. Instructors and designers should consider the multi-modal and multi-dimensional nature of the user interface, and how the addition of another sensory-based learning signal (haptics) might differentially affect students with lower prior knowledge. One hypothesis is that haptically manipulating novel devices during learning may create more cognitive load. For low prior knowledge students, it may be advantageous to begin learning content on a more ubiquitous interface (e.g., keyboard) before moving to more novel, multi-modal MR devices/interfaces.
  5. Wearable computers are poised to impact disaster response, so there is a need to determine the best interfaces to support situation awareness, decision support, and communication. We present a disaster response wearable design created for a mixed reality live-action role-playing design competition, the Icehouse Challenge. The challenge, an independent event in which the authors were competitors, offers a simulation game environment in which teams compete to test wearable designs. In this game, players move through a simulated disaster space that requires team coordination and physical exertion to mitigate virtual hazards and stabilize virtual victims. Our design was grounded in disaster response and team coordination practice. We present our design process for developing wearable computer interfaces that integrate physiological and virtual environmental sensor data and display actionable information through a head-mounted display. We reflect on our observations from the live game and discuss challenges, opportunities, and design implications for future disaster response wearables that support collaboration.
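As a companion to the four-axis framework described in item 1 above, the sketch below illustrates one possible composition-based selection step: choosing one device per required interaction type while keeping body locations free of conflicts. The device records, axis values, and the greedy strategy are assumptions for illustration, not part of the published framework.

```python
# Illustrative sketch only: device records, axis values, and the greedy
# selection strategy are assumptions, not the framework's published coding.

# Each record describes a device along the framework's four axes.
DEVICES = [
    {"name": "smart ring", "interactivity": "touch", "output": "haptic",
     "mobility": "mobile", "location": "finger"},
    {"name": "gesture armband", "interactivity": "gesture", "output": "none",
     "mobility": "mobile", "location": "forearm"},
    {"name": "head-mounted display", "interactivity": "gaze", "output": "visual",
     "mobility": "mobile", "location": "head"},
]


def compose(required_interactivity):
    """Greedily assemble a composite wearable: cover each required interaction
    type with one device without assigning two devices to the same body location."""
    chosen, used_locations = [], set()
    for need in required_interactivity:
        for device in DEVICES:
            if device["interactivity"] == need and device["location"] not in used_locations:
                chosen.append(device)
                used_locations.add(device["location"])
                break
    return chosen


if __name__ == "__main__":
    for d in compose(["gesture", "gaze"]):
        print(d["name"], "worn on the", d["location"])
```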