


Title: Measuring and Comparing Collaborative Visualization Behaviors in Desktop and Augmented Reality Environments
Augmented reality (AR) offers a significant opportunity to improve collaboration between co-located team members jointly analyzing data visualizations, but rigorous studies of such collaboration are lacking. We present a novel method for qualitatively encoding the positions of co-located users collaborating with head-mounted displays (HMDs), to assist in reliably analyzing collaboration styles and behaviors. We then conduct a user study of the collaborative behaviors of multiple co-located, synchronously collaborating users in AR, demonstrating this method in practice and helping to address the shortfall of such studies in the existing literature. Pairs of users performed analysis tasks on several data visualizations using both AR and traditional desktop displays. To provide a robust evaluation, we collected several types of data, including software logs of participant positioning, qualitative analysis of video recordings of participant sessions, and pre- and post-study questionnaires including the NASA TLX survey. Our results suggest that the independent viewports of AR headsets reduce the need to verbally communicate about navigating around the visualization and encourage face-to-face and non-verbal communication. Our novel positional encoding method also revealed that the overlap of task and communication spaces varies based on the needs of the collaborators.
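The kind of positional encoding the abstract describes could be sketched as follows. This is a hypothetical illustration only, not the authors' implementation: the function name, thresholds, and coordinate conventions (floor-plane positions in meters, yaw in degrees with 0° facing +z) are all assumptions for the sketch, which classifies a pair of logged HMD samples into coarse collaboration states from inter-user distance and mutual facing.

```python
import math

def classify_sample(pos_a, yaw_a, pos_b, yaw_b,
                    close_m=1.5, face_deg=45.0):
    """Classify one logged sample of a user pair.

    pos_* are (x, z) floor coordinates in meters; yaw_* are headset
    headings in degrees (0 degrees faces +z). Thresholds are illustrative.
    """
    dx, dz = pos_b[0] - pos_a[0], pos_b[1] - pos_a[1]
    dist = math.hypot(dx, dz)

    # Bearing from A toward B, and the reverse bearing from B toward A.
    bearing_ab = math.degrees(math.atan2(dx, dz))
    bearing_ba = (bearing_ab + 180.0) % 360.0

    def angular_diff(a, b):
        # Smallest absolute difference between two headings, in degrees.
        return abs((a - b + 180.0) % 360.0 - 180.0)

    facing = (angular_diff(yaw_a, bearing_ab) < face_deg and
              angular_diff(yaw_b, bearing_ba) < face_deg)

    if dist < close_m and facing:
        return "face-to-face"
    if dist < close_m:
        return "side-by-side"
    return "independent"
```

Run per logged timestamp, labels like these could then be aggregated into the kinds of collaboration-style timelines the study's qualitative analysis compares across AR and desktop conditions.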
Award ID(s):
1828010 2216452
NSF-PAR ID:
10517474
Publisher / Repository:
ACM
Journal Name:
29th ACM Symposium on Virtual Reality Software and Technology
ISBN:
9798400703287
Page Range / eLocation ID:
1 to 11
Format(s):
Medium: X
Location:
Christchurch, New Zealand
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Augmented reality (AR) applications are growing in popularity in educational settings. While the effects of AR experiences on learning have been widely studied, there is relatively less research on understanding the impact of AR on the dynamics of co-located collaborative learning, specifically in the context of novices programming robots. Educational robotics are a powerful learning context because they engage students with problem solving, critical thinking, STEM (Science, Technology, Engineering, Mathematics) concepts, and collaboration skills. However, such collaborations can suffer due to students having unequal access to resources or dominant peers. In this research we investigate how augmented reality impacts learning and collaboration while peers engage in robot programming activities. We use a mixed methods approach to measure how participants are learning, manipulating resources, and engaging in problem solving activities with peers. We investigate how these behaviors are impacted by the presence of augmented reality visualizations, and by participants' proximity to resources. We find that augmented reality improved overall group learning and collaboration. Detailed analysis shows that AR strongly helps one participant more than the other, by improving their ability to learn and contribute while remaining engaged with the robot. Furthermore, augmented reality helps both participants maintain a common ground and balance contributions during problem solving activities. We discuss the implications of these results for designing AR and non-AR collaborative interfaces.
  2.
    We discuss and present design probes investigating how pervasive displays could offer unique opportunities for enhancing discovery and learning with “big data.” Our collaboration across three universities undertook a series of design exercises investigating approaches for collaborative, interactive, tangible, and multitouch-engaged visualizations of genomic and related scientific datasets. These exercises led to several envisionments of tangible interfaces that employ active tokens and interactive surfaces to facilitate co-located and distributed engagement with large datasets. We describe some of the motivation and background for these envisioned interfaces; consider key aspects linking and distinguishing the designs; and relate these to the present and near-future state of the art for tangible and multitouch engagement with pervasive displays toward collaborative science.
  3. This article discusses novel research methods used to examine how Augmented Reality (AR) can be utilized to present “omic” (i.e., genomes, microbiomes, pathogens, allergens) information to non-expert users. While existing research shows the potential of AR as a tool for personal health, methodological challenges pose a barrier to the ways in which AR research can be conducted. There is a growing need for new evaluation methods for AR systems, especially as remote testing becomes increasingly popular. In this article, we present two AR studies adapted for remote research environments in the context of personal health. The first study (n = 355) is a non-moderated remote study conducted using an AR web application to explore the effect of layering abstracted pathogens and mitigative behaviors on a user, on perceived risk perceptions, negative affect, and behavioral intentions. This study introduces methods that address participant precursor requirements, diversity of platforms for delivering the AR intervention, unsupervised setups, and verification of participation as instructed. The second study (n = 9) presents the design and moderated remote evaluation of a technology probe, a prototype of a novel AR tool that overlays simulated timely and actionable environmental omic data in participants' living environment, which helps users to contextualize and make sense of the data. Overall, the two studies contribute to the understanding of investigating AR as a tool for health behavior and interventions for remote, at-home, empirical studies.
  4. Immersive data-driven storytelling, which uses interactive immersive visualizations to present insights from data, is a compelling use case for VR and AR environments. We present XRCreator, an authoring system to create immersive data-driven stories. The cross-platform nature of our React-inspired system architecture enables the collaboration among VR, AR, and web users, both in authoring and in experiencing immersive data-driven stories. 
  5. Deaf and Hard-of-Hearing (DHH) users face accessibility challenges during in-person and remote meetings. While the emerging use of applications incorporating automatic speech recognition (ASR) is promising, more user-interface and user-experience research is needed. While co-design methods could elucidate designs for such applications, COVID-19 has interrupted in-person research. This study describes a novel methodology for conducting online co-design workshops with 18 DHH and hearing participant pairs to investigate ASR-supported mobile and videoconferencing technologies along two design dimensions: correcting errors in ASR output and implementing notification systems for influencing speaker behaviors. Our methodological findings include an analysis of the communication modalities and strategies participants used, the use of an online collaborative whiteboarding tool, and how participants reconciled differences in ideas. Finally, we present guidelines for researchers interested in online DHH co-design methodologies, enabling greater geographic diversity among study participants even beyond the current pandemic.