Title: Social interaction in augmented reality
There have been decades of research on the usability and educational value of augmented reality. However, less is known about how augmented reality affects social interactions. The current paper presents three studies that test the social psychological effects of augmented reality. Study 1 examined participants’ task performance in the presence of embodied agents and replicated the typical pattern of social facilitation and inhibition: participants performed a simple task better, but a hard task worse, in the presence of an agent than when they completed the tasks alone. Study 2 examined nonverbal behavior. Participants met an agent sitting in one of two chairs and were asked to choose one of the chairs to sit on. No participant wearing the headset sat directly on the agent, and while approaching, most chose a rotation direction that avoided turning their head away from the agent. A separate group of participants chose a seat after removing the augmented reality headset, and the majority still avoided the seat previously occupied by the agent. Study 3 examined the social costs of using an augmented reality headset with others who are not using one. Participants talked in dyads, and augmented reality users reported less social connection to their partner than those not using augmented reality. Overall, these studies provide evidence suggesting that task performance, nonverbal behavior, and social connectedness are significantly affected by the presence or absence of virtual content.
Award ID(s):
1839974
PAR ID:
10104458
Author(s) / Creator(s):
Date Published:
Journal Name:
PLoS ONE
ISSN:
1932-6203
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Mixed Reality provides a powerful medium for transparent and effective human-robot communication, especially for robots with significant physical limitations (e.g., those without arms). To enhance nonverbal capabilities for armless robots, this article presents two studies that explore two categories of mixed reality deictic gestures: a virtual arrow positioned over a target referent (a non-ego-sensitive allocentric gesture) and a virtual arm positioned over the gesturing robot (an ego-sensitive allocentric gesture). In Study 1, we explore the tradeoffs between these two types of gestures with respect to both objective performance and subjective social perceptions. Our results show fundamentally different task-oriented versus social benefits, with non-ego-sensitive allocentric gestures enabling faster reaction times and higher accuracy, but ego-sensitive gestures enabling higher perceived social presence, anthropomorphism, and likability. In Study 2, we refine our design recommendations by showing that these gestures need not be viewed as mutually exclusive alternatives: used together, they allow robots to achieve both task-oriented and social benefits.
  2. We present our work in progress: a real-time mixed reality communication system for remote assistance in medical emergency situations. 3D cameras capture the emergency scene and stream volumetric data to a remote expert, who views the scene through mixed reality glasses and guides an operator at the patient's side. The local operator receives the audio and visual guidance through a mixed reality headset. We compare the mixed reality system against traditional video communication in a user study on a simulated CPR emergency, evaluating task performance, cognitive load, and user interaction. The results will help clarify the benefits of using augmented and volumetric information in medical emergency procedures.
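The system above is a capture, transmit, and render pipeline: 3D cameras produce volumetric frames that are streamed to a remote expert and rendered in mixed reality. The abstract gives no implementation details, so the following is only a minimal sketch of the frame-streaming step, assuming point-cloud frames held as NumPy arrays and a plain length-prefixed TCP stream; the serialization format and function names are illustrative assumptions, not the authors' design.

```python
import socket
import struct

import numpy as np


def send_frame(sock: socket.socket, points: np.ndarray) -> None:
    """Serialize one point-cloud frame (N x 3, float32) and send it length-prefixed."""
    payload = points.astype(np.float32).tobytes()
    sock.sendall(struct.pack("!I", len(payload)) + payload)


def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed before the frame was complete")
        buf += chunk
    return buf


def recv_frame(sock: socket.socket) -> np.ndarray:
    """Receive one length-prefixed frame and restore the N x 3 point array."""
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return np.frombuffer(_recv_exact(sock, length), dtype=np.float32).reshape(-1, 3)
```

In a setup like the one described, the capture side would call send_frame once per captured frame, and the remote expert's renderer would call recv_frame and hand the points to its mixed reality display.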
  3. An overarching goal of Artificial Intelligence (AI) is creating autonomous, social agents that help people. Two important challenges, though, are that different people prefer different assistance from agents and that preferences can change over time. Helping behaviors should therefore be tailored to how an individual feels during the interaction. We hypothesize that human nonverbal behavior gives clues about users' preferences for an agent's helping behaviors, augmenting an agent's ability to computationally predict such preferences with machine learning models. To investigate this hypothesis, we collected data from 194 participants via an online survey in which participants were recorded while playing a multiplayer game. We evaluated whether including nonverbal human signals, as well as additional context (e.g., game or personality information), improved prediction of user preferences between agent behaviors compared to explicitly provided survey responses. Our results suggest that nonverbal communication, a common type of human implicit feedback, can aid in understanding how people want computational agents to interact with them.
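The evaluation described above is essentially a feature-ablation comparison: does adding nonverbal signals and context to explicit survey responses improve prediction of a user's preferred agent behavior? The abstract does not specify the models or features used, so the sketch below only illustrates that comparison pattern, assuming a hypothetical tabular dataset with column groups survey_*, nonverbal_*, and context_* plus a preferred_behavior label, and using scikit-learn rather than the authors' actual pipeline.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# df is assumed to hold one row per participant with hypothetical column groups:
#   survey_*     explicit survey responses
#   nonverbal_*  features extracted from the recorded nonverbal behavior
#   context_*    game / personality context
# and a label column "preferred_behavior" naming the preferred agent behavior.


def score(df: pd.DataFrame, prefixes: list[str]) -> float:
    """Mean cross-validated accuracy using only the columns matching the given prefixes."""
    cols = [c for c in df.columns for p in prefixes if c.startswith(p)]
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    return cross_val_score(model, df[cols], df["preferred_behavior"], cv=5).mean()


# Ablation: does adding nonverbal (and context) features beat survey responses alone?
# baseline  = score(df, ["survey_"])
# augmented = score(df, ["survey_", "nonverbal_", "context_"])
```

Comparing the baseline and augmented scores directly tests whether the implicit-feedback features add predictive value over the explicit survey responses.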
  4. In this study, we demonstrate an application of 5G networks to mobile, remote ground penetrating radar (GPR) scanning, in which experts detect buried objects while an operator performs the scans. Using a GSSI SIR-30 system in conjunction with a RealSense camera for visual mapping of the surveyed area, subsurface GPR scans were created and transmitted for remote processing. Over mobile networks, the raw B-scan files were transmitted with at most 0.034 ms mean latency, enough to enable near real-time edge processing. The performance of 5G networks in handling the data transmission for the GPR scans and edge computing was compared to that of 4G networks. In addition, long-range low-power devices, namely Wi-Fi HaLow and Wi-Fi hotspots, were compared as local alternatives to cellular networks. Augmented reality headset presentation of the F-scans is proposed as a way of helping the operator use the edge-processed scans. These results bode well for the potential of remote processing of GPR data in augmented reality applications.