Title: Understanding the Influence of Fatigue on Full Arm Gestures in Augmented Reality Environments
This research investigates the impact of fatigue on full-arm gestures in augmented reality (AR) environments. By analyzing the collected data, we aim to build a comprehensive understanding of the constraints and characteristics that affect arm-gesture performance when individuals are fatigued. We found that prolonged performance of full-arm gestures under fatigue reduced muscle strength in the upper-body segments, and this decline in turn lowered gesture-detection accuracy in the AR environment from an initial 97.7% to 75.9%. We also found that changes in torso movement can ripple into the upper arm and forearm. This knowledge will enable us to improve the precision and accuracy of our gesture-detection algorithms, even under fatigue.
Award ID(s): 2202108
PAR ID: 10643922
Author(s) / Creator(s):
Publisher / Repository: Sage Journals
Date Published:
Journal Name: Proceedings of the Human Factors and Ergonomics Society Annual Meeting
Volume: 68
Issue: 1
ISSN: 1071-1813
Page Range / eLocation ID: 1194 to 1199
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Mixed Reality visualizations provide a powerful new approach for enabling gestural capabilities on non-humanoid robots. This paper explores two different categories of mixed-reality deictic gestures for armless robots: a virtual arrow positioned over a target referent (a non-ego-sensitive allocentric gesture) and a virtual arm positioned over the gesturing robot (an ego-sensitive allocentric gesture). Specifically, we present the results of a within-subjects Mixed Reality HRI experiment (N=23) exploring the trade-offs between these two types of gestures with respect to both objective performance and subjective social perceptions. Our results show a clear trade-off between performance and social perception, with non-ego-sensitive allocentric gestures enabling faster reaction time and higher accuracy, but ego-sensitive gestures enabling higher perceived social presence, anthropomorphism, and likability.
  2. Mixed Reality provides a powerful medium for transparent and effective human-robot communication, especially for robots with significant physical limitations (e.g., those without arms). To enhance nonverbal capabilities for armless robots, this article presents two studies that explore two categories of mixed reality deictic gestures: a virtual arrow positioned over a target referent (a non-ego-sensitive allocentric gesture) and a virtual arm positioned over the gesturing robot (an ego-sensitive allocentric gesture). In Study 1, we explore the tradeoffs between these two types of gestures with respect to both objective performance and subjective social perceptions. Our results show fundamentally different task-oriented versus social benefits, with non-ego-sensitive allocentric gestures enabling faster reaction time and higher accuracy, but ego-sensitive gestures enabling higher perceived social presence, anthropomorphism, and likability. In Study 2, we refine our design recommendations by showing that these gestures should not be viewed as mutually exclusive alternatives and that, used together, they allow robots to achieve both task-oriented and social benefits.
  3. Understanding abstract concepts in mathematics has long presented a challenge, but the use of directed and spontaneous gestures has been shown to support learning and ground higher-order thought. Within embodied learning, gesture has been investigated as part of a multimodal assemblage with speech and movement, centering the body in interaction with the environment. We present a case study of one dyad's undertaking of a robotic arm activity, targeting learning outcomes in matrix algebra, robotics, and spatial thinking. Through a body syntonicity lens and drawing on video and pre- and post-assessment data, we evaluate learning gains and investigate the multimodal processes contributing to them. We found that gesture, speech, and body movement grounded understanding of vector and matrix operations, spatial reasoning, and robotics, as anchored by the physical robotic arm, with implications for the design of learning environments that employ directed gestures.
  4. Mixed reality visualizations provide a powerful new approach for enabling gestural capabilities for non-humanoid robots. This paper explores two different categories of mixed-reality deictic gestures for armless robots: a virtual arrow positioned over a target referent (a non-ego-sensitive allocentric gesture) and a virtual arrow positioned over the robot (an ego-sensitive allocentric gesture). We explore the trade-offs between these two types of gestures with respect to both objective performance and subjective social perceptions. We conducted a 24-participant within-subjects experiment in which a HoloLens-wearing participant interacted with a robot that used these two types of gestures to refer to objects at two different distances. Our results demonstrate a clear trade-off between performance and social perception: non-ego-sensitive allocentric gestures led to quicker reaction time and higher accuracy, but ego-sensitive gestures led to higher perceived social presence, anthropomorphism, and likability. These results present a challenging design decision to creators of mixed reality robotic systems.
  5. We compare the perceived naturalness of character animations generated using three interpolation methods: linear Euler, spherical linear quaternion, and spherical spline quaternion. While previous work focused on the mathematical description of these interpolation types, our work studies the perceptual evaluation of animated upper body character gestures generated using these interpolations. Ninety-seven participants watched 12 animation clips of a character performing four different upper body motions: a beat gesture, a deictic gesture, an iconic gesture, and a metaphoric gesture. Three animation clips were generated for each gesture using the three interpolation methods. The participants rated their naturalness on a 5-point Likert scale. The results showed that animations generated using spherical spline quaternion interpolation were perceived as significantly more natural than those generated using the other two interpolation methods. The findings held true for all subjects regardless of gender and animation experience and across all four gestures. 
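The three interpolation schemes named in the last record are purely mathematical, so a short illustration may help. The sketch below is not drawn from any of the papers above; it is a minimal NumPy comparison, assuming the (w, x, y, z) quaternion convention, of component-wise linear interpolation of Euler angles against spherical linear interpolation (slerp) of unit quaternions. Spherical spline quaternion interpolation (squad) chains slerps through intermediate control points and is omitted for brevity; the function names and keyframes are illustrative only.

```python
# Minimal sketch: linear Euler interpolation vs. quaternion slerp.
# Not from the cited study; names and keyframes are hypothetical.
import numpy as np

def euler_lerp(e0, e1, t):
    """Component-wise linear interpolation of two Euler-angle triples (radians)."""
    e0, e1 = np.asarray(e0, float), np.asarray(e1, float)
    return (1.0 - t) * e0 + t * e1

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions (w, x, y, z)."""
    q0 = np.asarray(q0, float) / np.linalg.norm(q0)
    q1 = np.asarray(q1, float) / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:              # flip one quaternion to take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:           # nearly parallel: fall back to normalized lerp
        q = (1.0 - t) * q0 + t * q1
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)     # angle between the two rotations on the 4D sphere
    sin_theta = np.sin(theta)
    w0 = np.sin((1.0 - t) * theta) / sin_theta
    w1 = np.sin(t * theta) / sin_theta
    return w0 * q0 + w1 * q1   # constant angular velocity along the great arc

if __name__ == "__main__":
    # Hypothetical keyframes for an upper-arm joint: identity pose and a
    # 90-degree rotation about the vertical (y) axis, in both representations.
    e_start, e_end = [0.0, 0.0, 0.0], [0.0, np.pi / 2, 0.0]
    q_start = [1.0, 0.0, 0.0, 0.0]
    q_end = [np.cos(np.pi / 4), 0.0, np.sin(np.pi / 4), 0.0]
    for t in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(t, euler_lerp(e_start, e_end, t), slerp(q_start, q_end, t))
```

For a single-axis rotation like this the two methods coincide, but when keyframes differ about multiple axes, Euler lerp can produce uneven angular velocity and gimbal-related artifacts, whereas slerp (and squad, which smooths across multiple keyframes) rotates at constant speed along the shortest arc, which is one plausible reason the spline-quaternion animations were rated as more natural.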