
Title: Investigation of unmanned aerial vehicle gesture perceptibility and impact of viewpoint variance
Unmanned Aerial Vehicle (UAV) flight paths have been shown to communicate meaning to human observers, similar to human gestural communication. This paper presents the results of a UAV gesture perception study designed to assess how observer viewpoint may impact how humans perceive the shape of UAV gestural motion. Robot gesture designers have demonstrated that robots can indeed communicate meaning through gesture; however, many of these results are limited to an idealized range of viewer perspectives and do not consider how the perception of a robot gesture may suffer from obfuscation or self-occlusion from some viewpoints. This paper presents the results of three online user studies that examine participants' ability to accurately perceive the intended shape of two-dimensional UAV gestures from varying viewer perspectives. We used a logistic regression model to characterize participant gesture classification accuracy, demonstrating that viewer perspective does impact how participants perceive the shape of UAV gestures. Our results yielded a viewpoint angle threshold beyond which participants were able to assess the intended shape of a gesture's motion with 90% accuracy. We also introduce a perceptibility score that captures user confidence, time to decision, and labeling accuracy, and that helps explain how differences in flight paths impact perception across viewpoints. These findings will enable UAV gesture systems that ensure, with a high degree of confidence, that gesture motions can be accurately perceived by human observers.
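The viewpoint-angle threshold described above can be recovered from a fitted logistic model by inverting it at the target accuracy. The sketch below uses hypothetical coefficients (`b0`, `b1` are stand-ins, not the paper's fitted values) to show the inversion step.

```python
import math

def logistic(x, b0, b1):
    """Predicted probability of correct gesture classification
    at viewpoint angle x (degrees), under a 1-D logistic model."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def angle_threshold(b0, b1, target=0.90):
    """Invert the logistic model: the viewpoint angle at which the
    predicted classification accuracy reaches `target`."""
    logit = math.log(target / (1.0 - target))
    return (logit - b0) / b1

# Hypothetical coefficients for illustration only.
b0, b1 = -1.5, 0.1
theta = angle_threshold(b0, b1)
print(round(theta, 1))  # angle at which modeled accuracy crosses 90%
```

With real data, `b0` and `b1` would come from fitting participant correct/incorrect responses against viewpoint angle; the inversion itself is the same regardless of the fitting library used.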
Authors:
Award ID(s):
1925368 1750750 1757908
Publication Date:
NSF-PAR ID:
10296987
Journal Name:
IEEE International Conference on Robotics and Automation
ISSN:
1049-3492
Sponsoring Org:
National Science Foundation
More Like this
  1. Unmanned Aerial Vehicle (UAV) flight paths have been shown to communicate meaning to human observers, similar to human gestural communication. This paper presents the results of a UAV gesture perception study designed to assess how observer viewpoint may impact how humans perceive the shape of UAV gestural motion. Robot gesture designers have demonstrated that robots can indeed communicate meaning through gesture; however, many of these results are limited to an idealized range of viewer perspectives and do not consider how the perception of a robot gesture may suffer from obfuscation or self-occlusion from some viewpoints. This paper presents the results of three online user studies that examine participants' ability to accurately perceive the intended shape of two-dimensional UAV gestures from varying viewer perspectives. We used a logistic regression model to characterize participant gesture classification accuracy, demonstrating that viewer perspective does impact how participants perceive the shape of UAV gestures. Our results yielded a viewpoint angle threshold beyond which participants were able to assess the intended shape of a gesture's motion with 90% accuracy. We also introduce a perceptibility score that captures user confidence, time to decision, and labeling accuracy, and that helps explain how differences in flight paths impact perception across viewpoints. These findings will enable UAV gesture systems that ensure, with a high degree of confidence, that gesture motions can be accurately perceived by human observers.
  2. Unmanned aerial vehicles (UAVs) are becoming more common, presenting the need for effective human-robot communication strategies that address the unique nature of unmanned aerial flight. Visual communication via drone flight paths, also called gestures, may prove to be an ideal method. However, the effectiveness of visual communication techniques is dependent on several factors, including an observer's position relative to a UAV. Previous work has studied the maximum line-of-sight distance at which observers can identify a small UAV [1]. However, this work did not consider how changes in distance may affect an observer's ability to perceive the shape of a UAV's motion. In this study, we conduct a series of online surveys to evaluate how changes in line-of-sight distance and gesture size affect observers' ability to identify and distinguish between UAV gestures. We first examine observers' ability to accurately identify gestures when adjusting a gesture's size relative to the size of a UAV. We then measure how observers' ability to identify gestures changes with respect to varying line-of-sight distances. Lastly, we consider how altering the size of a UAV gesture may improve an observer's ability to identify drone gestures from varying distances. Our results show that increasing the gesture size across varying UAV-to-gesture ratios did not have a significant effect on participant response accuracy. We found that between 17 m and 75 m from the observer, the ability to accurately identify a drone gesture was inversely proportional to the distance between the observer and the drone. Finally, we found that maintaining a gesture's apparent size improves participant response accuracy over changing line-of-sight distances.
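The finding that maintaining a gesture's apparent size preserves identification accuracy follows from the visual-angle relationship: a gesture's apparent size shrinks with distance unless its physical extent is scaled up proportionally. A minimal sketch, using the study's 17 m and 75 m line-of-sight distances and an assumed 5 m gesture extent for illustration:

```python
import math

def visual_angle_deg(size_m, distance_m):
    """Apparent angular size (degrees) of a gesture of physical extent
    size_m viewed from distance_m along the line of sight."""
    return math.degrees(2.0 * math.atan(size_m / (2.0 * distance_m)))

def size_for_constant_angle(angle_deg, distance_m):
    """Physical gesture size needed at distance_m to preserve a given
    apparent angular size."""
    return 2.0 * distance_m * math.tan(math.radians(angle_deg) / 2.0)

# Assumed 5 m gesture at the study's near and far distances (17 m, 75 m).
near_angle = visual_angle_deg(5.0, 17.0)
far_angle = visual_angle_deg(5.0, 75.0)   # much smaller apparent size

# To look the same at 75 m as at 17 m, the gesture must be scaled up:
scaled = size_for_constant_angle(near_angle, 75.0)  # ≈ 22.06 m (5 m × 75/17)
```

The scaling works out to a simple ratio, `size × far_distance / near_distance`, which is why holding apparent size constant amounts to growing the flight path linearly with distance.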
  3. This article presents an understanding of naive users' perception of the communicative nature of unmanned aerial vehicle (UAV) motions, refined through an iterative series of studies. This includes both what people believe the UAV is trying to communicate and how they expect to respond through physical action or emotional response. Previous work in this area prioritized gestures from participants to the vehicle, or augmenting the vehicle with additional communication modalities, rather than communication without clear definitions of the states to be conveyed. In an attempt to elicit more concrete states and better understand specific motion perception, this work includes multiple iterations of state creation, flight path refinement, and label assignment. The lessons learned in this work will be broadly applicable to those interested in defining flight paths, and to the human-robot interaction community as a whole, as it provides a base for those seeking to communicate using non-anthropomorphic robots. We found that the Negative Attitudes towards Robots Scale (NARS) can be an indicator of how a person is likely to react to a UAV, of the emotional content they are likely to perceive in a message being conveyed, and of the personality characteristics they are likely to project onto the UAV. We also see that people commonly map motions from other non-verbal communication situations onto UAVs. Flight-specific recommendations are to use a dynamic retreating motion from a person to encourage following, a motion perpendicular to their field of view for blocking, a simple descending motion for landing, and either no motion or large altitude changes to encourage watching. Overall, this research explores communication from the UAV to the bystander through its motion, to see how people respond physically and emotionally.
  4. This work has developed an iteratively refined understanding of participants' natural perceptions of and responses to unmanned aerial vehicle (UAV) flight paths, or gestures. This includes both what they believe the UAV is trying to communicate to them and how they expect to respond through physical action. Previous work in this area has focused on eliciting gestures from participants to communicate specific states, or on leveraging gestures observed in the world, rather than on understanding what participants believe is being communicated and how they would respond. This work investigates previous gestures either created or categorized by participants to understand the perceived content of their communication or the expected response, through categories created from participant free responses and confirmed through forced-choice testing. The human-robot interaction community can leverage this work to better understand how people perceive UAV flight paths, inform future designs for non-anthropomorphic robot communications, and apply lessons learned to elicit informative labels from people who may or may not be operating the vehicle. We found that the Negative Attitudes towards Robots Scale (NARS) can be a good indicator of how we can expect a person to react to a robot. Recommendations are also provided to use motion approaching/retreating from a person to encourage following, motion perpendicular to their field of view for blocking, and either no motion or large altitude changes to encourage viewing.
  5. This paper presents a gesture set for communicating states to novice users from a small Unmanned Aerial System (sUAS), developed through an elicitation study comparing gestures created by participants recruited from the general public with varying levels of sUAS experience. Previous work on sUAS flight paths sought to communicate intent, destination, or emotion without focusing on concrete states such as Low Battery or Landing. This elicitation study uses a participatory design approach from human-computer interaction to understand how novice users would expect an sUAS to communicate states, and ultimately suggests flight paths and characteristics to indicate those states. We asked users from the general public (N=20) to create gestures for seven distinct sUAS states to provide insights for human-drone interaction and to present intuitive flight paths and characteristics, with the expectation that the sUAS would have general commercial application for inexperienced users. The results indicate relatively strong agreement scores for three sUAS states: Landing (0.455), Area of Interest (0.265), and Low Battery (0.245). The other four states have lower agreement scores; however, even these show some consensus across all seven states. The agreement scores and the associated gestures suggest guidance for engineers to develop a common set of flight paths and characteristics for an sUAS to communicate states to novice users.
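Agreement scores like those above are typically computed per referent (state) from how participants' proposals cluster. The sketch below uses one common formulation from the gesture-elicitation literature, the sum of squared group proportions; the paper's exact computation may differ, and the proposal counts are invented for illustration.

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement for one referent: sum over each group of identical
    proposals of (group size / total proposals) squared. A score of 1.0
    means every participant proposed the same gesture."""
    n = len(proposals)
    return sum((count / n) ** 2 for count in Counter(proposals).values())

# Hypothetical proposals from 20 participants for a "Landing" state.
landing = ["descend"] * 13 + ["spiral_down"] * 4 + ["hover_low"] * 3
print(round(agreement_score(landing), 3))  # → 0.485
```

Under this formulation, a few large proposal clusters drive the score up quickly, which is why states with one dominant intuitive motion (such as descending for Landing) tend to score highest.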