

Award ID contains: 1925052



  1. Unmanned aerial vehicles (UAVs) are becoming more common, creating a need for effective human-robot communication strategies that address the unique nature of unmanned aerial flight. Visual communication via drone flight paths, also called gestures, may prove to be an ideal method. However, the effectiveness of visual communication techniques depends on several factors, including an observer's position relative to a UAV. Previous work has studied the maximum line-of-sight distance at which observers can identify a small UAV [1]. However, that work did not consider how changes in distance may affect an observer's ability to perceive the shape of a UAV's motion. In this study, we conduct a series of online surveys to evaluate how changes in line-of-sight distance and gesture size affect observers' ability to identify and distinguish between UAV gestures. We first examine observers' ability to accurately identify gestures when a gesture's size is adjusted relative to the size of the UAV. We then measure how observers' ability to identify gestures changes with varying line-of-sight distances. Lastly, we consider how altering the size of a UAV gesture may improve an observer's ability to identify drone gestures from varying distances. Our results show that increasing the gesture size across varying UAV-to-gesture size ratios did not have a significant effect on participant response accuracy. We found that between 17 m and 75 m from the observer, the ability to accurately identify a drone gesture was inversely proportional to the distance between the observer and the drone. Finally, we found that maintaining a gesture's apparent size improves participant response accuracy across changing line-of-sight distances.
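The final finding above, that holding a gesture's apparent size constant preserves identification accuracy, follows from simple viewing geometry: apparent size is the visual angle the gesture subtends at the observer's eye, so keeping that angle fixed requires scaling the gesture's physical amplitude linearly with distance. A minimal sketch of that relationship (illustrative only; the function names and the 2 m example amplitude are assumptions, not values from the paper):

```python
import math

def visual_angle_deg(size_m, distance_m):
    """Angular (apparent) size, in degrees, of an object of a given
    physical size viewed from a given distance."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

def size_for_constant_angle(base_size_m, base_dist_m, new_dist_m):
    """Gesture amplitude needed at new_dist_m so the gesture subtends
    the same visual angle it had at base_dist_m (linear scaling)."""
    return base_size_m * (new_dist_m / base_dist_m)
```

For example, a gesture 2 m across at 17 m must grow to roughly 8.8 m across at 75 m to keep the same apparent size over the distance range studied.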
  2. Siciliano, B.; Laschi, C.; Khatib, O. (Eds.)
    Multirotor systems have traditionally been employed for missions that ensure minimal contact with the objects in their vicinity. However, their agile flight dynamics lets them sense, plan and react rapidly, and therefore perform highly dynamic missions. In this work, we push their operational envelope further by developing a complete framework that allows a multirotor to dock with a moving platform. Our approach builds on state-of-the-art and optimal methods for estimating and predicting the state of the moving platform, as well as for generating interception trajectories for the docking multirotor. Through a total of 25 field tests outdoors, we demonstrate the capabilities of our system in docking with a platform moving at different speeds and in various operating conditions. We also evaluate the quality of our system’s trajectory following at speeds over 2 m/s to effect docking within 10 s. 
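The docking pipeline in the abstract above combines prediction of the moving platform's state with generation of an interception trajectory for the multirotor. A deliberately simplified sketch of that idea, assuming a constant-velocity platform model and a straight-line, fixed-speed UAV (the paper uses state-of-the-art optimal estimation and trajectory generation; all names and numbers here are illustrative):

```python
import numpy as np

def predict_platform(pos, vel, t):
    """Constant-velocity prediction of the platform's future position."""
    return pos + vel * t

def interception_time(uav_pos, uav_speed, plat_pos, plat_vel,
                      t_max=10.0, dt=0.05):
    """Earliest time on a coarse grid at which a UAV flying straight at
    uav_speed can reach the platform's predicted position; None if no
    interception is feasible within t_max seconds."""
    for t in np.arange(dt, t_max, dt):
        target = predict_platform(plat_pos, plat_vel, t)
        if np.linalg.norm(target - uav_pos) <= uav_speed * t:
            return t, target
    return None
```

A real system would replace the constant-velocity model with a filtered state estimate and the grid search with a dynamically feasible trajectory optimizer, but the structure (predict, then intercept the prediction) is the same.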
  3. Abstract. This paper describes the data collected by the University of Nebraska-Lincoln (UNL) as part of the field deployments during the Lower Atmospheric Process Studies at Elevation – a Remotely-piloted Aircraft Team Experiment (LAPSE-RATE) flight campaign in July 2018. The UNL deployed two multirotor unmanned aerial systems (UASs) at multiple sites in the San Luis Valley (Colorado, USA) to collect data in support of three science missions: convection initiation, boundary layer transition, and cold air drainage flow. We conducted 172 flights resulting in over 21 h of cumulative flight time. Our novel design for the sensor housing onboard the UAS was employed in these flights to meet the aspiration and shielding requirements of the temperature and humidity sensors and to separate them from the mixed turbulent airflow from the propellers. Data presented in this paper include timestamped temperature and humidity data collected from the sensors, along with the three-dimensional position and velocity of the UAS. Data are quality-controlled and time-synchronized using a zero-order-hold interpolation without additional post-processing. The full dataset is also made available for download at (Islam et al., 2020).
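The zero-order-hold time synchronization mentioned in the abstract above amounts to pairing each query timestamp (e.g., a UAS position fix) with the most recent sensor sample at or before it, with no smoothing or interpolation between samples. A minimal illustration (not the authors' code; the function and variable names are assumptions):

```python
import numpy as np

def zoh_sync(sample_t, sample_v, query_t):
    """Resample a sensor time series onto query timestamps using a
    zero-order hold: each query time takes the value of the most
    recent sample at or before it (the first sample if none precede)."""
    sample_t = np.asarray(sample_t)
    sample_v = np.asarray(sample_v)
    idx = np.searchsorted(sample_t, query_t, side="right") - 1
    idx = np.clip(idx, 0, len(sample_t) - 1)
    return sample_v[idx]
```

This keeps every reported value an actual measurement, which is consistent with the dataset's stated policy of no additional post-processing.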