- Award ID(s):
- 2008904
- Publication Date:
- NSF-PAR ID:
- 10296731
- Journal Name:
- 2021 IEEE Aerospace Conference
- Sponsoring Org:
- National Science Foundation
More Like this
-
This paper presents a novel strategy for the autonomous deployment of Micro Aerial Vehicle (MAV) scouts through constricted, aperture-like ingress points, by narrowly fitting and launching them with a high-precision mobile manipulation robot. A significant problem during exploration and reconnaissance in highly unstructured environments, such as collapsed indoor ones, is encountering areas that are impassable because of their constricted and rigid nature. We propose that a heterogeneous robotic system-of-systems equipped with manipulation capabilities, while also ferrying a fleet of micro-sized aerial agents, can deploy the latter through constricted apertures that only marginally fit them, allowing the agents to act as scouts and resume the reconnaissance mission. This work's contribution is twofold: first, it proposes active-vision-based aperture detection to locate candidate ingress points, together with a hierarchical search-based aperture profile analysis to position a MAV's body through them; second, it presents and experimentally demonstrates a novel system-of-systems approach that leverages mobile manipulation to deploy other robots that are otherwise incapable of entering through extremely narrow openings.
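The aperture profile analysis is only summarized above; as a purely illustrative sketch, not the authors' algorithm, the following Python snippet shows the kind of geometric fit check such an analysis could reduce to, assuming a rectangular aperture estimate, a rectangular MAV cross-section, and a hypothetical safety margin.

```python
import math

def mav_fits_through(aperture_w, aperture_h, mav_w, mav_h,
                     margin=0.01, roll_steps=36):
    """Return a roll angle (rad) at which the MAV's rectangular cross-section
    fits inside a rectangular aperture with a safety margin, or None.

    A single angular sweep stands in for the hierarchical search described
    in the paper; all dimensions are in metres and purely illustrative.
    """
    for i in range(roll_steps):
        theta = i * math.pi / roll_steps  # rotations beyond 180 deg repeat
        c, s = abs(math.cos(theta)), abs(math.sin(theta))
        # Axis-aligned extent of the rotated cross-section.
        extent_w = mav_w * c + mav_h * s
        extent_h = mav_w * s + mav_h * c
        if extent_w + 2 * margin <= aperture_w and extent_h + 2 * margin <= aperture_h:
            return theta
    return None

# Example: a 0.18 m x 0.10 m cross-section and a 0.20 m x 0.15 m aperture.
print(mav_fits_through(0.20, 0.15, 0.18, 0.10))  # -> 0.0 (fits without rotating)
```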
-
This work considers autonomous fruit picking using an aerial grasping robot by tightly integrating vision-based perception and control within a learning framework. The architecture employs a convolutional neural network (CNN) to encode images and vehicle state information. This encoding is passed into a sub-task classifier and an associated reference waypoint generator. The classifier is trained to predict the current phase of the task being executed: Staging, Picking, or Reset. Based on the predicted phase, the waypoint generator predicts a set of obstacle-free 6-DOF waypoints, which serve as a reference trajectory for model-predictive control (MPC). By iteratively generating and following these trajectories, the aerial manipulator safely approaches a mock-up goal fruit and removes it from the tree. The proposed approach is validated in 29 flight tests, through a comparison with a conventional baseline approach and an ablation study of its key features. Overall, the approach achieved success rates comparable to the conventional approach while reaching the goal faster.
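The abstract describes the perception-to-control pipeline only structurally; the sketch below mirrors that structure in plain Python, with every function body a hypothetical stub standing in for the paper's CNN encoder, sub-task classifier, waypoint generator, and MPC tracker.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Sub-task phases named in the abstract.
PHASES = ("Staging", "Picking", "Reset")

@dataclass
class Waypoint:
    position: Tuple[float, float, float]     # x, y, z in metres (assumed convention)
    orientation: Tuple[float, float, float]  # roll, pitch, yaw in radians

def encode(image, vehicle_state) -> List[float]:
    """Stub for the CNN that encodes the image and vehicle state."""
    return [0.0] * 16

def classify_phase(embedding) -> str:
    """Stub for the sub-task classifier."""
    return PHASES[0]

def generate_waypoints(embedding, phase) -> List[Waypoint]:
    """Stub for the phase-conditioned, obstacle-free waypoint generator."""
    return [Waypoint((0.0, 0.0, 1.5), (0.0, 0.0, 0.0))]

def mpc_step(state, reference: List[Waypoint]) -> dict:
    """Stub for one model-predictive control solve tracking the reference."""
    return {"thrust": 0.0, "body_rates": (0.0, 0.0, 0.0)}

def control_loop(get_image, get_state, send_command, steps=100):
    """Iteratively re-encode, re-plan, and track, as the pipeline describes."""
    for _ in range(steps):
        embedding = encode(get_image(), get_state())
        phase = classify_phase(embedding)
        reference = generate_waypoints(embedding, phase)
        send_command(mpc_step(get_state(), reference))
```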
-
The deep chlorophyll maximum (DCM) layer is an ecologically important feature of the open ocean. The DCM cannot be observed using aerial or satellite remote sensing; thus, in situ observations are essential. Further, understanding the responses of microbes to the environmental processes driving their metabolism and interactions requires observing in a reference frame that moves with a plankton population drifting in ocean currents, i.e., Lagrangian. Here, we report the development and application of a system of coordinated robots for studying planktonic biological communities drifting within the ocean. The presented Lagrangian system uses three coordinated autonomous robotic platforms. The focal platform consists of an autonomous underwater vehicle (AUV) fitted with a robotic water sampler. This platform localizes and drifts within a DCM community, periodically acquiring samples while continuously monitoring the local environment. The second platform is an AUV equipped with environmental sensing and acoustic tracking capabilities. This platform characterizes environmental conditions by tracking the focal platform and vertically profiling in its vicinity. The third platform is an autonomous surface vehicle equipped with satellite communications and subsea acoustic tracking capabilities. While also acoustically tracking the focal platform, this vehicle serves as a communication relay that connects the subsea robot to human operators …
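The coordination logic of the focal AUV is not detailed in the abstract; the fragment below is a hypothetical illustration of one ingredient, keeping a drifting target depth centred on the chlorophyll peak observed in each vertical profile, with the peak-finding and step limit chosen purely for the example.

```python
def chlorophyll_peak_depth(profile):
    """Depth of maximum chlorophyll in one vertical profile.

    `profile` is a list of (depth_m, chlorophyll) samples; a plain argmax
    stands in for whatever peak estimation the real system performs.
    """
    depth, _ = max(profile, key=lambda sample: sample[1])
    return depth

def update_drift_depth(current_target_m, profile, max_step_m=2.0):
    """Nudge the drifting depth toward the observed DCM peak, limiting the
    change per profile so the platform does not chase measurement noise."""
    peak = chlorophyll_peak_depth(profile)
    step = max(-max_step_m, min(max_step_m, peak - current_target_m))
    return current_target_m + step

# Example profile with a DCM near 30 m.
profile = [(10, 0.4), (20, 0.9), (30, 1.8), (40, 1.1), (50, 0.5)]
print(update_drift_depth(27.0, profile))  # -> 29.0
```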
-
We autonomously directed a small quadcopter package-delivery Uncrewed Aerial Vehicle (UAV), or "drone," to take off, fly a specified route, and land for a total of 209 flights while varying a set of operational parameters. The vehicle was equipped with onboard sensors, including GPS, an IMU, voltage and current sensors, and an ultrasonic anemometer, to collect high-resolution data on the inertial states, wind speed, and power consumption. Operational parameters, such as commanded ground speed, payload, and cruise altitude, were varied for each flight. This large data set has a total flight time of 10 hours and 45 minutes and was collected from April to October of 2019, covering a total distance of approximately 65 kilometers. The data collected were validated by comparing flights with similar operational parameters. We believe these data will be of great interest to the research and industrial communities, who can use them to improve UAV designs, safety, and energy efficiency, as well as to advance the physical understanding of in-flight operations for package-delivery drones.
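As an example of how such logs might be used, the snippet below estimates per-flight energy from the voltage and current channels; the CSV column names and sampling period are assumptions for illustration, not the data set's actual schema.

```python
import csv

def flight_energy_wh(log_path, dt_s=0.1):
    """Integrate electrical power over a flight log to estimate energy use.

    Assumes a CSV with 'voltage_V' and 'current_A' columns sampled every
    `dt_s` seconds; field names and rates in the real data set may differ.
    """
    energy_j = 0.0
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            energy_j += float(row["voltage_V"]) * float(row["current_A"]) * dt_s
    return energy_j / 3600.0  # joules -> watt-hours

# Hypothetical usage: print(flight_energy_wh("flight_042.csv"))
```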
-
Ishigami G., Yoshida K. (Eds.) This paper develops an autonomous tethered aerial visual assistant for robot operations in unstructured or confined environments. Robotic tele-operation in remote environments is difficult due to the lack of sufficient situational awareness, mostly caused by the stationary, limited field of view and lack of depth perception of the robot's onboard camera. The emerging state of practice is to use two robots: a primary and a secondary that acts as a visual assistant, overcoming the perceptual limitations of the onboard sensors by providing an external viewpoint. However, problems exist when using a tele-operated visual assistant: extra manpower, manually chosen and suboptimal viewpoints, and extra teamwork demand between the primary and secondary operators. In this work, we use an autonomous tethered aerial visual assistant to replace the secondary robot and operator, reducing the human-robot ratio from 2:2 to 1:2. This visual assistant is able to autonomously navigate through unstructured or confined spaces in a risk-aware manner, while continuously maintaining good viewpoint quality to increase the primary operator's situational awareness. With the proposed co-robots team, tele-operation missions in nuclear operations, bomb squads, disaster response, and other domains with novel tasks or highly occluded environments could benefit from reduced manpower and teamwork demand, along with improved …
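The risk-aware viewpoint selection is only characterized qualitatively above; as a toy illustration, not the paper's formulation, the snippet below scores candidate viewpoints by trading viewpoint quality against motion risk with an assumed linear weighting.

```python
def select_viewpoint(candidates, risk_weight=0.5):
    """Pick the candidate with the best quality-versus-risk trade-off.

    Each candidate is a dict with 'quality' and 'risk' in [0, 1]; the
    linear score and the weight are illustrative assumptions only.
    """
    return max(candidates, key=lambda v: v["quality"] - risk_weight * v["risk"])

candidates = [
    {"id": "high-overhead", "quality": 0.9, "risk": 0.7},
    {"id": "side-offset",   "quality": 0.7, "risk": 0.2},
]
print(select_viewpoint(candidates)["id"])  # -> "side-offset"
```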