With the rapid development of technology and the proliferation of uncrewed aerial systems (UAS), there is an immediate need for security solutions. Toward this end, we propose the use of a multi-robot system for autonomous and cooperative counter-UAS (C-UAS) missions. In this paper, we present the design of the hardware and software components of different complementary robotic platforms: a mobile uncrewed ground vehicle (UGV) equipped with a LiDAR sensor, an uncrewed aerial vehicle (UAV) with a gimbal-mounted stereo camera for air-to-air inspections, and a UAV carrying a capture mechanism and equipped with radars and a camera. Our proposed system features 1) scalability to larger areas due to the distributed approach and online processing, 2) long-term cooperative missions, and 3) complementary multimodal perception for the detection of multirotor UAVs. In field experiments, we demonstrate the integration of all subsystems in accomplishing a counter-UAS task within an unstructured environment. The obtained results confirm the promising direction of using multi-robot and multi-modal systems for C-UAS.
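The complementary multimodal perception described above implies fusing detections from different sensors. As a minimal illustrative sketch (not the paper's implementation), the position estimates of an intruder UAV from a radar and a camera can be combined by inverse-covariance weighting; the function name, covariances, and measurements below are all assumptions:

```python
import numpy as np

def fuse_detections(radar_xy, radar_cov, cam_xy, cam_cov):
    """Minimal two-sensor fusion: combine radar and camera position
    estimates of a target by inverse-covariance (information) weighting."""
    W_r = np.linalg.inv(radar_cov)   # radar information matrix
    W_c = np.linalg.inv(cam_cov)     # camera information matrix
    cov = np.linalg.inv(W_r + W_c)   # fused covariance
    xy = cov @ (W_r @ radar_xy + W_c @ cam_xy)
    return xy, cov

# Hypothetical measurements: radar is trusted twice as much per axis.
radar = np.array([10.0, 5.0])
cam = np.array([10.4, 5.2])
fused, P = fuse_detections(radar, np.diag([0.5, 0.5]), cam, np.diag([1.0, 1.0]))
```

The fused estimate lands between the two measurements, closer to the lower-variance radar reading.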
Bird's eye view: Cooperative exploration by UGV and UAV
This paper proposes a solution to the problem of cooperative exploration using an Unmanned Ground Vehicle (UGV) and an Unmanned Aerial Vehicle (UAV). More specifically, the UGV navigates through the free space, and the UAV provides enhanced situational awareness via its higher vantage point. The motivating application is search and rescue in a damaged building. A camera atop the UGV is used to track a fiducial tag on the underside of the UAV, allowing the UAV to maintain a fixed pose relative to the UGV. Furthermore, the UAV uses its front-facing camera to provide a bird's-eye view to the remote operator, allowing for observation beyond obstacles that obscure the UGV's sensors. The proposed approach has been tested using a TurtleBot 2 equipped with a Hokuyo laser range finder and a Parrot Bebop 2. Experimental results demonstrate the feasibility of this approach. This work is based on several open source packages and the generated code is available online.
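The fixed-pose tracking described above can be sketched as a simple proportional controller: the tag detection gives the UAV's position in the UGV camera frame, and the controller commands a velocity toward the desired offset. This is a hedged sketch under assumed names and gains, not the authors' controller:

```python
import numpy as np

def pose_hold_command(tag_pos, desired_offset, kp=0.8):
    """Proportional pose hold: given the fiducial-tag position of the UAV
    measured in the UGV camera frame and a desired fixed offset, return a
    velocity command that drives the position error toward zero."""
    error = np.asarray(desired_offset, float) - np.asarray(tag_pos, float)
    return kp * error

# Hypothetical reading: UAV 0.2 m ahead and 1.0 m up; hold (0, 0, 1.5).
cmd = pose_hold_command([0.2, 0.0, 1.0], [0.0, 0.0, 1.5])
```

A real system would add damping and tag-loss handling, but the proportional term is the core of holding a relative pose.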
- Award ID(s):
- 1637876
- PAR ID:
- 10127556
- Date Published:
- Journal Name:
- International Conference on Unmanned Aircraft Systems (ICUAS)
- Page Range / eLocation ID:
- 247 to 255
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
This paper discusses the results of a field experiment conducted at Savannah River National Laboratory to test the performance of several algorithms for the localization of radioactive materials. In this multirobot system, an unmanned aerial vehicle (a custom hexacopter) and an unmanned ground vehicle (UGV, the ClearPath Jackal), both equipped with γ-ray spectrometers, were used to collect data from two radioactive source configurations. Both the Fourier scattering transform and the Laplacian eigenmap algorithms for source detection were tested on the collected data sets. These algorithms transform raw spectral measurements into alternate spaces, where clustering can detect trends within the data that indicate the presence of radioactive sources. This study also presents a point-source model and an accompanying information-theoretic active exploration algorithm. Field testing validated the ability of this model to fuse aerial and ground-collected radiation measurements, and the exploration algorithm's ability to select informative actions that reduce model uncertainty, allowing the UGV to locate radioactive material online.
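The Laplacian eigenmap step mentioned above can be sketched in a few lines: build a Gaussian affinity over raw spectra, form the graph Laplacian, and use its low-order eigenvectors as an embedding in which source and background readings tend to separate. The function, bandwidth heuristic, and synthetic spectra below are illustrative assumptions, not the paper's code:

```python
import numpy as np

def laplacian_eigenmap(X, dim=2, sigma=None):
    """Embed samples X (n_samples x n_channels) via the eigenvectors of
    the unnormalized graph Laplacian of a Gaussian affinity matrix."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    if sigma is None:
        sigma = np.sqrt(np.median(d2)) + 1e-12           # median-distance bandwidth
    W = np.exp(-d2 / (2 * sigma**2))                     # Gaussian affinity
    D = np.diag(W.sum(1))
    L = D - W                                            # unnormalized Laplacian
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:dim + 1]                            # skip the trivial eigenvector

# Synthetic stand-in for spectra: two clusters of 16-channel readings.
rng = np.random.default_rng(0)
spectra = np.vstack([rng.normal(0, 1, (5, 16)), rng.normal(5, 1, (5, 16))])
Y = laplacian_eigenmap(spectra)
```

Clustering in the embedded space `Y` (e.g., by k-means) then plays the role the abstract describes: detecting trends that indicate a source.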
Human action recognition is an important topic in artificial intelligence with a wide range of applications, including surveillance systems, search-and-rescue operations, and human-computer interaction. However, most current action recognition systems use videos captured by stationary cameras. Another emerging technology is the use of unmanned ground and aerial vehicles (UAV/UGV) for tasks such as transportation, traffic control, border patrolling, and wildlife monitoring. This technology has become more popular in recent years due to its affordability, high maneuverability, and limited human intervention. However, no efficient action recognition algorithm exists for UAV-based monitoring platforms. This paper considers UAV-based video action recognition by addressing the key issues of aerial imaging systems, such as camera motion and vibration, low resolution, and small apparent human size. In particular, we propose an automated deep-learning-based action recognition system comprising three stages: video stabilization using SURF feature selection and the Lucas-Kanade method, human action area detection using faster region-based convolutional neural networks (R-CNN), and action recognition. We propose a novel structure that extends and modifies the InceptionResNet-v2 architecture by combining a 3D CNN architecture and a residual network for action recognition. We achieve an average accuracy of 85.83% for entire-video-level recognition when applying our algorithm to the popular UCF-ARG aerial imaging dataset. This accuracy significantly improves upon the state-of-the-art accuracy by a margin of 17%.
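The stabilization stage above pairs SURF features with Lucas-Kanade tracking; as a heavily simplified, hedged sketch of the underlying idea, a single Lucas-Kanade step can estimate a global translation between two frames, which can then be undone to stabilize the video. The translation-only model and synthetic frames below are illustrative assumptions:

```python
import numpy as np

def lk_translation(prev, curr):
    """One translation-only Lucas-Kanade step: linearize curr(x) ~
    prev(x) - Ix*dx - Iy*dy and solve for the global shift (dx, dy)
    by least squares over all pixels."""
    Iy, Ix = np.gradient(prev.astype(float))       # spatial gradients
    It = curr.astype(float) - prev.astype(float)   # temporal difference
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    d, *_ = np.linalg.lstsq(A, -It.ravel(), rcond=None)
    return d  # (dx, dy)

# Synthetic check: a smooth image shifted one pixel along x.
x = np.linspace(0, 2 * np.pi, 64)
frame = np.sin(x)[None, :] * np.ones((64, 1))
shifted = np.roll(frame, 1, axis=1)
dx, dy = lk_translation(frame, shifted)
```

The full pipeline tracks sparse SURF keypoints and fits a richer motion model, but each track rests on this same linearized brightness-constancy step.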
The advancement of 3D modeling applications in various domains has been significantly propelled by innovations in 3D computer vision models. However, the efficacy of these models, particularly in large-scale 3D reconstruction, depends on the quality and coverage of the viewpoints. This paper addresses optimizing the trajectory of an unmanned aerial vehicle (UAV) to collect the optimal Next-Best View (NBV) for 3D reconstruction models. Unlike traditional methods that rely on predefined criteria or continuous tracking of the 3D model's development, our approach leverages reinforcement learning to select the NBV based solely on single camera images and the UAV's positions relative to reference points on the target. The UAV is positioned with respect to four reference waypoints at the structure's corners, maintaining its orientation (field of view) towards the structure. Our approach removes the need for constant monitoring of 3D reconstruction accuracy during policy learning, ultimately boosting both the efficiency and autonomy of the data collection process. The implications of this research extend to applications in inspection, surveillance, and mapping, where optimal viewpoint selection is crucial for information gain and operational efficiency.
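The reinforcement-learning selection over four corner waypoints can be caricatured with tabular Q-learning: states and actions are the waypoints, and each visit earns an assumed per-view information-gain reward. The reward values, hyperparameters, and the tabular formulation are all illustrative assumptions; the paper's agent learns from camera images, not a hand-set table:

```python
import numpy as np

# Toy next-best-view selection: 4 corner waypoints as both states and
# actions, with a hypothetical per-view information-gain reward.
rng = np.random.default_rng(1)
n_views = 4
reward = np.array([0.1, 0.9, 0.3, 0.5])   # assumed gain of each view
Q = np.zeros((n_views, n_views))           # Q[current view, next view]
alpha, gamma, eps = 0.5, 0.9, 0.3

state = 0
for _ in range(5000):
    # epsilon-greedy action selection
    a = int(rng.integers(n_views)) if rng.random() < eps else int(Q[state].argmax())
    # standard Q-learning update toward reward plus discounted best future value
    Q[state, a] += alpha * (reward[a] + gamma * Q[a].max() - Q[state, a])
    state = a

best_next = int(Q[0].argmax())  # greedy next view from waypoint 0
```

With these assumed rewards the greedy policy converges on the most informative view, mirroring how the learned policy steers the UAV toward viewpoints that maximize reconstruction gain.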
Vision-based sensing, when utilized in conjunction with camera-equipped unmanned aerial vehicles (UAVs), has recently emerged as an effective sensing technique in a variety of civil engineering applications (e.g., construction monitoring, condition assessment, and post-disaster reconnaissance). However, the use of these non-intrusive sensing techniques for extracting the dynamic response of structures has been restricted due to the perspective and scale distortions or image misalignments caused by the movement of the UAV and its on-board camera during flight operations. To overcome these limitations, a vision-based analysis methodology is proposed in the present study for extracting the dynamic response of structures using UAV aerial videos. Importantly, geo-referenced targets were strategically placed on the structures and in the background (stationary) region to enhance the robustness and accuracy of image feature detection. Image processing and photogrammetric techniques are adopted in the analysis procedures, first to recover the camera motion using the world-to-image correspondences of the background (stationary) targets, and subsequently to extract the dynamic structural response by reprojecting the image features of the (moving) targets attached to the structures to world coordinates. The displacement tracking results are validated using the responses of two full-scale test structures measured by analog displacement sensors during a sequence of shake table tests. The high level of precision (less than 3 mm root-mean-square error) of the vision-based structural displacement results demonstrates the effectiveness of the proposed UAV displacement tracking methodology. Additionally, the limitations and potential solutions associated with the proposed methodology for monitoring the dynamic responses of real structures are discussed.
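The reprojection step above can be sketched under a planar-scene assumption: the stationary geo-referenced targets define an image-to-world homography (estimated here by the direct linear transform), which then maps the moving targets' image features into world coordinates frame by frame. The point coordinates and function names below are hypothetical, and the actual study uses full photogrammetric camera-motion recovery rather than a single homography:

```python
import numpy as np

def dlt_homography(img_pts, world_pts):
    """Direct linear transform: estimate the image-to-world homography
    from >= 4 stationary reference targets with known world coordinates."""
    A = []
    for (u, v), (x, y) in zip(img_pts, world_pts):
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 3)   # null vector of A, reshaped to H

def to_world(H, uv):
    """Map an image point to world coordinates and dehomogenize."""
    p = H @ np.array([uv[0], uv[1], 1.0])
    return p[:2] / p[2]

# Assumed background targets: image pixels -> world metres on a plane.
img = [(100, 100), (300, 100), (300, 300), (100, 300)]
world = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
H = dlt_homography(img, world)
pt = to_world(H, (200, 200))   # a structural target at the image centre
```

Re-estimating `H` on every frame from the stationary targets is what cancels the UAV's own motion before the structural displacement is read off.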