Unmanned aerial vehicles (UAVs) rely on optical sensors such as cameras and lidar for autonomous operation. However, such optical sensors are error-prone in bad lighting, in inclement weather conditions including fog and smoke, and around textureless or transparent surfaces. In this paper, we ask: is it possible to fly UAVs without relying on optical sensors, i.e., can UAVs fly without seeing? We present BatMobility, a lightweight mmWave radar-only perception system for UAVs that eliminates the need for optical sensors. BatMobility enables two core functionalities for UAVs – radio flow estimation (a novel FMCW radar-based alternative to optical flow based on surface-parallel Doppler shift) and radar-based collision avoidance. We build BatMobility using commodity sensors and deploy it as a real-time system on a small off-the-shelf quadcopter running an unmodified flight controller. Our evaluation shows that BatMobility achieves comparable or better performance than commercial-grade optical sensors across a wide range of scenarios.
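As a rough illustration of the Doppler–velocity relation that radio flow estimation builds on: an FMCW radar observes a Doppler shift proportional to the radial velocity of a reflecting surface. This is a minimal sketch, not the paper's method; the 77 GHz carrier frequency is an assumption typical of automotive-class mmWave radars.

```python
# Radial velocity implied by a Doppler shift (illustrative sketch, not
# BatMobility's implementation): v_r = f_d * wavelength / 2, where the
# factor of 2 accounts for the round trip of the reflected signal.
C = 3e8  # speed of light, m/s

def radial_velocity(doppler_hz: float, carrier_hz: float = 77e9) -> float:
    """Return the radial velocity (m/s) implied by a Doppler shift (Hz)."""
    wavelength = C / carrier_hz
    return doppler_hz * wavelength / 2.0
```

At 77 GHz, a Doppler shift of roughly 513 Hz corresponds to about 1 m/s of surface-parallel motion, which gives a sense of the resolution available to a radar-based flow estimator.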
This content will become publicly available on June 3, 2025
Radarize: Enhancing Radar SLAM with Generalizable Doppler-Based Odometry
Millimeter-wave (mmWave) radar is increasingly being considered as an alternative to optical sensors for robotic primitives like simultaneous localization and mapping (SLAM). While mmWave radar overcomes some limitations of optical sensors, such as occlusions, poor lighting conditions, and privacy concerns, it also faces unique challenges, such as missed obstacles due to specular reflections or fake objects due to multipath. To address these challenges, we propose Radarize, a self-contained SLAM pipeline that uses only a commodity single-chip mmWave radar. Our radar-native approach uses techniques such as Doppler shift-based odometry and multipath artifact suppression to improve performance. We evaluate our method on a large dataset of 146 trajectories collected across 4 buildings on 3 different platforms, totaling approximately 4.7 km of travel distance. Our results show that our method outperforms state-of-the-art radar and radar-inertial approaches by approximately 5x in terms of odometry and 8x in terms of end-to-end SLAM, as measured by absolute trajectory error (ATE), without the need for additional sensors such as IMUs or wheel encoders.
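ATE, the metric cited above, is the root-mean-square positional error between corresponding poses of the estimated and ground-truth trajectories. This minimal sketch assumes the trajectories are already time-associated and rigidly aligned (a full evaluation would first apply an alignment step such as Umeyama registration):

```python
import numpy as np

def absolute_trajectory_error(est, gt):
    """RMSE of Euclidean position error between an estimated trajectory
    and ground truth, both given as N x D arrays of aligned positions."""
    diffs = np.asarray(est) - np.asarray(gt)
    return float(np.sqrt(np.mean(np.sum(diffs**2, axis=1))))
```

Because ATE accumulates global drift rather than per-step error, it rewards odometry that stays consistent over long trajectories, which is why it is the standard figure of merit for end-to-end SLAM comparisons like the one above.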
- Award ID(s): 2148583
- PAR ID: 10521355
- Publisher / Repository: ACM MobiSys 2024
- Date Published:
- ISBN: 9798400705816
- Page Range / eLocation ID: 331 to 344
- Format(s): Medium: X
- Location: Minato-ku, Tokyo, Japan
- Sponsoring Org: National Science Foundation
More Like this
Using millimeter wave (mmWave) signals for imaging has an important advantage: they can penetrate poor environmental conditions such as fog, dust, and smoke that severely degrade optical-based imaging systems. However, unlike cameras and LiDARs, mmWave radars suffer from low angular resolution because of small physical apertures and conventional signal processing techniques. Sparse radar imaging, on the other hand, can increase the aperture size while minimizing power consumption and readout bandwidth. This paper presents CoIR, an analysis-by-synthesis method that leverages the implicit neural network bias in convolutional decoders and compressed sensing to perform high-accuracy sparse radar imaging. The proposed system is dataset-agnostic and does not require any auxiliary sensors for training or testing. We introduce a sparse array design that allows for a 5.5× reduction in the number of antenna elements needed compared to conventional MIMO array designs. We demonstrate our system's improved imaging performance over standard mmWave radars and other competitive untrained methods on both simulated and experimental mmWave radar data.
This work attempts to answer two questions. (1) Can we use the odometry information from two different Simultaneous Localization And Mapping (SLAM) algorithms to get a better estimate of the odometry? (2) If one of the SLAM algorithms is affected by shot noise or attack vectors, can we recover? To answer the first question, we focus on fusing odometries from Lidar-based SLAM and Visual-based SLAM using the Extended Kalman Filter (EKF) algorithm. The second question is answered by introducing the Maximum Correntropy Criterion - Extended Kalman Filter (MCC-EKF), which assists in removing or minimizing shot noise or attack vectors injected into the system. We manually simulate shot noise and observe how our system responds to the noise vectors. We also evaluate our approach on the KITTI dataset for self-driving cars.
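The core intuition behind fusing two odometry estimates with a Kalman-style filter can be sketched for a single scalar state: each estimate is weighted by the other's uncertainty, so the fused variance is never worse than either input. This is an illustrative simplification, not the paper's EKF or MCC-EKF implementation:

```python
def fuse_scalar(x1, var1, x2, var2):
    """Minimum-variance fusion of two independent scalar estimates.
    The gain k weights x2 more heavily when x1's variance is large."""
    k = var1 / (var1 + var2)   # Kalman-style gain
    x = x1 + k * (x2 - x1)     # fused estimate
    var = (1.0 - k) * var1     # fused variance <= min(var1, var2)
    return x, var
```

The MCC variant described above effectively down-weights measurements whose residuals are implausibly large, which is what limits the influence of shot noise or injected attack vectors on the fused state.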
This paper presents research concerning the use of visual-inertial Simultaneous Localization And Mapping (SLAM) algorithms to aid in Continuous Wave (CW) radar target mapping. SLAM is an established field in which radar has been used to internally contribute to the localization algorithms. Instead, the application in this case is to use SLAM outputs to localize radar data and construct three-dimensional target maps which can be viewed live in augmented reality. These methods are transferable to other types of radar units and sensors, but this paper shows how they can be applied to calculate depth efficiently with CW radar through triangulation using a Boolean intersection algorithm. Localization of the radar target is achieved through quaternion algebra. Due to the compact nature of the SLAM and CW devices, the radar unit can be operated entirely handheld. Targets are scanned in a free-form manner with no need for a gridded scanning layout. The main advantage of this method is that it eliminates many hours of usage training and expertise, removing ambiguity in the location, size, and depth of buried or hidden targets. Additionally, this method grants the user the additional power, penetration, and sensitivity of CW radar while overcoming its inherent lack of range finding. Applications include pipe and buried structure location, avalanche rescue, structural health monitoring, and historical site research.
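Localizing a radar return via quaternion algebra amounts to rotating the sensor-frame offset by the SLAM-estimated orientation and adding the sensor position. A minimal sketch under that reading (function names and the (w, x, y, z) convention are assumptions, not from the paper):

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    # Efficient rotation identity: v' = v + 2 u x (u x v + w v)
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def localize_target(sensor_pos, sensor_quat, offset_body):
    """Map a radar return's sensor-frame offset into world coordinates
    using a SLAM pose (world position + orientation quaternion)."""
    return sensor_pos + quat_rotate(sensor_quat, offset_body)
```

Depth recovery then follows from intersecting the localized target rays from multiple free-form viewpoints, which is the role the Boolean intersection algorithm plays above.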
The ubiquity of millimeter-wave (mmWave) technology could bring through-obstruction imaging to portable, mobile systems. Existing through-obstruction imaging systems rely on the Synthetic Aperture Radar (SAR) technique, but emulating the SAR principle on hand-held devices has been challenging. We propose ViSAR, a portable platform that integrates an optical camera and mmWave radar to emulate the SAR principle and enable through-obstruction 3D imaging. ViSAR synchronizes the devices at the software level and uses the Time Domain Backprojection algorithm to generate vision-augmented mmWave images. We have experimentally evaluated ViSAR by imaging several indoor objects.
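Time Domain Backprojection, the algorithm named above, coherently sums each antenna position's echo at every pixel's round-trip delay, so returns from a real scatterer add in phase while clutter averages out. A simplified 2-D sketch under idealized assumptions (complex baseband echoes, known aperture positions; all parameter names are illustrative, not ViSAR's API):

```python
import numpy as np

C = 3e8  # speed of light, m/s

def backproject(pixels, positions, echoes, fs, fc):
    """Time-domain backprojection over a 2-D pixel grid. `echoes[i]` is the
    complex baseband return recorded at aperture position `positions[i]`,
    sampled at rate `fs` with carrier frequency `fc`."""
    image = np.zeros(len(pixels), dtype=complex)
    for pos, echo in zip(positions, echoes):
        dist = np.linalg.norm(pixels - pos, axis=1)           # one-way range
        delay = 2.0 * dist / C                                # round-trip time
        idx = np.clip((delay * fs).astype(int), 0, len(echo) - 1)
        image += echo[idx] * np.exp(2j * np.pi * fc * delay)  # phase-align
    return np.abs(image)
```

The vision side of the system supplies the aperture positions: the camera pose estimates stand in for the precisely controlled platform motion that conventional SAR assumes.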