

Title: A wireless signal-based sensing framework for robotics

In this paper, we develop the analytical framework for a novel Wireless signal-based Sensing capability for Robotics (WSR) by leveraging a robot's mobility in 3D space. It allows robots to primarily measure relative direction, or Angle-of-Arrival (AOA), to other robots, while operating in non-line-of-sight unmapped environments and without requiring external infrastructure. We do so by capturing all of the paths that a wireless signal traverses as it travels from a transmitting to a receiving robot in the team, which we term an AOA profile. The key intuition behind our approach is to enable a robot to emulate antenna arrays as it moves freely in 2D and 3D space. The small differences in the phase of the wireless signals are thus processed with knowledge of the robots' local displacement to obtain the profile, via a method akin to Synthetic Aperture Radar (SAR). The main contribution of this work is the development of (i) a framework to accommodate arbitrary 2D and 3D motion, as well as continuous mobility of both signal-transmitting and receiving robots, while computing AOA profiles between them, and (ii) a Cramer–Rao Bound analysis, based on antenna array theory, that provides a lower bound on the variance in AOA estimation as a function of the geometry of robot motion. This is a critical distinction from previous work on SAR-based methods, which restrict robot mobility to prescribed motion patterns, do not generalize to the full 3D space, and require transmitting robots to be stationary during data acquisition periods. We show that allowing robots to use their full mobility in 3D space while performing SAR results in more accurate AOA profiles and thus better AOA estimation. We formally characterize this observation as the informativeness of the robots' motion, a computable quantity for which we derive a closed form. All analytical developments are substantiated by extensive simulation and hardware experiments on air/ground robot platforms using 5 GHz WiFi. Our experimental results bolster our analytical findings, demonstrating that 3D motion provides enhanced and consistent accuracy, with a total AOA error of less than 10° for 95% of trials. We also analytically characterize the impact of displacement estimation errors on the measured AOA and validate this theory empirically using robot displacements obtained with an off-the-shelf Intel RealSense Tracking Camera T265. Finally, we demonstrate the performance of our system on a multi-robot task where a heterogeneous air/ground pair of robots continuously measures AOA profiles over a WiFi link to achieve dynamic rendezvous in an unmapped, 300 m² environment with occlusions.
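To make the SAR-style computation above concrete, the following is a minimal, illustrative sketch of a Bartlett-type estimator that forms an AOA profile by coherently combining complex channel measurements against direction hypotheses derived from the receiving robot's 3D displacements. All names and parameter values here (the 0.06 m wavelength for 5 GHz WiFi, the angular grid sizes) are assumptions for illustration, not the paper's exact formulation.

    import numpy as np

    def aoa_profile(h, disp, wavelength=0.06, n_az=360, n_el=90):
        """Illustrative Bartlett-style AOA profile: coherently combine complex
        channel samples h[k], measured as the receiver undergoes 3D displacement
        disp[k], against steering hypotheses for each candidate direction.

        h    : (K,) complex channel measurements (one per received packet)
        disp : (K, 3) receiver displacement in meters at each measurement
        Returns an (n_el, n_az) profile over candidate (elevation, azimuth).
        """
        az = np.linspace(-np.pi, np.pi, n_az)
        el = np.linspace(0.0, np.pi / 2, n_el)
        # Unit direction vectors for every (elevation, azimuth) hypothesis.
        u = np.stack([np.outer(np.cos(el), np.cos(az)),
                      np.outer(np.cos(el), np.sin(az)),
                      np.outer(np.sin(el), np.ones_like(az))], axis=-1)  # (n_el, n_az, 3)
        # Expected phase at each displacement for each hypothesized direction.
        phase = 2 * np.pi / wavelength * np.tensordot(u, disp.T, axes=([2], [0]))
        # Coherent combination (emulated antenna array / SAR-style summation).
        profile = np.abs(np.tensordot(np.exp(-1j * phase), h, axes=([2], [0]))) ** 2
        return profile / profile.max()

    # The peak of the profile is the AOA estimate:
    # i_el, i_az = np.unravel_index(np.argmax(profile), profile.shape)

In the actual system the channel samples would come from WiFi channel measurements and the displacements from onboard odometry; this sketch only covers a single moving receiver with a stationary transmitter, whereas the framework above also supports continuous motion of both endpoints.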

NSF-PAR ID: 10373448
Journal Name: The International Journal of Robotics Research
Volume: 41
Issue: 11-12
Page Range or eLocation-ID: p. 955-992
ISSN: 0278-3649
Publisher: SAGE Publications
Sponsoring Org: National Science Foundation
More Like this
  1. In this paper we derive a new capability for robots to measure relative direction, or Angle-of-Arrival (AOA), to other robots, while operating in non-line-of-sight and unmapped environments, without requiring external infrastructure. We do so by capturing all of the paths that a WiFi signal traverses as it travels from a transmitting to a receiving robot in the team, which we term an AOA profile. The key intuition behind our approach is to emulate antenna arrays in the air as a robot moves freely in 2D or 3D space. The small differences in the phase and amplitude of WiFi signals are thus processed with knowledge of a robot's local displacements (often provided via inertial sensors) to obtain the profile, via a method akin to Synthetic Aperture Radar (SAR). The main contribution of this work is the development of (i) a framework to accommodate arbitrary 2D and 3D trajectories, as well as continuous mobility of both transmitting and receiving robots, while computing AOA profiles between them and (ii) an accompanying analysis that provides a lower bound on the variance of AOA estimation as a function of robot trajectory geometry, based on the Cramer–Rao Bound and antenna array theory. This is a critical distinction from previous work on SAR, which restricts robot mobility to prescribed motion patterns, does not generalize to the full 3D space, and/or requires transmitting robots to be static during data acquisition periods. In fact, we find that allowing robots to use their full mobility in 3D space while performing SAR results in more accurate AOA profiles and thus better AOA estimation. We formally characterize this observation as the informativeness of the trajectory, a computable quantity for which we derive a closed form. All theoretical developments are substantiated by extensive simulation and hardware experiments on air/ground robot platforms. Our experimental results bolster our theoretical findings, demonstrating that 3D trajectories provide enhanced and consistent accuracy, with AOA error of less than 10 deg for 95% of trials. We also show that our formulation can be used with an off-the-shelf trajectory estimation sensor (Intel RealSense T265 tracking camera) for estimating the robots' local displacements, and we provide theoretical as well as empirical results that show the impact of typical trajectory estimation errors on the measured AOA. Finally, we demonstrate the performance of our system on a multi-robot task where a heterogeneous air/ground pair of robots continuously measures AOA profiles over a WiFi link to achieve dynamic rendezvous in an unmapped, 300 square meter environment with occlusions. (A simulation sketch of the displacement-error effect appears after this list.)
  2. In this work, we present a battery-less wireless Micro-Electro-Mechanical (MEMS)-based respiration sensor capable of measuring the respiration profile of a human subject from up to 2 m distance from the transceiver unit for a mean excitation power of 80 µW and a measured SNR of 124.8 dB at 0.5 m measurement distance. The sensor, with a footprint of ~10 cm², is designed to be inexpensive, maximize user mobility, and cater to applications where disposability is desirable to minimize the sanitation burden. The sensing system is composed of a custom UHF RFID antenna, a low-loss piezoelectric MEMS resonator with two modes within the frequency range of interest, and a base transceiver unit. The difference in temperature and moisture content of inhaled and exhaled air modulates the resonance frequency of the MEMS resonator, which in turn is used to monitor respiration. To detect changes in the resonance frequency of the MEMS devices, the sensor is excited by a pulsed sinusoidal signal received through an external antenna directly coupled to the device. The signal reflected from the device through the antenna is then analyzed via Fast Fourier Transform (FFT) to extract and monitor the resonance frequency of the resonator. By tracking the resonance frequency over time, the respiration profile of a patient is tracked. A compensation method for the removal of motion-induced artifacts and drift is proposed and implemented using the difference in the resonance frequency of two resonance modes of the same resonator. (A resonance-tracking sketch appears after this list.)
  3. Unmanned Aerial Vehicles (UAVs) often lack the size, weight, and power to support large antenna arrays or a large number of radio chains. Despite such limitations, emerging applications that require the use of swarms, where UAVs form a pattern and coordinate towards a common goal, must have the capability to transmit in any direction in three-dimensional (3D) space from moment to moment. In this work, we design a measurement study to evaluate the role of antenna polarization diversity on UAV systems communicating in arbitrary 3D space. To do so, we construct flight patterns where one transmitting UAV hovers at a high altitude (80 m) and a receiving UAV hovers at 114 different positions that span 3D space at a radial distance of approximately 20 m along equally-spaced elevation and azimuth angles. To understand the role of diverse antenna polarizations, both UAVs have a horizontally mounted antenna and a vertically mounted antenna, each attached to a dedicated radio chain, creating four wireless channels. With this measurement campaign, we seek to understand how to optimally select an antenna orientation and quantify the gains of such selections. (A channel-selection sketch appears after this list.)
  4. Underwater motion recognition using acoustic wireless networks has promising potential to be applied to diver activity monitoring and aquatic animal recognition without the burden of expensive underwater cameras, which have been used by image-based underwater classification techniques. However, accurately extracting features that are independent of complicated underwater environments, such as inhomogeneous deep seawater, is a serious challenge for underwater motion recognition. Velocities of the target body (VTB) during motion are excellent environment-independent features for WiFi-based recognition techniques in indoor environments; however, VTB features are hard to extract accurately in underwater environments. The inaccurate VTB estimation is caused by the fact that the signal propagates along a curve underwater, rather than along a straight line as it does in the air. In this paper, we propose an underwater motion recognition mechanism for inhomogeneous deep seawater using acoustic wireless networks. To accurately extract VTB features, we first derive Doppler Frequency Shift (DFS) coefficients that can be utilized for VTB estimation when signals propagate deviously. Secondly, we propose a dynamic self-refining (DSR) optimization algorithm with acoustic wireless networks that consist of multiple transmitter-receiver links to estimate the VTB. These VTB features can then be utilized to train convolutional neural networks (CNNs). Through simulation, the estimated VTB features are evaluated, and the recognition results validate that our proposed underwater motion recognition mechanism is able to achieve high classification accuracy. (A Doppler-relation sketch appears after this list.)
  5. In the next wave of swarm-based applications, unmanned aerial vehicles (UAVs) need to communicate with peer drones in any direction of a three-dimensional (3D) space. On a given drone and across drones, various antenna positions and orientations are possible. We know that, in free space, high levels of signal loss are expected if the transmitting and receiving antennas are cross polarized. However, increasing the reflective and scattering objects in the channel between a transmitter and receiver can cause the received polarization to become completely independent from the transmitted polarization, making the cross-polarization of antennas insignificant. Usually, these effects are studied in the context of cellular and terrestrial networks and have not been analyzed when those objects are the actual bodies of the communicating drones, which can take different relative directions or move at various elevations. In this work, we show that the body of the drone can affect the received power across various antenna orientations and positions and act as a local scatterer that increases channel depolarization, reducing the cross-polarization discrimination (XPD). To investigate these effects, we perform experimentation that is staged in terms of complexity, from a controlled environment of an anechoic chamber with and without drone bodies to in-field environments where drone-mounted antennas are in flight with various orientations and relative positions, with the following outcomes: (i) drone relative direction can significantly impact the XPD values, (ii) elevation angle is a critical factor in 3D link performance, (iii) antenna spacing requirements are altered for co-located cross-polarized antennas, and (iv) cross-polarized antenna setups more than double spectral efficiency. Our results can serve as a guide for accurately simulating and modeling UAV networks and drone swarms. (An XPD computation sketch appears after this list.)
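For item 1 above (and the main record), the impact of trajectory-estimation error on AOA can be illustrated with a small Monte Carlo experiment. The sketch below assumes an ideal single-path 5 GHz channel, a planar true bearing, and Gaussian per-sample odometry noise; all trajectory and noise values are hypothetical placeholders, and the papers provide the analytical characterization of this effect.

    import numpy as np

    rng = np.random.default_rng(0)
    lam = 0.06                     # 5 GHz WiFi wavelength (m)
    true_az = np.deg2rad(40.0)     # hypothetical ground-truth bearing
    u_true = np.array([np.cos(true_az), np.sin(true_az), 0.0])

    # Hypothetical 3D receiver trajectory (K displacement samples, meters).
    K = 200
    t = np.linspace(0, 1, K)
    disp = np.column_stack([0.3 * np.cos(2 * np.pi * t),
                            0.3 * np.sin(2 * np.pi * t),
                            0.2 * t])

    h = np.exp(1j * 2 * np.pi / lam * disp @ u_true)   # ideal far-field channel

    def estimate_az(h, disp):
        """1D Bartlett scan over azimuth hypotheses (elevation fixed at zero)."""
        az = np.linspace(-np.pi, np.pi, 720)
        u = np.column_stack([np.cos(az), np.sin(az), np.zeros_like(az)])
        steering = np.exp(-1j * 2 * np.pi / lam * u @ disp.T)    # (720, K)
        return az[np.argmax(np.abs(steering @ h))]

    # Monte Carlo over odometry noise levels (std of per-sample position error, m).
    for sigma in [0.0, 0.005, 0.01, 0.02]:
        errs = []
        for _ in range(100):
            noisy = disp + rng.normal(scale=sigma, size=disp.shape)
            diff = estimate_az(h, noisy) - true_az
            errs.append(np.rad2deg(np.abs(np.angle(np.exp(1j * diff)))))
        print(f"sigma={sigma*100:.1f} cm -> mean AOA error {np.mean(errs):.2f} deg")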
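For item 2, the core readout step is tracking a resonance frequency over time via an FFT of each reflected burst. The sketch below uses synthetic data with placeholder values (sample rate, resonance frequency, modulation depth, burst rate); the actual device parameters, the pulsed excitation hardware, and the two-mode drift compensation are not reproduced.

    import numpy as np

    fs = 1.0e6      # assumed readout sample rate (Hz); placeholder value
    win = 8192      # samples per reflected burst

    def resonance_freq(burst):
        """Peak frequency of one burst via FFT, with parabolic interpolation
        around the peak bin for sub-bin resolution."""
        spec = np.abs(np.fft.rfft(burst * np.hanning(len(burst))))
        k = int(np.argmax(spec[1:-1])) + 1
        d = 0.5 * (spec[k - 1] - spec[k + 1]) / (spec[k - 1] - 2 * spec[k] + spec[k + 1])
        return (k + d) * fs / len(burst)

    # Synthetic example: a resonator near 100 kHz whose frequency is slowly
    # modulated by breathing (about 15 breaths/min if bursts arrive at ~4 Hz);
    # all numbers are illustrative only.
    t = np.arange(win) / fs
    breath = np.sin(2 * np.pi * np.arange(120) / 16)   # one value per burst
    f_res = 100e3 + 500.0 * breath                     # Hz
    profile = [resonance_freq(np.sin(2 * np.pi * f * t)) for f in f_res]
    # 'profile' tracked over burst index is the respiration waveform; subtracting
    # the track of a second resonance mode (not shown) compensates motion/drift.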
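For item 3, one simple way to frame antenna-orientation selection is to pick, among the four polarization channels, the one with the highest average received signal strength. The RSS values below are hypothetical placeholders purely for illustration; the study's actual measurements and analysis are in the paper.

    import numpy as np

    # Hypothetical RSS samples (dBm) for the four channels formed by the
    # horizontally (H) and vertically (V) mounted antennas on each UAV;
    # keys are (tx_antenna, rx_antenna).
    rng = np.random.default_rng(1)
    channels = {("H", "H"): rng.normal(-62, 3, 200),
                ("H", "V"): rng.normal(-74, 3, 200),
                ("V", "H"): rng.normal(-73, 3, 200),
                ("V", "V"): rng.normal(-65, 3, 200)}

    mean_rss = {k: float(np.mean(v)) for k, v in channels.items()}
    best = max(mean_rss, key=mean_rss.get)
    fixed = ("V", "V")   # a naive fixed choice, for comparison
    print(f"best channel {best}: {mean_rss[best]:.1f} dBm "
          f"(gain over fixed {fixed}: {mean_rss[best] - mean_rss[fixed]:.1f} dB)")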
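For item 4, the baseline relation that the paper's DFS coefficients refine is the straight-path narrowband Doppler formula linking frequency shift to the rate of change of the propagation path length. The sketch shows only this first-order relation with nominal values; the curved-path correction and the DSR algorithm are the paper's contributions and are not reproduced.

    C_SOUND = 1500.0    # nominal speed of sound in seawater (m/s); placeholder

    def path_rate_from_dfs(doppler_shift_hz: float, carrier_hz: float) -> float:
        """Rate of change of the propagation path length (m/s) implied by a
        narrowband Doppler shift: d(path)/dt = -Δf * c / f0. Mapping this to
        the velocity of the target body depends on the link geometry and,
        underwater, on the curved propagation paths the paper corrects for."""
        return -doppler_shift_hz * C_SOUND / carrier_hz

    # Example: a +20 Hz shift on a 30 kHz acoustic carrier implies the path is
    # shortening at about 1 m/s.
    print(path_rate_from_dfs(20.0, 30e3))   # -1.0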
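For item 5, cross-polarization discrimination is the ratio of co-polarized to cross-polarized received power. A minimal helper with hypothetical sample values is sketched below; the staged chamber and in-flight results themselves are only in the paper.

    import numpy as np

    def xpd_db(co_pol_dbm, cross_pol_dbm) -> float:
        """Cross-polarization discrimination: average co-polarized received
        power minus average cross-polarized received power, in dB (averaging
        in the linear domain). Lower XPD indicates stronger channel
        depolarization, e.g. from scattering off the drone body."""
        co = np.mean(10 ** (np.asarray(co_pol_dbm) / 10))
        cross = np.mean(10 ** (np.asarray(cross_pol_dbm) / 10))
        return 10 * np.log10(co / cross)

    # Hypothetical samples (dBm): a near-free-space case vs. a body-shadowed case.
    print(xpd_db([-60, -61, -59], [-78, -80, -79]))   # ~19 dB: little depolarization
    print(xpd_db([-60, -61, -59], [-66, -67, -65]))   # ~6 dB: strong depolarization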