

Title: Improving Surface Current Resolution Using Direction Finding Algorithms for Multiantenna High-Frequency Radars
Abstract

While land-based high-frequency (HF) radars are the only instruments capable of resolving both the temporal and spatial variability of surface currents in the coastal ocean, recent high-resolution views suggest that the coastal ocean is more complex than presently deployed radar systems are able to reveal. This work uses a hybrid system, having elements of both phased arrays and direction finding radars, to improve the azimuthal resolution of HF radars. Data from two radars deployed along the U.S. East Coast and configured as 8-antenna grid arrays were used to evaluate potential direction finding and signal, or emitter, detection methods. Direction finding methods such as maximum likelihood estimation generally performed better than the well-known multiple signal classification (MUSIC) method given identical emitter detection methods. However, accurately estimating the number of emitters present in HF radar observations is a challenge. As MUSIC’s direction-of-arrival (DOA) function permits simple empirical tests that dramatically aid the detection process, MUSIC was found to be the superior method in this study. The 8-antenna arrays were able to provide more accurate estimates of MUSIC’s noise subspace than typical 3-antenna systems, eliminating the need for a series of empirical parameters to control MUSIC’s performance. Code developed for this research has been made available in an online repository.
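As a rough illustration of the direction finding step described in the abstract, the sketch below evaluates MUSIC's DOA function for a single range-Doppler cell from a multi-antenna covariance estimate. The antenna geometry, wavelength, snapshot layout, and assumed emitter count are illustrative placeholders, not values or code from the paper's online repository.

```python
# Minimal MUSIC DOA-function sketch for one range-Doppler cell of a
# multi-antenna HF radar; geometry and parameters are illustrative.
import numpy as np

def music_doa_spectrum(snapshots, antenna_xy, wavelength, n_emitters, bearings_deg):
    """snapshots: (n_antennas, n_snapshots) complex voltages for one cell;
    antenna_xy: (n_antennas, 2) antenna positions in meters."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)                      # eigenvalues ascending
    noise_sub = eigvecs[:, : R.shape[0] - n_emitters]         # noise subspace estimate
    k = 2.0 * np.pi / wavelength
    theta = np.deg2rad(bearings_deg)
    # Plane-wave steering vectors for each candidate bearing
    steer = np.exp(1j * k * (antenna_xy[:, [0]] * np.cos(theta)
                             + antenna_xy[:, [1]] * np.sin(theta)))
    proj = noise_sub.conj().T @ steer                         # projection onto noise subspace
    return 1.0 / np.sum(np.abs(proj) ** 2, axis=0)            # MUSIC DOA function

# Peaks of the returned function mark candidate bearings for the assumed
# number of emitters; with 8 antennas the noise subspace is estimated from
# more eigenvectors than a 3-antenna system can provide.
```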

 
Award ID(s):
1658475 1657896 1831937 1736709
NSF-PAR ID:
10121038
Author(s) / Creator(s):
Publisher / Repository:
American Meteorological Society
Date Published:
Journal Name:
Journal of Atmospheric and Oceanic Technology
Volume:
36
Issue:
10
ISSN:
0739-0572
Page Range / eLocation ID:
p. 1997-2014
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like This
  1. Abstract: Previous work with simulations of oceanographic high-frequency (HF) radars has identified possible improvements when using maximum likelihood estimation (MLE) for direction of arrival; however, methods for determining the number of emitters (here defined as spatially distinct patches of the ocean surface) have not realized these improvements. Here we describe and evaluate the use of the likelihood ratio (LR) for emitter detection, demonstrating its application to oceanographic HF radar data. The combined detection-estimation method MLE-LR is compared with the multiple signal classification (MUSIC) method and the MUSIC parameter settings used for SeaSonde HF radars, along with a method developed for 8-channel systems known as MUSIC-Highest. Results show that MLE-LR produces accuracy similar to previous methods in terms of RMS differences and squared correlation coefficients. We demonstrate that improved accuracy can be obtained for both methods, at the cost of fewer velocity observations and decreased spatial coverage. For SeaSondes, accuracy improvements are obtained with less commonly used parameter sets. MLE-LR is also shown to resolve simultaneous, closely spaced emitters, which has the potential to improve observations obtained by HF radars operating in complex current environments.
     Significance Statement: We identify and test a method based on the likelihood ratio (LR) for determining the number of signal sources in observations subject to direction finding with maximum likelihood estimation (MLE). Direction-finding methods are used in broad-ranging applications that include radar, sonar, and wireless communication. Previous work suggests accuracy improvements when using MLE, but suitable methods for determining the number of simultaneous signal sources are not well known. Our work shows that the LR, when combined with MLE, performs at least as well as alternative methods when applied to oceanographic HF radars. In some situations, MLE-LR obtains superior resolution, where resolution is defined as the ability to distinguish closely spaced signal sources.
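A hedged sketch of the MLE-LR idea summarized above: fit candidate emitter counts by maximum likelihood, then use a likelihood-ratio statistic to decide how many emitters to keep. The bearing grid search, residual-power likelihood, and threshold are illustrative assumptions rather than the paper's exact formulation.

```python
# Illustrative MLE direction finding with a likelihood-ratio (LR) test for
# the number of emitters in one range-Doppler cell; not the paper's code.
import itertools
import numpy as np

def steering(bearings_rad, antenna_xy, wavelength):
    k = 2.0 * np.pi / wavelength
    return np.exp(1j * k * (antenna_xy[:, [0]] * np.cos(bearings_rad)
                            + antenna_xy[:, [1]] * np.sin(bearings_rad)))

def mle_fit(R, bearings_grid_rad, antenna_xy, wavelength, m):
    """Grid search for the m-emitter bearing set minimizing the residual
    noise power tr[(I - P_A) R]; returns (bearings, residual power)."""
    best = (None, np.inf)
    for combo in itertools.combinations(bearings_grid_rad, m):
        A = steering(np.array(combo), antenna_xy, wavelength)
        P = A @ np.linalg.pinv(A)                           # projector onto signal model
        resid = np.real(np.trace((np.eye(R.shape[0]) - P) @ R))
        if resid < best[1]:
            best = (np.array(combo), resid)
    return best

def lr_detect(R, n_snapshots, bearings_grid_rad, antenna_xy, wavelength,
              max_emitters=3, threshold=10.0):
    """Add emitters only while the LR statistic exceeds an assumed threshold."""
    doas, resid = mle_fit(R, bearings_grid_rad, antenna_xy, wavelength, 1)
    for m in range(2, max_emitters + 1):
        doas_m, resid_m = mle_fit(R, bearings_grid_rad, antenna_xy, wavelength, m)
        lr_stat = n_snapshots * R.shape[0] * np.log(resid / resid_m)
        if lr_stat < threshold:
            break
        doas, resid = doas_m, resid_m
    return doas
```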
  2. In this paper we derive a new capability for robots to measure relative direction, or Angle-of-Arrival (AOA), to other robots while operating in non-line-of-sight and unmapped environments, without requiring external infrastructure. We do so by capturing all of the paths that a WiFi signal traverses as it travels from a transmitting to a receiving robot in the team, which we term an AOA profile. The key intuition behind our approach is to emulate antenna arrays in the air as a robot moves freely in 2D or 3D space. The small differences in the phase and amplitude of WiFi signals are processed with knowledge of a robot's local displacements (often provided by inertial sensors) to obtain the profile, via a method akin to Synthetic Aperture Radar (SAR). The main contribution of this work is the development of i) a framework to accommodate arbitrary 2D and 3D trajectories, as well as continuous mobility of both transmitting and receiving robots, while computing AOA profiles between them, and ii) an accompanying analysis, based on the Cramér-Rao bound and antenna array theory, that provides a lower bound on the variance of AOA estimation as a function of robot trajectory geometry. This is a critical distinction from previous work on SAR that restricts robot mobility to prescribed motion patterns, does not generalize to the full 3D space, and/or requires transmitting robots to be static during data acquisition periods. In fact, we find that allowing robots to use their full mobility in 3D space while performing SAR results in more accurate AOA profiles and thus better AOA estimation. We formally characterize this observation as the informativeness of the trajectory, a computable quantity for which we derive a closed form. All theoretical developments are substantiated by extensive simulation and hardware experiments on air/ground robot platforms. Our experimental results bolster our theoretical findings, demonstrating that 3D trajectories provide enhanced and consistent accuracy, with AOA error of less than 10 degrees for 95% of trials. We also show that our formulation can be used with an off-the-shelf trajectory estimation sensor (Intel RealSense T265 tracking camera) for estimating the robots' local displacements, and we provide theoretical as well as empirical results that show the impact of typical trajectory estimation errors on the measured AOA. Finally, we demonstrate the performance of our system on a multi-robot task in which a heterogeneous air/ground pair of robots continuously measures AOA profiles over a WiFi link to achieve dynamic rendezvous in an unmapped, 300 square meter environment with occlusions.
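To make the SAR-style construction of an AOA profile concrete, the sketch below coherently combines WiFi channel samples collected along a known trajectory against plane-wave direction hypotheses. The single-antenna, far-field model and all variable names are assumptions for illustration, not the authors' implementation.

```python
# Illustrative 2D (azimuth x elevation) AOA profile built by emulating an
# antenna array from a moving receiver's channel samples (SAR-style).
import numpy as np

def aoa_profile(csi, positions, wavelength, az_grid_deg, el_grid_deg):
    """csi: (n_meas,) complex channel samples; positions: (n_meas, 3)
    receiver displacements in meters from odometry."""
    k = 2.0 * np.pi / wavelength
    az = np.deg2rad(az_grid_deg)[:, None]                  # (n_az, 1)
    el = np.deg2rad(el_grid_deg)[None, :]                  # (1, n_el)
    # Unit direction vectors for every (azimuth, elevation) hypothesis
    u = np.stack([np.cos(el) * np.cos(az),
                  np.cos(el) * np.sin(az),
                  np.sin(el) * np.ones_like(az)], axis=-1)          # (n_az, n_el, 3)
    # Phase each direction hypothesis predicts at each measurement position
    phase = k * np.tensordot(u, positions.T, axes=([2], [0]))       # (n_az, n_el, n_meas)
    # Coherently sum the measurements after removing the predicted phase
    profile = np.abs(np.tensordot(np.exp(-1j * phase), csi, axes=([2], [0]))) ** 2
    return profile / profile.max()    # peaks mark candidate arrival directions
```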
  3. Using millimeter wave (mmWave) signals for imaging has an important advantage: they can penetrate poor environmental conditions such as fog, dust, and smoke that severely degrade optical imaging systems. However, mmWave radars, unlike cameras and LiDARs, suffer from low angular resolution because of small physical apertures and conventional signal processing techniques. Sparse radar imaging, on the other hand, can increase the aperture size while minimizing power consumption and readout bandwidth. This paper presents CoIR, an analysis-by-synthesis method that leverages the implicit neural network bias in convolutional decoders and compressed sensing to perform high-accuracy sparse radar imaging. The proposed system is dataset-agnostic and does not require any auxiliary sensors for training or testing. We introduce a sparse array design that allows for a 5.5× reduction in the number of antenna elements needed compared to conventional MIMO array designs. We demonstrate our system's improved imaging performance over standard mmWave radars and other competitive untrained methods on both simulated and experimental mmWave radar data.
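The analysis-by-synthesis idea behind untrained-network radar imaging can be sketched as fitting an untrained convolutional decoder so that a forward sensing model of its output matches the sparse measurements. The network shape, optimizer, loss, and the forward_model callable below are illustrative assumptions; they are not CoIR's architecture or sensing model.

```python
# Hedged sketch of analysis-by-synthesis imaging with an untrained
# convolutional decoder acting as an implicit prior; illustrative only.
import torch
import torch.nn as nn

def reconstruct(measurements, forward_model, image_hw=(64, 64), steps=2000):
    """measurements: real-valued measurement vector y; forward_model: callable
    mapping a (1, 1, H, W) image to simulated measurements (e.g. a sparse-array
    sensing operator supplied by the user)."""
    decoder = nn.Sequential(                       # untrained conv decoder
        nn.Conv2d(8, 32, 3, padding=1), nn.ReLU(),
        nn.Upsample(scale_factor=2, mode="bilinear"),
        nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
        nn.Upsample(scale_factor=2, mode="bilinear"),
        nn.Conv2d(16, 1, 3, padding=1),
    )
    z = torch.randn(1, 8, image_hw[0] // 4, image_hw[1] // 4)   # fixed random code
    opt = torch.optim.Adam(decoder.parameters(), lr=1e-3)
    for _ in range(steps):
        opt.zero_grad()
        image = decoder(z)
        loss = torch.mean((forward_model(image) - measurements) ** 2)
        loss.backward()                            # fit decoder weights to the data
        opt.step()
    return decoder(z).detach()                     # reconstructed radar image
```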
  4. In this paper, we develop the analytical framework for a novel Wireless signal-based Sensing capability for Robotics (WSR) by leveraging a robot's mobility in 3D space. It primarily allows robots to measure relative direction, or Angle-of-Arrival (AOA), to other robots while operating in non-line-of-sight, unmapped environments and without requiring external infrastructure. We do so by capturing all of the paths that a wireless signal traverses as it travels from a transmitting to a receiving robot in the team, which we term an AOA profile. The key intuition behind our approach is to enable a robot to emulate antenna arrays as it moves freely in 2D and 3D space. The small differences in the phase of the wireless signals are processed with knowledge of the robots' local displacement to obtain the profile, via a method akin to Synthetic Aperture Radar (SAR). The main contribution of this work is the development of (i) a framework to accommodate arbitrary 2D and 3D motion, as well as continuous mobility of both signal transmitting and receiving robots, while computing AOA profiles between them, and (ii) a Cramér-Rao bound analysis, based on antenna array theory, that provides a lower bound on the variance in AOA estimation as a function of the geometry of robot motion. This is a critical distinction from previous work on SAR-based methods that restrict robot mobility to prescribed motion patterns, do not generalize to the full 3D space, and require transmitting robots to be stationary during data acquisition periods. We show that allowing robots to use their full mobility in 3D space while performing SAR results in more accurate AOA profiles and thus better AOA estimation. We formally characterize this observation as the informativeness of the robots' motion, a computable quantity for which we derive a closed form. All analytical developments are substantiated by extensive simulation and hardware experiments on air/ground robot platforms using 5 GHz WiFi. Our experimental results bolster our analytical findings, demonstrating that 3D motion provides enhanced and consistent accuracy, with a total AOA error of less than 10° for 95% of trials. We also analytically characterize the impact of displacement estimation errors on the measured AOA and validate this theory empirically using robot displacements obtained with an off-the-shelf Intel Tracking Camera T265. Finally, we demonstrate the performance of our system on a multi-robot task in which a heterogeneous air/ground pair of robots continuously measures AOA profiles over a WiFi link to achieve dynamic rendezvous in an unmapped, 300 m² environment with occlusions.
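As a complement to the AOA-profile sketch above, the following toy calculation shows how a Cramér-Rao-style lower bound on azimuth variance depends on trajectory geometry under a simple plane-wave, known-SNR model. The paper's actual bound and its informativeness closed form are more general; this only illustrates how trajectory shape enters such a bound.

```python
# Toy Cramer-Rao-style bound on azimuth (AOA) variance as a function of the
# synthetic-aperture sample positions; model and names are assumptions.
import numpy as np

def aoa_crb_azimuth(positions_xy, wavelength, snr_linear, azimuth_rad):
    """positions_xy: (n, 2) receiver positions in meters along the trajectory."""
    k = 2.0 * np.pi / wavelength
    # Sensitivity of each sample's phase to a small change in azimuth
    d_u = np.array([-np.sin(azimuth_rad), np.cos(azimuth_rad)])
    dphase = k * positions_xy @ d_u                      # (n,)
    fisher = 2.0 * snr_linear * np.sum(dphase ** 2)      # Fisher information
    return 1.0 / fisher                                  # variance lower bound (rad^2)

# A trajectory spread perpendicular to the arrival direction yields larger
# phase sensitivities and hence a smaller bound: geometry sets informativeness.
```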

     
  5. Emerging autonomous driving systems require reliable perception of 3D surroundings. Unfortunately, current mainstream perception modalities, i.e., camera and Lidar, are vulnerable under challenging lighting and weather conditions. On the other hand, despite their all-weather operation, today's vehicle Radars are limited to location and speed detection. In this paper, we introduce MILLIPOINT, a practical system that advances Radar sensing to generate 3D point clouds. The key design principle of MILLIPOINT lies in enabling synthetic aperture radar (SAR) imaging on low-cost commodity vehicle Radars. To this end, MILLIPOINT models the relation between signal variations and Radar movement, enabling self-tracking of the Radar at wavelength-scale precision and thus realizing coherent spatial sampling. Furthermore, MILLIPOINT solves the unique problem of specular reflection by properly focusing on the targets with post-imaging processing. It also exploits the Radar's built-in antenna array to estimate the height of reflecting points and eventually generates 3D point clouds. We have implemented MILLIPOINT on a commodity vehicle Radar. Our evaluation results show that MILLIPOINT effectively combats motion errors and specular reflections, and can construct 3D point clouds with much higher density and resolution than existing vehicle Radar solutions.
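A minimal back-projection sketch of the synthetic aperture step described above: each measurement's range-compressed return is added coherently into every image sample using the estimated radar positions. MILLIPOINT's FMCW processing, wavelength-scale self-tracking, and specular-reflection handling are omitted; the function and names below are illustrative assumptions, not the paper's implementation.

```python
# Illustrative SAR back-projection with a moving single radar; assumes
# range-compressed returns and externally estimated radar positions.
import numpy as np

def sar_backprojection(range_profiles, radar_positions, range_bins, wavelength, pixels):
    """range_profiles: (n_meas, n_bins) complex range-compressed returns;
    radar_positions: (n_meas, 3) position estimates (m); range_bins: sorted
    bin centers (m); pixels: (n_pix, 3) image sample points."""
    k = 2.0 * np.pi / wavelength
    image = np.zeros(pixels.shape[0], dtype=complex)
    for profile, pos in zip(range_profiles, radar_positions):
        d = np.linalg.norm(pixels - pos, axis=1)                       # radar-to-pixel distance
        idx = np.clip(np.searchsorted(range_bins, d), 0, len(range_bins) - 1)
        image += profile[idx] * np.exp(1j * 2.0 * k * d)               # undo two-way phase
    return np.abs(image)    # coherent sum sharpens true reflectors
```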