

Search for: All records

Award ID contains: 2215082

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Radars are widely adopted for autonomous navigation and vehicular networking due to their robustness to weather conditions as compared to visible-light cameras and lidars. However, radars currently struggle to differentiate static vs. tangentially moving objects within a single radar frame, since both yield the same Doppler along line-of-sight paths to the radar. Prior solutions deploy multiple radar or visible-light camera modules to form a multi-“look” synthetic aperture that estimates the tangential and radial velocity components of moving objects from a single frame, leading to higher system costs. In this paper, we propose to exploit multi-bounce scattering from secondary static objects in the environment, e.g., building pillars and walls, to form an effective multi-“look” synthetic aperture for single-frame velocity vector estimation with a single multiple-input, multiple-output (MIMO) radar, thus reducing the overall system cost and removing the need for multi-module synchronization. We present a comprehensive theoretical and experimental evaluation of our scheme, demonstrating a 4.5× reduction in the error of estimating moving objects’ velocity vectors over comparable single-radar baselines.
    Free, publicly-accessible full text available July 30, 2026
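    The static-vs-tangential ambiguity described above follows directly from the geometry of Doppler: the measured shift depends only on the projection of an object's velocity onto the line of sight. A minimal sketch (illustrative only, not the paper's method) makes this concrete:

    ```python
    import numpy as np

    # Why a single radar frame cannot distinguish a static object from a
    # tangentially moving one: the Doppler shift is proportional to the
    # radial velocity, i.e. the projection of the object's velocity onto
    # the line-of-sight unit vector from the radar.

    def radial_velocity(position, velocity):
        """Project velocity onto the line of sight from a radar at the origin."""
        los = position / np.linalg.norm(position)  # unit line-of-sight vector
        return float(np.dot(velocity, los))

    p = np.array([10.0, 0.0])            # object 10 m away along x
    v_static = np.array([0.0, 0.0])      # static object
    v_tangential = np.array([0.0, 5.0])  # moving perpendicular to line of sight
    v_radial = np.array([5.0, 0.0])      # moving along the line of sight

    print(radial_velocity(p, v_static))      # 0.0
    print(radial_velocity(p, v_tangential))  # 0.0 -> same Doppler as static
    print(radial_velocity(p, v_radial))      # 5.0
    ```

    A second "look" from a different direction (here, via a multi-bounce path off a static reflector) changes the line-of-sight vector, which is what breaks the ambiguity.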
  2. We explore synthetic aperture radar (SAR) 3D imaging capabilities in the sub-THz band using a novel 110-260 GHz experimental testbed. We propose RADar Implicit SHapes (RADISH), a post-processing method that leverages the SAR’s millimeter-level imaging resolution to estimate an object’s 3D shape. RADISH first converts high-resolution SAR images to a detailed point cloud of a scanned object. The point cloud is then used to fit an implicit neural representation to the object’s surface by approximating the signed distance function of the scene.3 We experimentally validate RADISH’s ability to represent the salient geometric features of real-world 3D objects.
    Free, publicly-accessible full text available June 2, 2026
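    The signed distance function (SDF) that RADISH approximates with a neural network can be illustrated with a simple analytic shape (this sphere example is a stand-in for the learned network, not the paper's implementation): the SDF of a point is its distance to the surface, negative inside and positive outside, and the surface itself is the zero level set.

    ```python
    import numpy as np

    # Illustrative sketch: a signed distance function (SDF) for a sphere.
    # RADISH fits a neural network to the SDF of a scanned object; here an
    # analytic sphere stands in for the learned representation.

    def sphere_sdf(points, center, radius):
        """Signed distance from each point to the sphere's surface."""
        return np.linalg.norm(points - center, axis=-1) - radius

    center = np.array([0.0, 0.0, 0.0])
    pts = np.array([[0.0, 0.0, 0.0],   # center of sphere: inside
                    [1.0, 0.0, 0.0],   # on the surface (zero level set)
                    [2.0, 0.0, 0.0]])  # outside
    d = sphere_sdf(pts, center, radius=1.0)
    print(d)  # [-1.  0.  1.]
    ```

    Fitting an implicit representation then amounts to training a network so its output matches such signed distances at the point-cloud samples.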
  3. An emerging application of wireless sensing is locating and tracking humans in their living environments, a primitive that can be leveraged in both daily life applications and emergency situations. However, most proposed methods have limited spatial resolution when multiple humans are in close vicinity. The problem becomes exacerbated when there is no line-of-sight path to the humans. In this paper, we consider multi-person localization of humans in close vicinity of each other. We propose the use of synthetic aperture radar that combines both translation and rotation to increase effective aperture size, leveraging small rhythmic changes in the radar range due to human breathing. We experimentally evaluate the proposed algorithm in both line-of-sight and through-wall cases with three to five humans in the scene. Our experimental results show that: (i) larger synthetic apertures due to radar translation improve multi-person localization, e.g., by 1.42× when the aperture size is increased by a factor of 2×, and (ii) rotation can largely compensate for gains provided by translation, e.g., rotating the radar over 360° without changing the aperture size results in 1.22× gains over no rotation. Overall, maximal gains of 2.19× are achieved by rotating and translating over a 2× larger aperture. 
    Free, publicly-accessible full text available March 7, 2026
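    The "small rhythmic changes in the radar range due to human breathing" are detectable because a millimeter-scale range change produces a large phase swing at mmWave frequencies. A back-of-the-envelope sketch (the 77 GHz carrier is an assumed example, not a value from the paper):

    ```python
    import numpy as np

    # The two-way phase of a radar echo changes by 4*pi*dr/lambda for a
    # range change dr, so millimeter-scale chest motion from breathing
    # swings the phase by several radians at mmWave frequencies.
    # The carrier frequency below is an assumed example value.

    c = 3e8                 # speed of light, m/s
    f = 77e9                # assumed mmWave carrier frequency, Hz
    wavelength = c / f      # ~3.9 mm

    dr = 1e-3               # 1 mm chest displacement from breathing
    dphi = 4 * np.pi * dr / wavelength
    print(round(dphi, 2))   # 3.23 rad: easily detectable phase swing
    ```

    This phase sensitivity is what lets the synthetic aperture pick out each person's breathing signature even without a line-of-sight path.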
  4. In this paper, we ask, "Can millimeter-wave (mmWave) radars sense objects not directly illuminated by the radar - for instance, objects located outside the transmit beamwidth, behind occlusions, or placed fully behind the radar?" Traditionally, mmWave radars are limited to sense objects that are directly illuminated by the radar and scatter its signals directly back. In practice, however, radar signals scatter to other intermediate objects in the environment and undergo multiple bounces before being received back at the radar. In this paper, we present Hydra, a framework to explicitly model and exploit multi-bounce paths for sensing. Hydra enables standalone mmWave radars to sense beyond-field-of-view objects without prior knowledge of the environment. We extensively evaluate the localization performance of Hydra with an off-the-shelf mmWave radar in five different environments with everyday objects. Exploiting multi-bounce via Hydra provides 2×-10× improvement in the median beyond-field-of-view localization error over baselines. 
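    The key geometric idea behind exploiting multi-bounce paths can be sketched with a single flat reflector (geometry only; Hydra's actual algorithm models multipath in the received radar signal): an echo that bounces once off a wall behaves as if it traveled a straight line from a "virtual" radar mirrored across the wall, which is what lets a beyond-field-of-view object be localized.

    ```python
    import numpy as np

    # Conceptual sketch: a one-bounce path off a flat wall equals a direct
    # path from a "virtual" radar mirrored across the wall plane.

    def mirror_across_wall(point, wall_x):
        """Mirror a 2D point across a vertical wall at x = wall_x."""
        return np.array([2 * wall_x - point[0], point[1]])

    def wall_hit(virtual, obj, wall_x):
        """Where the virtual-radar-to-object line crosses the wall plane."""
        t = (wall_x - virtual[0]) / (obj[0] - virtual[0])
        return virtual + t * (obj - virtual)

    radar = np.array([0.0, 0.0])
    obj = np.array([3.0, 4.0])   # object, e.g. outside the field of view
    wall_x = 5.0                 # reflecting wall at x = 5

    virtual = mirror_across_wall(radar, wall_x)  # virtual radar at (10, 0)
    hit = wall_hit(virtual, obj, wall_x)         # bounce point on the wall

    # Physical path radar -> wall -> object vs. straight-line virtual path:
    physical_path = np.linalg.norm(hit - radar) + np.linalg.norm(obj - hit)
    virtual_path = np.linalg.norm(obj - virtual)
    print(np.isclose(physical_path, virtual_path))  # True
    ```

    With the wall geometry unknown, the challenge Hydra addresses is doing this inference without prior knowledge of the environment.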