Search for: All records

Award ID contains: 2239569

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the publisher's embargo period.
What is a DOI Number?

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Smart clothing has exhibited impressive body pose/movement tracking capabilities while preserving the soft, comfortable, and familiar nature of clothing. For practical everyday use, smart clothing should (1) be available in a range of sizes to accommodate different fit preferences, and (2) be washable to allow repeated use. In SeamFit, we demonstrate washable T-shirts, embedded with capacitive seam electrodes and available in three different sizes, for exercise logging. Our T-shirt design and customized signal-processing and machine-learning pipeline allow the SeamFit system to generalize across users, fits, and wash cycles. Prior wearable exercise-logging solutions, which often attach a miniaturized sensor to a single body location, struggle to track exercises that mainly involve other body parts. The SeamFit T-shirt naturally covers a large area of the body and still tracks exercises that mainly involve uncovered joints (e.g., elbows and the lower body). In a user study with 15 participants performing 14 exercises, SeamFit detects exercises with an accuracy of 89%, classifies exercises with an accuracy of 93.4%, and counts repetitions with an average error of 0.9 counts. SeamFit is a step towards practical smart clothing for everyday use.
    Free, publicly-accessible full text available March 3, 2026
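The repetition counts above come from tracking periodic motion in the seam-electrode signals. As a hedged illustration only (SeamFit's actual pipeline is not reproduced here), counting repetitions in a 1-D sensor trace can be sketched as peak detection with a minimum spacing between peaks; the function name, threshold, and synthetic trace below are illustrative assumptions:

```python
import math

# Illustrative sketch, not SeamFit's real algorithm: count local maxima
# above `threshold` that are at least `min_gap` samples apart.
def count_reps(signal, threshold, min_gap):
    reps, last_peak = 0, -min_gap
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] > signal[i - 1]
                and signal[i] >= signal[i + 1]
                and i - last_peak >= min_gap):
            reps += 1
            last_peak = i
    return reps

# Synthetic capacitance-like trace: 5 cycles of a 50-sample sine wave
trace = [math.sin(2 * math.pi * t / 50) for t in range(250)]
print(count_reps(trace, threshold=0.5, min_gap=20))  # 5
```

A real pipeline would first segment and denoise the signal and would adapt the threshold per exercise; this sketch only shows the counting idea.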
  2. Smart glasses have become more prevalent as they provide an increasing number of applications for users. They store various types of private information or can access it via connections established with other devices. Therefore, there is a growing need for user identification on smart glasses. In this paper, we introduce SonicID, a low-power and minimally obtrusive system designed to authenticate users on glasses. SonicID extracts unique biometric information by scanning the user's face with ultrasonic waves and uses this information to distinguish between users, powered by a customized binary classifier built on the ResNet-18 architecture. SonicID can authenticate a user with a face scan lasting just 0.06 seconds. A user study involving 40 participants confirms that SonicID achieves a true positive rate of 97.4%, a false positive rate of 4.3%, and a balanced accuracy of 96.6% using just 1 minute of training data collected for each new user. This performance remains relatively consistent across remounting sessions and days. Given this promising performance, we further discuss potential applications of SonicID and methods to improve its performance in the future.
    Free, publicly-accessible full text available November 21, 2025
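The reported metrics are internally consistent: balanced accuracy is the mean of the true-positive rate and the true-negative rate (TNR = 1 - FPR), which matters here because authentication data is imbalanced between the genuine user and impostors. A quick check against the figures above:

```python
# Balanced accuracy = (TPR + TNR) / 2, with TNR = 1 - FPR.
tpr, fpr = 0.974, 0.043   # rates reported for SonicID
balanced_accuracy = (tpr + (1 - fpr)) / 2
print(balanced_accuracy)  # ~0.9655, i.e. the reported 96.6%
```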
  3. We present ActSonic, an intelligent, low-power active acoustic sensing system integrated into eyeglasses that can recognize 27 different everyday activities (e.g., eating, drinking, toothbrushing) from inaudible acoustic waves around the body. It requires only a pair of miniature speakers and microphones mounted on each hinge of the eyeglasses to emit ultrasonic waves, creating an acoustic aura around the body. The acoustic signals are reflected based on the position and motion of various body parts, captured by the microphones, and analyzed by a customized self-supervised deep learning framework to infer the performed activities on a remote device such as a mobile phone or cloud server. ActSonic was evaluated in user studies with 19 participants across 19 households to assess its efficacy in everyday activity recognition. Without requiring any training data from new users (leave-one-participant-out evaluation), ActSonic detected 27 activities, achieving an average F1-score of 86.6% in fully unconstrained scenarios and 93.4% in prompted settings at participants' homes.
    Free, publicly-accessible full text available November 21, 2025
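A leave-one-participant-out evaluation holds out each participant's data in turn while a model trains on everyone else's, so the reported scores reflect performance on people the model has never seen. A minimal sketch of the protocol (the `train`/`evaluate` stubs are placeholders, not ActSonic's actual deep-learning pipeline):

```python
def leave_one_participant_out(data_by_participant, train, evaluate):
    """Train on all participants except one; test on the held-out person.

    data_by_participant: dict mapping participant id -> that person's data.
    Returns a dict of per-participant scores (e.g. F1).
    """
    scores = {}
    for held_out, test_data in data_by_participant.items():
        train_data = [d for p, d in data_by_participant.items()
                      if p != held_out]
        model = train(train_data)
        scores[held_out] = evaluate(model, test_data)
    return scores
```

Averaging the per-participant scores then yields a single user-independent figure like the 86.6% F1 reported above.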
  4. We present Ring-a-Pose, a single untethered ring that tracks continuous 3D hand poses. Located in the center of the hand, the ring emits an inaudible acoustic signal that each hand pose reflects differently. Unlike multi-ring or glove systems, Ring-a-Pose is minimally obtrusive on the hand, and unlike wrist-worn systems, it is not affected by clothing that may cover the wrist. In a series of three user studies with a total of 36 participants, we evaluate Ring-a-Pose's performance on pose tracking and micro-finger gesture recognition. Without collecting any training data from a user, Ring-a-Pose tracks continuous hand poses with a joint error of 14.1mm. The joint error decreases to 10.3mm for fine-tuned user-dependent models. Ring-a-Pose recognizes 7-class micro-gestures with 90.60% and 99.27% accuracy for user-independent and user-dependent models, respectively. Furthermore, the ring exhibits promising performance when worn on any finger. Ring-a-Pose points towards future smart rings that track and recognize hand poses using relatively low-power acoustic sensing.
    Free, publicly-accessible full text available November 21, 2025
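The joint-error figures above are mean Euclidean distances between predicted and ground-truth 3-D joint positions. A hedged sketch of that metric (the function name is an assumption, and the paper may average differently, e.g. per frame and then per joint):

```python
import math

def mean_joint_error(pred_joints, true_joints):
    """Mean Euclidean distance (e.g. in mm) over corresponding 3-D joints."""
    dists = [math.dist(p, t) for p, t in zip(pred_joints, true_joints)]
    return sum(dists) / len(dists)

# Two joints, off by 3 mm and 4 mm respectively -> mean error 3.5 mm
print(mean_joint_error([(0, 0, 0), (1, 0, 0)],
                       [(0, 0, 3), (1, 4, 0)]))  # 3.5
```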