- Award ID(s):
- 1951880
- NSF-PAR ID:
- 10439719
- Date Published:
- Journal Name:
- IEEE Radar Conference
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
While radio frequency (RF) based respiration monitoring for at-home health screening is receiving increasing attention, robustness remains an open challenge. In recent work, deep learning (DL) methods have been shown to be effective in dealing with non-linear issues ranging from multi-path interference to motion disturbance, thus improving the accuracy of RF-based respiration monitoring. However, such DL methods usually require large amounts of training data with intensive manual labeling effort, and such data are frequently not openly available. We propose RF-Q for robust RF-based respiration monitoring, using self-supervised learning with an autoencoder (AE) neural network to quantify the quality of the respiratory signal based on the residual between the original and reconstructed signals. We demonstrate that, by simply quantifying the signal quality with the AE for weighted estimation, we can boost the end-to-end (e2e) respiration monitoring accuracy by an improvement ratio of 2.75 compared to a baseline.
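As a rough illustration of the idea in the abstract above, the sketch below scores each RF respiration window by how well an autoencoder reconstructs it. The `autoencoder` callable, the per-window MSE residual, and the mapping from residual to a (0, 1] score are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def reconstruction_quality(windows, autoencoder, eps=1e-8):
    """Score respiration windows by autoencoder reconstruction error.

    `windows` is an (N, T) array of candidate respiration waveforms and
    `autoencoder` is any callable mapping (N, T) -> (N, T) reconstructions,
    e.g. an AE trained only on clean respiration segments (hypothetical
    interface).  Windows the AE reconstructs well score near 1; distorted
    windows (motion, interference) reconstruct poorly and score lower.
    """
    recon = autoencoder(windows)
    residual = np.mean((windows - recon) ** 2, axis=1)   # per-window MSE
    # Illustrative mapping from residual to a (0, 1] quality score; the
    # paper's exact normalization is not reproduced here.
    return 1.0 / (1.0 + residual / (np.median(residual) + eps))
```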
-
Continuous monitoring of respiration provides invaluable insights into health status (e.g., the progression or recovery of diseases). Recent advancements in radio frequency (RF) technologies show promise for continuous respiration monitoring by virtue of their non-invasive nature, and they are preferred over wearable solutions that require frequent charging and continuous wearing. However, RF signals are susceptible to large body movements, which are inevitable in real life, challenging the robustness of respiration monitoring. While many existing methods have been proposed to achieve robust RF-based respiration monitoring, their reliance on supervised data limits their potential for broad applicability. In this context, we propose RF-Q, an unsupervised/self-supervised model for signal quality assessment and quality-aware estimation toward robust RF-based respiration monitoring. RF-Q uses the reconstruction error of an autoencoder (AE) neural network to quantify the quality of respiratory information in RF signals without the need for data labeling. By combining the quantified signal quality and the reconstructed signal in a weighted fusion, we are able to achieve improved robustness of RF respiration monitoring. We demonstrate that, instead of applying sophisticated models devised with domain expertise using a considerable amount of labeled data, by just quantifying the signal quality in an unsupervised manner we can significantly boost the average end-to-end (e2e) respiratory rate estimation accuracy of a baseline by an improvement ratio of 2.75, higher than the gain of 1.94 achieved by a supervised baseline method that excludes distorted data.
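Building on such a quality score, a minimal sketch of a quality-aware (weighted) fusion step follows. The linear weighting and the median fallback are assumptions for illustration, not RF-Q's published estimator.

```python
import numpy as np

def fuse_rate_estimates(rates_bpm, qualities):
    """Combine per-window respiratory-rate estimates into one e2e estimate.

    `rates_bpm` holds a rate estimate (breaths per minute) for each RF
    window, and `qualities` holds the AE-derived quality score for the same
    window.  Low-quality (motion-corrupted) windows are down-weighted
    rather than discarded, which is the quality-aware fusion idea; the
    linear weighting itself is an illustrative choice.
    """
    rates = np.asarray(rates_bpm, dtype=float)
    w = np.asarray(qualities, dtype=float)
    if w.sum() <= 0:                     # all windows judged unusable
        return float(np.median(rates))   # fall back to a robust average
    return float(np.sum(w * rates) / np.sum(w))

# Example: a window corrupted by body motion (quality 0.1) barely moves
# the fused estimate away from the clean windows.
print(fuse_rate_estimates([15.2, 14.8, 27.0], [0.9, 0.95, 0.1]))
```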
-
Many people listen to music for hours every day, often near bedtime. We investigated whether music listening affects sleep, focusing on a rarely explored mechanism: involuntary musical imagery (earworms). In Study 1 (N = 199, mean age = 35.9 years), individuals who frequently listen to music reported persistent nighttime earworms, which were associated with worse sleep quality. In Study 2 (N = 50, mean age = 21.2 years), we randomly assigned each participant to listen to lyrical or instrumental-only versions of popular songs before bed in a laboratory, discovering that instrumental music increased the incidence of nighttime earworms and worsened polysomnography-measured sleep quality. In both studies, earworms were experienced during awakenings, suggesting that the sleeping brain continues to process musical melodies. Study 3 substantiated this possibility by showing a significant increase in frontal slow oscillation activity, a marker of sleep-dependent memory consolidation. Thus, some types of music can disrupt nighttime sleep by inducing long-lasting earworms that are perpetuated by spontaneous memory-reactivation processes.
-
Recent advances in cyber-physical systems, artificial intelligence, and cloud computing have driven the wide deployments of Internet-of-things (IoT) in smart homes. As IoT devices often directly interact with the users and environments, this paper studies if and how we could explore the collective insights from multiple heterogeneous IoT devices to infer user activities for home safety monitoring and assisted living. Specifically, we develop a new system, namely IoTMosaic, to first profile diverse user activities with distinct IoT device event sequences, which are extracted from smart home network traffic based on their TCP/IP data packet signatures. Given the challenges of missing and out-of-order IoT device events due to device malfunctions or varying network and system latencies, IoTMosaic further develops simple yet effective approximate matching algorithms to identify user activities from real-world IoT network traffic. Our experimental results on thousands of user activities in the smart home environment over two months show that our proposed algorithms can infer different user activities from IoT network traffic in smart homes with the overall accuracy, precision, and recall of 0.99, 0.99, and 1.00, respectively.
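As a loose sketch of what approximate matching over device event sequences can look like, the snippet below checks how much of a profiled activity signature is covered by observed events, tolerating missing and out-of-order events. The event names, the multiset-coverage rule, and the 0.75 threshold are hypothetical and do not reproduce IoTMosaic's actual algorithms.

```python
from collections import Counter

def approx_match(signature, observed, min_coverage=0.75):
    """Approximately match a profiled activity signature against observed
    IoT device events.

    `signature` and `observed` are lists of device-event labels (e.g.
    "motion_on", "plug_off") extracted from packet signatures.  Counting
    events as a multiset tolerates out-of-order arrivals, and requiring
    only `min_coverage` of the signature tolerates missing events.
    """
    sig, obs = Counter(signature), Counter(observed)
    covered = sum(min(sig[e], obs[e]) for e in sig)
    return covered / max(len(signature), 1) >= min_coverage

# A signature with one event missing and two events out of order still matches.
sig = ["door_open", "motion_on", "light_on", "camera_on"]
obs = ["motion_on", "door_open", "camera_on"]
print(approx_match(sig, obs))   # True: 3 of 4 signature events covered
```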
-
This demonstration presents a working prototype of VitalHub, a practical solution for longitudinal in-home vital signs monitoring. To balance the trade-off between the burden on an individual's effort (and thus compliance) and the robustness of vital signs monitoring, we introduce a passive monitoring solution that is free of any on-body device or cooperative effort from the user. By fusing the inputs from a pair of co-located UWB and depth sensors, VitalHub achieves robust, passive, context-aware and privacy-preserving sensing. We use a COTS UWB sensor to detect chest wall displacement due to respiration and heartbeat for vital signs extraction. We use the depth information from a Microsoft Kinect to detect and locate users in the field of view and recognize their activities for further analysis. We have tested the prototype extensively in engineering and medical lab environments. We will demonstrate the features and performance of VitalHub using real-world data in comparison with an FDA-approved medical device.
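For context, a generic spectral estimator of respiratory rate from a UWB-derived chest-displacement waveform might look like the sketch below. The sampling rate, the assumed 0.1-0.5 Hz breathing band, and the Hann-windowed FFT peak picking are illustrative assumptions, not VitalHub's actual processing pipeline.

```python
import numpy as np

def respiration_rate_bpm(displacement, fs, band=(0.1, 0.5)):
    """Estimate respiratory rate from a chest-displacement waveform.

    `displacement` is a chest-motion signal (e.g., from a UWB radar)
    sampled at `fs` Hz; the strongest spectral peak inside the typical
    breathing band (0.1-0.5 Hz, i.e. 6-30 breaths/min) is taken as the
    respiration rate.
    """
    x = np.asarray(displacement, dtype=float)
    x = x - x.mean()                                   # remove DC offset
    spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    peak_hz = freqs[in_band][np.argmax(spectrum[in_band])]
    return 60.0 * peak_hz                              # breaths per minute

# Synthetic check: 0.25 Hz breathing (15 breaths/min) sampled at 20 Hz.
t = np.arange(0, 60, 1 / 20)
print(respiration_rate_bpm(np.sin(2 * np.pi * 0.25 * t), fs=20))
```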