Title: BuMA: Non-Intrusive Breathing Detection using Microphone Array
Breath monitoring is important for detecting and managing illnesses, such as sleep apnea, in people of all ages. One cause of concern for parents is sudden infant death syndrome (SIDS), in which an infant suddenly passes away during sleep, usually due to complications in breathing. There are a variety of works and products on the market for monitoring breathing, especially for children and infants. Many of these are wearables that require attaching an accessory to the child or person, which can be uncomfortable. Other solutions use a camera, which can be privacy-intrusive and can perform poorly at night, when lighting is poor. In this work, we introduce BuMA, an audio-based, non-intrusive, and contactless breathing monitoring system. BuMA uses a microphone array, beamforming, and audio filtering to enhance the sounds of breathing by filtering out several common noises in or near home environments, such as construction, speech, and music, that could make detection difficult. We show that BuMA improves breathing detection accuracy by up to 12%, within 30 cm of a person, over existing audio filtering algorithms or platforms that do not leverage filtering.
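Note: the abstract does not spell out BuMA's implementation, but the two ingredients it names are beamforming with a microphone array and audio filtering of the beamformed signal. The sketch below is a minimal, hypothetical illustration of that general idea (delay-and-sum beamforming followed by a band-pass filter); the array geometry, steering direction, and band edges are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: delay-and-sum beamforming toward a known direction,
# followed by a band-pass filter around typical breathing-sound energy.
# All parameters here are illustrative, not taken from the BuMA paper.
import numpy as np
from scipy.signal import butter, sosfiltfilt

SPEED_OF_SOUND = 343.0  # m/s at room temperature

def delay_and_sum(frames, mic_positions, direction, fs):
    """Steer the array toward `direction` (unit vector from the array origin
    toward the person) by delaying and averaging channels.
    frames: (n_mics, n_samples); mic_positions: (n_mics, 3) in metres."""
    delays = mic_positions @ direction / SPEED_OF_SOUND   # seconds per mic
    n = frames.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectra = np.fft.rfft(frames, axis=1)
    # A delay of tau seconds is a phase ramp exp(-2j*pi*f*tau) in frequency.
    aligned = spectra * np.exp(-2j * np.pi * freqs * delays[:, None])
    return np.fft.irfft(aligned.mean(axis=0), n=n)

def breathing_bandpass(x, fs, low_hz=100.0, high_hz=2000.0):
    """Keep the band where breathing airflow noise tends to sit
    (band edges here are illustrative)."""
    sos = butter(4, [low_hz, high_hz], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x)
```

Steering the array toward the sleeper raises the breathing signal relative to off-axis noise such as speech or music, which is the effect the abstract describes.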
Award ID(s):
1815274 1704899
PAR ID:
10357213
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the 1st ACM International Workshop on Intelligent Acoustic Systems and Applications (IASA)
Page Range / eLocation ID:
1 to 6
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Smartphones and mobile applications have become an integral part of our daily lives. This is reflected by the increase in mobile devices, applications, and revenue generated each year. However, this growth is being met with increasing concern for user privacy, and there have been many incidents of privacy and data breaches related to smartphones and mobile applications in recent years. In this work, we focus on improving privacy for audio-based mobile systems. These applications generally listen to all sounds in the environment and may record privacy-sensitive signals, such as speech, that may not be needed for the application. We present PAMS, a software development package for mobile applications. PAMS integrates a novel sound source filtering algorithm called Probabilistic Template Matching to generate a set of privacy-enhancing filters that remove extraneous sounds using learned statistical "templates" of these sounds. We demonstrate the effectiveness of PAMS by integrating it into a sleep monitoring system, with the intent of removing extraneous speech from the breathing, snoring, and other sleep sounds that the system is monitoring. By comparing our PAMS-enhanced sleep monitoring system with existing mobile systems, we show that PAMS can reduce speech intelligibility by up to 74.3% while maintaining similar performance in detecting sleep sounds.
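Note: the abstract above describes Probabilistic Template Matching only as filtering with learned statistical "templates" of extraneous sounds. As a loose illustration only (not the paper's actual algorithm), the sketch below learns a per-frequency mean/variance template of a sound class such as speech and attenuates spectrogram frames that match it well; the function names, likelihood threshold, and attenuation factor are hypothetical.

```python
# Hedged sketch of a "statistical template" filter: fit a template of the
# sound to suppress, then attenuate STFT frames that fit it well.
import numpy as np
from scipy.signal import stft, istft

def learn_template(examples, fs, nperseg=512):
    """Fit a per-frequency mean/variance of the log-magnitude spectrum
    from example clips of the sound to suppress (e.g., speech)."""
    mags = []
    for clip in examples:
        _, _, Z = stft(clip, fs=fs, nperseg=nperseg)
        mags.append(np.log(np.abs(Z) + 1e-8))
    mags = np.concatenate(mags, axis=1)            # (freq_bins, total_frames)
    return mags.mean(axis=1), mags.var(axis=1) + 1e-6

def filter_with_template(x, fs, template, threshold=-1.0, nperseg=512):
    """Attenuate frames whose log-spectrum is likely under the template
    (diagonal-Gaussian likelihood with constant terms dropped)."""
    mean, var = template
    _, _, Z = stft(x, fs=fs, nperseg=nperseg)
    logmag = np.log(np.abs(Z) + 1e-8)
    ll = -0.5 * ((logmag - mean[:, None]) ** 2 / var[:, None]).mean(axis=0)
    gain = np.where(ll > threshold, 0.1, 1.0)      # strongly attenuate matches
    _, y = istft(Z * gain[None, :], fs=fs, nperseg=nperseg)
    return y
```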
  2.
    Breathing biomarkers, such as breathing rate, fractional inspiratory time, and inhalation-exhalation ratio, are vital for monitoring the user's health and well-being. Accurate estimation of such biomarkers requires breathing phase detection, i.e., identifying inhalation and exhalation. However, traditional breathing phase monitoring relies on uncomfortable equipment, e.g., chest bands. Smartphone acoustic sensors have shown promising results for passive breathing monitoring during sleep or guided breathing. However, detecting breathing phases using acoustic data can be challenging for various reasons. One of the major obstacles is the complexity of annotating breathing sounds due to inaudible parts of regular breathing and background noise. This paper assesses the potential of using smartphone acoustic sensors for passive, unguided breathing phase monitoring in a natural environment. We address the annotation challenges by developing a novel variant of the teacher-student training method for transferring knowledge from an inertial sensor to an acoustic sensor, eliminating the need for manual breathing sound annotation by fusing signal processing with deep learning techniques. We train and evaluate our model on breathing data collected from 131 subjects, including healthy individuals and respiratory patients. Experimental results show that our model can detect breathing phases with 77.33% accuracy using acoustic sensors. We further present an example use case of breathing phase detection by first estimating biomarkers from the detected breathing phases and then using these biomarkers for pulmonary patient detection. Using the detected breathing phases, we can estimate fractional inspiratory time with 92.08% accuracy, the inhalation-exhalation ratio with 86.76% accuracy, and the breathing rate with 91.74% accuracy. Moreover, we can distinguish respiratory patients from healthy individuals with up to 76% accuracy. This paper is the first to show the feasibility of detecting regular breathing phases toward passively monitoring respiratory health and well-being using acoustic data captured by a smartphone.
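Note: one way to picture the teacher-student idea in the abstract above is that a "teacher" derives inhale/exhale labels from an inertial (chest-motion) signal, and those labels supervise a "student" model that sees only the synchronized smartphone audio, so breathing sounds never need manual annotation. The toy sketch below follows that reading; the labelling rule, model architecture, and window sizes are illustrative assumptions, not the paper's pipeline.

```python
# Toy teacher-student sketch: inertial-derived labels supervise an audio model.
import numpy as np
import torch
import torch.nn as nn

def teacher_labels(accel_z, fs, win_s=0.5):
    """Crude stand-in teacher: label each window inhale (1) or exhale (0)
    from the sign of the smoothed chest-acceleration slope."""
    smooth = np.convolve(accel_z, np.ones(fs) / fs, mode="same")
    slope = np.gradient(smooth)
    win = int(win_s * fs)
    n_win = len(slope) // win
    return torch.tensor([float(slope[i * win:(i + 1) * win].mean() > 0)
                         for i in range(n_win)])

class AudioStudent(nn.Module):
    """Tiny CNN over log-mel windows; one inhale/exhale logit per window."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(), nn.Linear(8 * 16, 1))

    def forward(self, mel_windows):                # (batch, 1, n_mels, frames)
        return self.net(mel_windows).squeeze(-1)   # (batch,)

def train_step(model, optimizer, mel_windows, labels):
    """Fit the audio student's predictions to the inertial teacher's labels."""
    loss = nn.functional.binary_cross_entropy_with_logits(model(mel_windows), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```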
  3. Monitoring compliance with social distancing is critical for schools and offices as they recover in-person operations in indoor spaces after the COVID-19 pandemic. Existing systems focus on vision- and wearable-based sensing approaches, which require direct line-of-sight or carrying a device and may also raise privacy concerns. To overcome these limitations, we introduce a new monitoring system for social distancing compliance based on footstep-induced floor vibration sensing. This system is device-free, non-intrusive, and perceived as more privacy-friendly. Our system leverages the insight that footsteps closer to the sensors generate vibration signals with larger amplitudes. The system first estimates the location of each person relative to the sensors based on signal energy and then infers the distance between two people. We evaluated the system through a real-world experiment with 8 people; it achieves an average accuracy of 97.8% for walking scenario classification and 80.4% for social distancing violation detection.
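Note: the abstract above states that footsteps closer to a sensor produce larger-amplitude vibrations, and that per-sensor signal energy is turned into person locations and then an inter-person distance. A simple, hypothetical realization of that idea is sketched below, using a power-law energy-decay model and a least-squares position fit; the decay exponent, calibration constants, and sensor layout are assumptions.

```python
# Hedged sketch: energy-based ranging from floor-vibration sensors, then a
# least-squares position fit and the distance between two localized people.
import numpy as np
from scipy.optimize import least_squares

def energy_to_range(energy, ref_energy=1.0, ref_dist=1.0, decay=2.0):
    """Invert a power-law decay model: energy ~ ref_energy * (ref_dist/d)**decay."""
    return ref_dist * (ref_energy / np.maximum(energy, 1e-12)) ** (1.0 / decay)

def localize(sensor_xy, energies):
    """Least-squares 2-D position from per-sensor range estimates."""
    ranges = energy_to_range(np.asarray(energies, dtype=float))
    def residuals(p):
        return np.linalg.norm(sensor_xy - p, axis=1) - ranges
    return least_squares(residuals, x0=sensor_xy.mean(axis=0)).x

def social_distance(sensor_xy, energies_a, energies_b):
    pa = localize(sensor_xy, energies_a)
    pb = localize(sensor_xy, energies_b)
    return np.linalg.norm(pa - pb)

# Example: four floor sensors at room corners (metres), made-up footstep energies.
sensors = np.array([[0, 0], [5, 0], [0, 5], [5, 5]], dtype=float)
print(social_distance(sensors, [0.9, 0.1, 0.2, 0.05], [0.05, 0.2, 0.1, 0.9]))
```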
  4. The ability to assess sleep at home, capture sleep stages, and detect the occurrence of apnea (without on-body sensors) simply by analyzing the radio waves bouncing off people's bodies while they sleep is quite powerful. Such a capability would allow for longitudinal data collection in patients' homes, informing our understanding of sleep and its interaction with various diseases and their therapeutic responses, both in clinical trials and routine care. In this article, we develop an advanced machine learning algorithm for passively monitoring sleep and nocturnal breathing from radio waves reflected off people while asleep. Validation results in comparison with the gold standard (i.e., polysomnography) (n=849) demonstrate that the model captures the sleep hypnogram (with an accuracy of 81% for 30-second epochs categorized into Wake, Light Sleep, Deep Sleep, or REM), detects sleep apnea (AUROC = 0.88), and measures the patient's Apnea-Hypopnea Index (ICC=0.95; 95% CI = [0.93, 0.97]). Notably, the model exhibits equitable performance across race, sex, and age. Moreover, the model uncovers informative interactions between sleep stages and a range of diseases including neurological, psychiatric, cardiovascular, and immunological disorders. These findings not only hold promise for clinical practice and interventional trials but also underscore the significance of sleep as a fundamental component in understanding and managing various diseases. 
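Note: for context on the Apnea-Hypopnea Index (AHI) reported above, the standard definition is the number of apnea and hypopnea events per hour of sleep; with 30-second epochs (as in the abstract), total sleep time comes from the non-wake epochs of the hypnogram. The tiny example below uses a made-up event count and hypnogram.

```python
# Worked example of the AHI calculation: events per hour of sleep.
WAKE, LIGHT, DEEP, REM = 0, 1, 2, 3
EPOCH_SECONDS = 30

def total_sleep_hours(hypnogram):
    """hypnogram: list of per-epoch stage labels (one label per 30 s epoch)."""
    asleep = sum(1 for stage in hypnogram if stage != WAKE)
    return asleep * EPOCH_SECONDS / 3600.0

def ahi(n_events, hypnogram):
    return n_events / total_sleep_hours(hypnogram)

# Hypothetical night: 7 h of recording, 6.5 h asleep, 39 scored events.
hypnogram = [WAKE] * 60 + [LIGHT] * 400 + [DEEP] * 200 + [REM] * 180
print(round(ahi(39, hypnogram), 1))   # 6.0 events per hour of sleep
```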
  5. Active sampling in the olfactory domain is an important aspect of mouse behaviour, and there is increasing evidence that respiration-entrained neural activity outside of the olfactory system sets an important global brain rhythm. It is therefore important to accurately measure breathing during natural behaviours. We develop a new approach to do this in freely moving animals, by implanting a telemetry-based pressure sensor into the right jugular vein, which allows for wireless monitoring of thoracic pressure. After verifying this technique against standard head-fixed respiration measurements, we combined it with EEG and EMG recording and used evolving partial coherence analysis to investigate the relationship between respiration and brain activity across a range of experiments in which the mice could move freely. During voluntary exploration of odours and objects, we found that the association between respiration and cortical delta and theta rhythms decreased, while the association between respiration and cortical alpha rhythm increased. During sleep, however, the presentation of an odour was able to cause a transient increase in sniffing without changing dominant sleep rhythms (delta and theta) in the cortex. Our data align with the emerging idea that the respiration rhythm could act as a synchronising scaffold for specific brain rhythms during wakefulness and exploration, but suggest that respiratory changes are less able to impact brain activity during sleep. Combining wireless respiration monitoring with different types of brain recording across a variety of behaviours will further increase our understanding of the important links between active sampling, passive respiration, and neural activity. 
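Note: the study above uses evolving partial coherence to relate respiration to cortical rhythms. As a much simpler stand-in for illustration only, the sketch below computes ordinary magnitude-squared coherence between a thoracic-pressure (respiration) trace and an EEG channel over sliding windows, averaged within a frequency band such as delta; the window lengths, band edges, and synthetic signals are assumptions.

```python
# Simplified stand-in: sliding-window coherence between respiration and EEG.
import numpy as np
from scipy.signal import coherence

def sliding_coherence(resp, eeg, fs, win_s=30.0, step_s=5.0, band=(1.0, 4.0)):
    """Mean coherence in `band` (Hz, e.g. delta) for each sliding window."""
    win, step = int(win_s * fs), int(step_s * fs)
    out = []
    for start in range(0, len(resp) - win + 1, step):
        f, cxy = coherence(resp[start:start + win], eeg[start:start + win],
                           fs=fs, nperseg=win // 8)
        mask = (f >= band[0]) & (f <= band[1])
        out.append(cxy[mask].mean())
    return np.array(out)

# Example with synthetic signals: 5 minutes at 200 Hz, ~3 Hz respiration
# partially coupled into the EEG trace.
fs = 200
t = np.arange(0, 300, 1 / fs)
resp = np.sin(2 * np.pi * 3.0 * t) + 0.1 * np.random.randn(t.size)
eeg = 0.5 * resp + np.random.randn(t.size)
print(sliding_coherence(resp, eeg, fs).mean())
```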