Title: DREAM: Domain Invariant and Contrastive Representation for Sleep Dynamics
Abstract—Sleep staging is a key challenge in diagnosing and treating sleep-related diseases because it is labor-intensive, time-consuming, costly, and error-prone. With the availability of large-scale sleep signal data, many deep learning methods have been proposed for automatic sleep staging. However, these existing methods face several challenges, including the heterogeneity of patients' underlying health conditions and the difficulty of modeling complex interactions between sleep stages. In this paper, we propose a neural network architecture named DREAM to tackle these issues. DREAM consists of (i) a feature representation network that generates robust representations for sleep signals via the variational auto-encoder framework and contrastive learning, and (ii) a sleep stage classification network that explicitly models the interactions between sleep stages in the sequential context, at both the feature representation and label classification levels, via Transformer and conditional random field (CRF) architectures. Our experimental results indicate that DREAM significantly outperforms existing methods for automatic sleep staging on three sleep signal datasets.
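The classification network above models stage transitions with a conditional random field. As a generic illustration of the CRF decoding step (not DREAM's actual implementation), the NumPy sketch below runs Viterbi decoding over hypothetical per-epoch stage scores and a made-up transition matrix:

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Find the highest-scoring label sequence under a linear-chain CRF.

    emissions:   (T, K) per-epoch label scores from the feature network
    transitions: (K, K) score of moving from label i to label j
    """
    T, K = emissions.shape
    score = emissions[0].copy()           # best score ending in each label at t=0
    backptr = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        # candidate[i, j] = best score ending in i at t-1, then transition i -> j
        candidate = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = candidate.argmax(axis=0)
        score = candidate.max(axis=0)
    # trace back the best path from the highest-scoring final label
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

# Toy example with 3 hypothetical stages (Wake, N2, REM) over 4 epochs.
emissions = np.array([[2.0, 0.1, 0.1],
                      [0.2, 1.5, 0.3],
                      [0.1, 1.4, 0.2],
                      [0.1, 0.3, 1.6]])
transitions = np.log(np.array([[0.6, 0.3, 0.1],
                               [0.1, 0.7, 0.2],
                               [0.2, 0.2, 0.6]]))
print(viterbi_decode(emissions, transitions))  # → [0, 1, 1, 2]
```

The transition matrix lets the decoder penalize implausible jumps (e.g., Wake directly to REM), which is the interaction between neighboring stages that a per-epoch classifier alone cannot capture.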
Award ID(s):
2037398 2145625
NSF-PAR ID:
10403016
Journal Name:
IEEE International Conference on Data Mining
Page Range / eLocation ID:
1029 to 1034
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Memories of waking-life events are incorporated into dreams, but their incorporation is not uniform across a night of sleep. This study aimed to elucidate ways in which such memory sources vary by sleep stage and time of night. Twenty healthy participants (11 F; 24.1 ± 5.7 years) spent a night in the laboratory and were awakened for dream collection approximately 12 times spread across early, middle, and late periods of sleep, while covering all stages of sleep (N1, N2, N3, REM). In the morning, participants identified and dated associated memories of waking-life events for each dream report, when possible. The incorporation of recent memory sources in dreams was more frequent in N1 and REM than in other sleep stages. The incorporation of distant memories from over a week ago, semantic memories not traceable to a single event, and anticipated future events remained stable throughout sleep. In contrast, the relative proportions of recent versus distant memory sources changed across the night, independently of sleep stage, with late-night dreams in all stages having relatively less recent and more remote memory sources than dreams earlier in the night. Qualitatively, dreams tended to repeat similar themes across the night and in different sleep stages. The present findings clarify the temporal course of memory incorporations in dreams, highlighting a specific connection between time of night and the temporal remoteness of memories. We discuss how dream content may, at least in part, reflect the mechanisms of sleep-dependent memory consolidation.

     
2. Heart rate variability (HRV) features support several clinical applications, including sleep staging, and ballistocardiograms (BCGs) can be used to unobtrusively estimate these features. Electrocardiography is the traditional clinical standard for HRV estimation, but BCGs and electrocardiograms (ECGs) yield different estimates for heartbeat intervals (HBIs), leading to differences in calculated HRV parameters. This study examines the viability of using BCG-based HRV features for sleep staging by quantifying the impact of these timing differences on the resulting parameters of interest. We introduced a range of synthetic time offsets to simulate the differences between BCG- and ECG-based heartbeat intervals, and the resulting HRV features were used to perform sleep staging. Subsequently, we drew a relationship between the mean absolute error in HBIs and the resulting sleep-staging performance. We also extended our previous work on heartbeat interval identification algorithms to demonstrate that our simulated timing jitters closely represent errors between heartbeat interval measurements. This work indicates that BCG-based sleep staging can produce accuracies comparable to ECG-based techniques: at an HBI error of up to 60 ms, the sleep-scoring error increased from 17% to 25% in one of the scenarios we examined.

     
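The jitter-simulation idea in the abstract above can be sketched with a short, hypothetical NumPy example (the beat times and offset magnitude are invented for illustration, not the study's data): jitter ECG-derived beat times to mimic BCG detection error, recompute heartbeat intervals, and compare two standard time-domain HRV features (SDNN and RMSSD):

```python
import numpy as np

rng = np.random.default_rng(0)

def hrv_features(hbi_ms):
    """Two standard time-domain HRV features from heartbeat intervals (ms)."""
    sdnn = float(np.std(hbi_ms, ddof=1))                    # overall variability
    rmssd = float(np.sqrt(np.mean(np.diff(hbi_ms) ** 2)))   # beat-to-beat variability
    return sdnn, rmssd

# Hypothetical ECG-derived beat times (seconds) for a ~60 bpm rhythm.
ecg_beats = np.cumsum(rng.normal(1.0, 0.05, size=300))

# Simulate BCG beat detection by jittering each beat time (std in ms).
jitter_ms = 30.0
bcg_beats = ecg_beats + rng.normal(0.0, jitter_ms / 1000.0, size=ecg_beats.size)

ecg_hbi = np.diff(ecg_beats) * 1000.0
bcg_hbi = np.diff(bcg_beats) * 1000.0

mae = float(np.mean(np.abs(ecg_hbi - bcg_hbi)))
print("HBI MAE (ms):", round(mae, 1))
print("ECG SDNN/RMSSD:", hrv_features(ecg_hbi))
print("BCG SDNN/RMSSD:", hrv_features(bcg_hbi))
```

Sweeping `jitter_ms` and plotting the HBI mean absolute error against downstream sleep-staging accuracy would reproduce the shape of the analysis the abstract describes.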
3. Image-based breast tumor classification is an active and challenging problem. In this paper, a robust breast tumor classification framework is presented based on deep feature representation learning and exploiting available information in existing samples. Feature representation learning of mammograms is fulfilled by a modified nonnegative matrix factorization model called LPML-LRNMF, which is motivated by hierarchical learning and the layer-wise pre-training (LP) strategy in deep learning. A low-rank (LR) constraint is integrated into the feature representation learning model by considering […]
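The LPML-LRNMF model above builds on nonnegative matrix factorization. As a baseline illustration only (the classic Lee-Seung multiplicative updates, not the paper's layer-wise, low-rank variant), NMF can be sketched as:

```python
import numpy as np

def nmf(V, rank, iters=200, eps=1e-9, seed=0):
    """Multiplicative-update NMF: V (m x n) ≈ W (m x rank) @ H (rank x n)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # Lee-Seung update for H
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # Lee-Seung update for W
    return W, H

# Toy nonnegative data matrix standing in for mammogram features.
V = np.abs(np.random.default_rng(1).random((20, 15)))
W, H = nmf(V, rank=5)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print("relative reconstruction error:", round(err, 3))
```

The multiplicative form keeps both factors nonnegative throughout, which is what makes the learned basis (`W`) interpretable as additive parts; the paper's model additionally stacks such factorizations layer-wise and constrains their rank.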
4. Spatial classification with limited feature observations has been a challenging problem in machine learning. The problem arises in applications where only a subset of sensors is deployed in certain regions or where only partial responses are collected in field surveys. Existing research mostly focuses on addressing incomplete or missing data, e.g., data cleaning and imputation, classification models that tolerate missing feature values, or modeling missing features as hidden variables and applying the EM algorithm. These methods, however, assume that incomplete feature observations occur only in a small subset of samples and thus cannot solve problems where the vast majority of samples have missing feature observations. To address this issue, we propose a new approach that incorporates physics-aware structural constraints into the model representation. Our approach assumes that a spatial contextual feature is observed at all sample locations and establishes a spatial structural constraint from the spatial contextual feature map. We design efficient algorithms for model parameter learning and class inference. Evaluations on real-world hydrological applications show that our approach significantly outperforms several baseline methods in classification accuracy, and that the proposed solution is computationally efficient on large data volumes.
5. Noise and inconsistency commonly exist in real-world information networks, due to the inherently error-prone nature of human input or to user privacy concerns. To date, tremendous efforts have been made to advance feature learning from networks, including the most recent graph convolutional networks (GCNs) and attention GCNs, by integrating node content and topology structure. However, all existing methods treat networks as error-free sources and treat the feature content of each node as independent and equally important for modeling node relations. Noisy node content, combined with sparse features, poses significant challenges for applying existing methods to real-world noisy networks. In this article, we propose the feature-based attention GCN (FA-GCN), a feature-attention graph convolution learning framework, to handle networks with noisy and sparse node content. To tackle noisy and sparse content in each node, FA-GCN first employs a long short-term memory (LSTM) network to learn a dense representation for each node feature. To model interactions between neighboring nodes, a feature-attention mechanism is introduced that allows neighboring nodes to learn and vary feature importance with respect to their connections. Using a spectral graph convolution aggregation process, each node can then concentrate on the neighborhood features most relevant to the learning task. Experiments and validations at different noise levels demonstrate that FA-GCN achieves better performance than state-of-the-art methods in both noise-free and noisy network environments.
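As a loose, hypothetical illustration of the feature-attention idea in the abstract above (not FA-GCN's actual formulation, and omitting the LSTM and spectral filtering), the sketch below scores per-feature importance with an invented weight matrix `Wa`, softmax-normalizes the scores into attention weights, and mean-aggregates the re-weighted features over neighbors:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def feature_attention_aggregate(X, A, Wa):
    """One attention-weighted neighborhood aggregation step.

    X:  (N, F) node feature matrix (possibly noisy/sparse content)
    A:  (N, N) binary adjacency with self-loops
    Wa: (F, F) hypothetical parameters scoring per-feature importance
    """
    scores = X @ Wa                   # (N, F) per-node, per-feature scores
    alpha = softmax(scores, axis=1)   # feature-attention weights per node
    weighted = alpha * X              # down-weight uninformative features
    deg = A.sum(axis=1, keepdims=True)
    return (A @ weighted) / deg       # mean-aggregate over neighbors

rng = np.random.default_rng(0)
N, F = 5, 4
X = rng.random((N, F))
# Path graph with self-loops as a toy network.
A = np.eye(N) + np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)
Wa = rng.random((F, F))
H = feature_attention_aggregate(X, A, Wa)
print(H.shape)  # (5, 4)
```

The point of the re-weighting step is that a noisy or sparse feature receives a small attention weight before aggregation, so it contributes less to every neighbor's representation.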