
Title: OpiTrack: A Wearable-based Clinical Opioid Use Tracker with Temporal Convolutional Attention Networks
Opioid use disorder is a medical condition with major social and economic consequences. While ubiquitous physiological sensing technologies have been widely adopted and extensively used to monitor day-to-day activities and deliver targeted interventions to improve human health, the use of these technologies to detect drug use in natural environments has been largely underexplored. The long-term goal of our work is to develop a mobile technology system that can identify high-risk opioid-related events (i.e., development of tolerance in the setting of prescription opioid use, return-to-use events in the setting of opioid use disorder) and deploy just-in-time interventions to mitigate the risk of overdose morbidity and mortality. In the current paper, we take an initial step by asking a crucial question: Can opioid use be detected using physiological signals obtained from a wrist-mounted sensor? Thirty-six individuals who were admitted to the hospital for an acute painful condition and received opioid analgesics as part of their clinical care were enrolled. Subjects wore a noninvasive wrist sensor during this time (1-14 days) that continuously measured physiological signals (heart rate, skin temperature, accelerometry, electrodermal activity, and interbeat interval). We collected a total of 2070 hours (≈ 86 days) of physiological data and observed a total of 339 opioid administrations. Our results are encouraging and show that using a Channel-Temporal Attention TCN (CTA-TCN) model, we can detect an opioid administration in a time-window with an F1-score of 0.80, a specificity of 0.77, a sensitivity of 0.80, and an AUC of 0.77. We also predict the exact moment of administration in this time-window with a normalized mean absolute error of 8.6% and an R2 coefficient of 0.85.
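The abstract names the CTA-TCN architecture but does not specify it. As a rough illustration only, its two building blocks can be sketched in NumPy: a causal dilated convolution (the temporal-convolutional component) and a softmax channel weighting (a simplified stand-in for the channel attention). The function names and the attention scoring rule here are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def causal_dilated_conv(x, w, dilation=1):
    """Causal dilated 1-D convolution: the output at time t depends only on
    inputs at t, t-d, t-2d, ... (no future leakage), the core TCN operation."""
    T, k = len(x), len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    return np.array([sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
                     for t in range(T)])

def channel_attention(features):
    """Reweight sensor channels by a softmax over mean activation magnitude --
    an assumed, simplified scoring rule, not the paper's learned attention."""
    scores = np.abs(features).mean(axis=1)        # one score per channel
    e = np.exp(scores - scores.max())             # numerically stable softmax
    alpha = e / e.sum()
    return alpha[:, None] * features              # (channels, time), reweighted
```

Stacking such dilated convolutions with growing dilation is what gives a TCN a long receptive field over the multi-hour sensor windows described above.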
Journal Name: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Page Range or eLocation-ID: 1 to 29
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

    Opioid use disorder is one of the most pressing public health problems of our time. Mobile health tools, including wearable sensors, have great potential in this space but have been underutilized. Of specific interest are digital biomarkers: end-user-generated physiologic or behavioral measurements that correlate with health or pathology. The current manuscript describes a longitudinal, observational study of adult patients receiving opioid analgesics for acute painful conditions. Participants in the study are monitored with a wrist-worn E4 sensor, during which time physiologic parameters (heart rate/variability, electrodermal activity, skin temperature, and accelerometry) are collected continuously. Opioid use events are recorded via the electronic medical record and self-report. Three hundred thirty-nine discrete opioid dose events from 36 participants are analyzed among 2070 h of sensor data. Fifty-one features are extracted from the data, initially compared pre- and post-opioid administration, and subsequently used to generate machine learning models. Model performance is compared based on individual and treatment characteristics. The best-performing machine learning model for detecting opioid administration is a Channel-Temporal Attention-Temporal Convolutional Network (CTA-TCN) model using raw data from the wearable sensor. History of intravenous drug use is associated with better model performance, while middle age and co-administration of non-narcotic analgesia or sedative drugs are associated with worse model performance. These characteristics may be candidate input features for future opioid detection model iterations. Once mature, this technology could provide clinicians with actionable data on opioid use patterns in real-world settings, and predictive analytics for early identification of opioid use disorder risk.
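The abstract does not enumerate the 51 extracted features. A minimal sketch of the general pattern, sliding a fixed window over synchronized wearable channels and emitting summary statistics, might look like the following; the specific features and window length are hypothetical, not the study's.

```python
import numpy as np

def window_features(hr, eda, temp, fs=4, win_s=60):
    """Slide a non-overlapping window over synchronized channels and emit a
    few summary features per window (illustrative subset, not the 51 used)."""
    n = fs * win_s                       # samples per window
    rows = []
    for start in range(0, len(hr) - n + 1, n):
        sl = slice(start, start + n)
        rows.append({
            "hr_mean": float(np.mean(hr[sl])),
            "hr_std": float(np.std(hr[sl])),
            "eda_mean": float(np.mean(eda[sl])),
            "eda_slope": float(np.polyfit(np.arange(n), eda[sl], 1)[0]),
            "temp_range": float(np.ptp(temp[sl])),
        })
    return rows
```

Feature rows like these, aligned against dose timestamps from the medical record, are what the pre- versus post-administration comparison and the downstream models would consume.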

  2. Background Increased work through electronic health record (EHR) messaging is frequently cited as a factor in physician burnout. However, studies to date have relied on anecdotal or self-reported measures, which limit the ability to match EHR use patterns with continuous stress patterns throughout the day. Objective The aim of this study is to collect EHR use and physiologic stress data through unobtrusive means that provide objective and continuous measures, cluster distinct patterns of EHR inbox work, identify physicians’ daily physiologic stress patterns, and evaluate the association between EHR inbox work patterns and physician physiologic stress. Methods Physicians were recruited from 5 medical centers. Participants (N=47) were given wrist-worn devices (Garmin Vivosmart 3) with heart rate sensors to wear for 7 days. The devices measured physiological stress throughout the day based on heart rate variability (HRV). Perceived stress was also measured with self-reports through experience sampling and a one-time survey. From the EHR system logs, the time attributed to different activities was quantified. By using a clustering algorithm, distinct inbox work patterns were identified and their associated stress measures were compared. The effects of EHR use on physician stress were examined using a generalized linear mixed effects model. Results Physicians spent an average of 1.08 hours doing EHR inbox work out of an average total EHR time of 3.5 hours. Patient messages accounted for most of the inbox work time (mean 37%, SD 11%). A total of 3 patterns of inbox work emerged: inbox work mostly outside work hours, inbox work mostly during work hours, and inbox work extending after hours that was mostly contiguous to work hours. Across these 3 groups, physiologic stress patterns showed 3 periods in which stress increased: in the first hour of work, early in the afternoon, and in the evening.
Physicians in group 1 had the longest average stress duration during work hours (80 out of 243 min of valid HRV data; P=.02), as measured by physiological sensors. Inbox work duration, the rate of EHR window switching (moving from one screen to another), the proportion of inbox work done outside of work hours, inbox work batching, and the day of the week were each independently associated with daily stress duration (marginal R2=15%). Individual-level random effects were significant and explained most of the variation in stress (conditional R2=98%). Conclusions This study is among the first to demonstrate associations between electronic inbox work and physiological stress. We identified 3 potentially modifiable factors associated with stress: EHR window switching, inbox work duration, and inbox work outside work hours. Organizations seeking to reduce physician stress may consider system-based changes to reduce EHR window switching or inbox work duration or the incorporation of inbox management time into work hours.
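The gap this study reports between marginal R2 (fixed effects only, 15%) and conditional R2 (including individual-level random effects, 98%) can be illustrated with a toy simulation. The sketch below uses per-physician dummy intercepts as a fixed-effects stand-in for the paper's random intercepts, on simulated (not study) numbers: when individual baselines dominate, adding them to the model sharply raises explained variance.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_r2(X, y):
    """Ordinary least-squares fit; return the R^2 of the fitted model."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

# Simulated data: daily stress duration driven by inbox-work hours plus a
# strong per-physician baseline.  All numbers are illustrative.
n_phys, n_days = 10, 7
inbox_hours = rng.uniform(0.5, 2.0, size=(n_phys, n_days))
baseline = rng.normal(60, 25, size=(n_phys, 1))      # per-physician intercept
stress = baseline + 10 * inbox_hours + rng.normal(0, 5, size=(n_phys, n_days))

y = stress.ravel()
x = inbox_hours.ravel()
ones = np.ones_like(x)

# "Marginal": fixed effect of inbox work only.
r2_marginal = fit_r2(np.column_stack([ones, x]), y)

# "Conditional": add per-physician dummy intercepts (a fixed-effects stand-in
# for the random intercepts of a true mixed model).
dummies = np.kron(np.eye(n_phys), np.ones((n_days, 1)))
r2_conditional = fit_r2(np.column_stack([dummies, x]), y)
```

A real mixed-effects fit (e.g. with a dedicated statistics package) would shrink the individual intercepts toward their mean rather than estimate them freely, but the qualitative marginal-versus-conditional contrast is the same.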
  3. Stress increases the risk of several mental and physical health problems such as anxiety, hypertension, and cardiovascular disease. Better guidance and interventions toward mitigating the impact of stress can be provided if stress can be monitored continuously. The recent proliferation of wearable devices and their capability to measure several stress-related physiological signals have created the opportunity to measure stress continuously in the wild. Wearable devices used to measure physiological signals are mostly placed on the wrist and the chest. Though chest sensors, with or without wrist sensors, currently provide better stress-detection results than wrist sensors alone, chest devices are not as convenient or prevalent as wrist devices, particularly in the free-living context. In this paper, we present a solution to detect stress using wrist sensors that emulate the gold-standard chest sensors. Data from wrist sensors are translated into data from chest sensors, and the translated data are used for stress detection without requiring users to wear any device on the chest. We evaluated our solution using a public dataset, and results show that our solution detects stress with accuracy comparable to that of the gold-standard chest devices, which are impractical for daily use.
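The wrist-to-chest translation method is not detailed in this abstract. One minimal stand-in, purely for illustration, is a least-squares linear map fit on paired wrist and chest feature vectors; the actual translator is likely a richer (e.g. nonlinear or sequence-level) model.

```python
import numpy as np

def fit_translation(wrist, chest):
    """Fit a linear map (with bias) from wrist features to chest features by
    least squares.  wrist: (n, d_w), chest: (n, d_c).  Illustrative stand-in
    for the paper's wrist-to-chest signal translation."""
    X = np.column_stack([wrist, np.ones(len(wrist))])
    W, *_ = np.linalg.lstsq(X, chest, rcond=None)
    return W

def translate(wrist, W):
    """Apply a fitted translation map to new wrist feature vectors."""
    X = np.column_stack([wrist, np.ones(len(wrist))])
    return X @ W
```

The translated (pseudo-chest) features can then be fed to a stress classifier trained on real chest data, which is the substitution the paper evaluates.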
  4. Abstract

    Self-reports indicate that stress increases the risk for smoking; however, intensive data from sensors can provide a more nuanced understanding of stress in the moments leading up to and following smoking events. Identifying personalized dynamical models of stress-smoking responses can improve characterizations of smoking responses following stress, but techniques used to identify these models require intensive longitudinal data. This study leveraged advances in wearable sensing technology and digital markers of stress and smoking to identify person-specific models of stress and smoking system dynamics by considering stress immediately before, during, and after smoking events. Adult smokers (n = 45) wore the AutoSense chestband (respiration-inductive plethysmograph, electrocardiogram, accelerometer) with MotionSense (accelerometers, gyroscopes) on each wrist for three days prior to a quit attempt. The odds of minute-level smoking events were regressed on minute-level stress probabilities to identify person-specific dynamic models of smoking responses to stress. Simulated pulse responses to a continuous stress episode revealed a consistent pattern of increased odds of smoking either shortly after the beginning of the simulated stress episode or with a delay, for all participants. This pattern is followed by a dramatic reduction in the probability of smoking thereafter, for about half of the participants (49%). Sensor-detected stress probabilities indicate a vulnerability for smoking that may be used as a tailoring variable for just-in-time interventions to support quit attempts.
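Regressing minute-level smoking odds on minute-level stress probabilities amounts to a logistic model. A toy version is sketched below; the coefficient values are illustrative placeholders, not estimates from the study, and the person-specific dynamics (lags, pulse responses) are omitted.

```python
import math

def smoking_odds(stress_prob, beta0=-4.0, beta1=2.5):
    """Minute-level odds of smoking as a logistic-model function of the
    sensor-estimated stress probability.  Coefficients are illustrative."""
    return math.exp(beta0 + beta1 * stress_prob)

def smoking_probability(stress_prob, **kw):
    """Convert the odds to a probability: p = odds / (1 + odds)."""
    odds = smoking_odds(stress_prob, **kw)
    return odds / (1.0 + odds)
```

A person-specific model of the kind the paper describes would additionally regress on lagged stress values, which is what produces the delayed pulse responses reported above.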

  5. Abstract

    Objective. Reorienting is central to how humans direct attention to different stimuli in their environment. Previous studies typically employ well-controlled paradigms with limited eye and head movements to study the neural and physiological processes underlying attention reorienting. Here, we aim to better understand the relationship between gaze and attention reorienting using a naturalistic virtual reality (VR)-based target detection paradigm. Approach. Subjects were navigated through a city and instructed to count the number of targets that appeared on the street. Subjects performed the task in a fixed condition with no head movement and in a free condition where head movements were allowed. Electroencephalography (EEG), gaze, and pupil data were collected. To investigate how neural and physiological reorienting signals are distributed across different gaze events, we used hierarchical discriminant component analysis (HDCA) to identify EEG- and pupil-based discriminating components. Mixed-effects general linear models (GLM) were used to determine the correlation between these discriminating components and the timing of the different gaze events. HDCA was also used to combine EEG, pupil, and dwell time signals to classify reorienting events. Main results. In both EEG and pupil, dwell time contributes most significantly to the reorienting signals. However, when dwell times were orthogonalized against other gaze events, the distributions of the reorienting signals were different across the two modalities, with EEG reorienting signals leading the pupil reorienting signals.
We also found that the hybrid classifier that integrates EEG, pupil, and dwell time features detects the reorienting signals in both the fixed (AUC = 0.79) and the free (AUC = 0.77) conditions. Significance. We show that the neural and ocular reorienting signals are distributed differently across gaze events when a subject is immersed in VR, but nevertheless can be captured and integrated to classify target vs. distractor objects to which the human subject orients.
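HDCA learns its discriminant weights from data. A simplified stand-in for the hybrid classifier, shown only to illustrate the idea, is a fixed weighted fusion of per-modality scores evaluated with the rank-sum identity for AUC; the weights below are arbitrary placeholders, not learned HDCA components.

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney U) identity.
    Assumes no tied scores."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def fuse(eeg, pupil, dwell, w=(0.5, 0.2, 0.3)):
    """Hypothetical late fusion: weighted sum of per-modality scores for each
    gaze event (HDCA would instead learn these weights hierarchically)."""
    return w[0] * eeg + w[1] * pupil + w[2] * dwell
```

With per-event scores for targets labeled 1 and distractors labeled 0, `auc(fuse(...), labels)` gives the same metric the paper reports for its fixed- and free-condition classifiers.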
