Nearly half of people prescribed medication for chronic or short-term conditions do not take their medicine as prescribed. This leads to worse treatment outcomes, higher hospital admission rates, increased healthcare costs, and increased morbidity and mortality. While some instances of medication non-adherence result from problems with the treatment plan or barriers created by the healthcare provider, many are caused by patient-related factors such as forgetting, running out of medication, and not understanding the required dosages. This presents a clear need for patient-centered systems that can reliably increase medication adherence. To that end, in this work we describe an activity recognition system capable of recognizing when individuals take medication in an unconstrained, real-world environment. Our methodology uses a version of the Bagging ensemble method modified for unbalanced data, together with a classifier trained on the prediction probabilities of the Bagging ensemble, to identify when individuals took medication during a full-day study. Using this methodology we recognize medication-taking with an F-measure of 0.77. Our system is a first step toward personal health interfaces capable of providing personalized medication adherence interventions.
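The abstract above describes a two-stage design: a Bagging ensemble adapted for unbalanced data, and a second classifier trained on the ensemble's prediction probabilities. The sketch below illustrates that general pattern with scikit-learn; the base learner, class-weighting strategy, meta-classifier, and toy data are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import f1_score

# Toy stand-ins: windowed sensor features X and a rare positive class y
# (1 = medication-taking window, 0 = everything else).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 24))
y = (X[:, 0] + 0.5 * rng.normal(size=1000) > 1.8).astype(int)

# Stage 1: bagging over shallow trees; class_weight="balanced" stands in for the
# paper's (unspecified) modification of Bagging for unbalanced data.
ensemble = BaggingClassifier(
    DecisionTreeClassifier(max_depth=8, class_weight="balanced"),
    n_estimators=50,
    random_state=0,
)

# Stage 2: a classifier trained on the ensemble's prediction probabilities.
# Out-of-fold probabilities avoid training the meta-classifier on leaked predictions.
oof_probs = cross_val_predict(ensemble, X, y, cv=5, method="predict_proba")
meta = LogisticRegression(class_weight="balanced").fit(oof_probs, y)

ensemble.fit(X, y)
pred = meta.predict(ensemble.predict_proba(X))
print("F1 on toy data:", round(f1_score(y, pred), 2))
```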
PERACTIV: Personalized Activity Monitoring - Ask My Hands
Medication adherence is a major problem in healthcare: it strongly affects an individual's health and imposes a substantial cost on the healthcare system. We note that much of human activity involves using our hands, often in conjunction with objects. Camera-based wearables for tracking human activities have attracted considerable attention in the past few years, and these technologies have the potential to track human behavior anytime, anywhere. This paper proposes a paradigm for medication adherence built on a novel wrist-worn camera. We describe how the device was built, present experiments demonstrating its feasibility, and discuss how the device could be deployed to detect the micro-activities involved in pill taking so as to ensure medication adherence.
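The PERACTIV abstract frames pill taking as a sequence of micro-activities detected from a wrist-worn camera. As a purely illustrative sketch (the micro-activity labels, ordering, and thresholds below are hypothetical, not taken from the paper), one way to turn per-frame micro-activity predictions into an adherence decision is to check that the expected steps occur in order:

```python
# Hypothetical micro-activity vocabulary and expected ordering for a pill-taking event.
EXPECTED_SEQUENCE = ["open_bottle", "retrieve_pill", "hand_to_mouth", "drink_water"]

def pill_intake_detected(frame_labels, min_frames_per_step=3):
    """frame_labels: per-frame micro-activity predictions from some upstream model.
    Returns True if the expected steps occur in order, each for a minimum duration."""
    step = 0
    run = 0
    for label in frame_labels:
        if step == len(EXPECTED_SEQUENCE):
            break
        if label == EXPECTED_SEQUENCE[step]:
            run += 1
            if run >= min_frames_per_step:
                step += 1
                run = 0
        else:
            run = 0
    return step == len(EXPECTED_SEQUENCE)

# Example: a noisy stream of per-frame predictions.
stream = (["idle"] * 5 + ["open_bottle"] * 4 + ["retrieve_pill"] * 3 +
          ["idle"] * 2 + ["hand_to_mouth"] * 5 + ["drink_water"] * 3)
print(pill_intake_detected(stream))  # True
```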
- Award ID(s): 1828010
- Publication Date:
- NSF-PAR ID: 10344382
- Journal Name: International Conference on Human-Computer Interaction (HCII) 2022. Lecture Notes in Computer Science
- Volume: 13326
- Page Range or eLocation-ID: 255–272
- Sponsoring Org: National Science Foundation
More Like this
- Hand hygiene is crucial in preventing the spread of infections and diseases. Lack of hand hygiene is one of the major causes of healthcare-associated infections (HAIs) in hospitals, and adherence to hand hygiene by workers in the food business is very important for preventing food-borne illness. Beyond healthcare settings and food businesses, hand washing is also vital for personal well-being. Despite its importance, people often do not wash their hands when necessary. Automatic detection of hand washing activity can enable just-in-time alerts when a person forgets to wash their hands. Monitoring hand washing practices is also essential for ensuring accountability and providing personalized feedback, particularly in hospitals and food businesses. Inertial sensors available in smart wrist devices can capture hand movements, so it is feasible to detect hand washing using these devices. However, detecting hand washing from wrist-worn sensors is challenging because hand movements are associated with a wide range of activities. In this paper, we present HAWAD, a robust solution for hand washing detection using wrist wearable inertial sensors. We leverage the distribution of the penultimate-layer output of a neural network to detect hand washing from a wide range of […] (a minimal sketch of this penultimate-layer idea appears after this list).
- Background: Sustained engagement is essential for the success of telerehabilitation programs. However, patients' lack of motivation and adherence could undermine these goals. To overcome this challenge, physical exercises have often been gamified. Building on the advantages of serious games, we propose a citizen science–based approach in which patients perform scientific tasks using interactive interfaces and help advance scientific causes of their choice. This approach capitalizes on human intellect and benevolence while promoting learning. To further enhance engagement, we propose performing citizen science activities in immersive media such as virtual reality (VR). Objective: This study aims to present a novel methodology to facilitate the remote identification and classification of human movements for the automatic assessment of motor performance in telerehabilitation. The data-driven approach is presented in the context of a citizen science software dedicated to bimanual training in VR. Specifically, users interact with the interface and contribute to an environmental citizen science project while moving both arms in concert. Methods: In all, 9 healthy individuals interacted with the citizen science software using a commercial VR gaming device. The software included a calibration phase to evaluate the users' range of motion along the 3 anatomical planes of motion and […]
- People living with HIV experience a high level of stigma in our society. Public HIV-related stigma often leads to anxiety and depression and hinders access to social support and proper medical care. Technologies for HIV, however, have been mainly designed for treatment management and medication adherence rather than for helping people cope with public HIV-related stigma specifically. Drawing on empirical data obtained from semi-structured interviews and design activities with eight social workers and 29 people living with HIV, we unpack the ways in which needs for privacy and trust, intimacy, and social support create tensions around key coping strategies. Reflecting on these tensions, we present design implications and opportunities to empower people living with HIV to cope with public HIV-related stigma at the individual level.
- Human activity recognition (HAR) is the process of using mobile sensor data to determine the physical activities performed by individuals. HAR is the backbone of many mobile healthcare applications, such as passive health monitoring, early diagnosis, and fall detection systems. Effective HAR models rely on deep learning architectures and big data in order to accurately classify activities. Unfortunately, HAR datasets are expensive to collect, are often mislabeled, and have large class imbalances. State-of-the-art approaches address these challenges with Generative Adversarial Networks (GANs) that generate additional synthetic data along with their labels. Problematically, these HAR GANs only synthesize continuous features (features represented by real numbers) recorded from gyroscopes, accelerometers, and other sensors that produce continuous data. This is limiting, since mobile sensor data commonly has discrete features that provide additional context, such as device location and time of day, which have been shown to substantially improve HAR classification. Hence, we studied Conditional Tabular Generative Adversarial Networks (CTGANs) to synthesize mobile sensor data containing both continuous and discrete features, a task not previously addressed by state-of-the-art approaches (a minimal CTGAN usage sketch appears after this list). We show HAR-CTGANs generate data with greater realism, resulting in better downstream performance in HAR […]
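As referenced in the HAWAD entry above, the penultimate-layer idea can be sketched as modeling the distribution of penultimate-layer activations for known hand-washing windows and flagging windows whose activations fall outside it. The Gaussian model, Mahalanobis distance, and threshold below are assumptions for illustration; the paper's actual detector may differ.

```python
import numpy as np

class PenultimateLayerDetector:
    def fit(self, embeddings):
        """embeddings: penultimate-layer activations of known hand-washing windows."""
        self.mean = embeddings.mean(axis=0)
        cov = np.cov(embeddings, rowvar=False)
        self.cov_inv = np.linalg.pinv(cov)        # pseudo-inverse for numerical stability
        d = self._mahalanobis(embeddings)
        self.threshold = np.percentile(d, 95)     # accept ~95% of training windows
        return self

    def _mahalanobis(self, embeddings):
        diff = embeddings - self.mean
        return np.sqrt(np.einsum("ij,jk,ik->i", diff, self.cov_inv, diff))

    def predict(self, embeddings):
        """True where a window's activations look like hand washing."""
        return self._mahalanobis(embeddings) <= self.threshold

# Toy usage with random stand-ins for penultimate-layer activations.
rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=(500, 32))
detector = PenultimateLayerDetector().fit(train)
print(detector.predict(rng.normal(0.0, 1.0, size=(5, 32))))   # mostly True
print(detector.predict(rng.normal(6.0, 1.0, size=(5, 32))))   # mostly False
```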
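For the HAR-CTGAN entry above, the sketch below shows how a conditional tabular GAN can be asked to synthesize sensor features that mix continuous and discrete columns, using the open-source `ctgan` package as a stand-in. Column names, toy data, and hyperparameters are illustrative, not the paper's configuration.

```python
import numpy as np
import pandas as pd
from ctgan import CTGAN

rng = np.random.default_rng(0)
n = 200
# Toy stand-in for a real HAR feature table with mixed continuous/discrete columns.
real = pd.DataFrame({
    "acc_mean": rng.normal(0.5, 0.2, n),
    "gyro_std": rng.normal(0.3, 0.1, n),
    "device_location": rng.choice(["pocket", "wrist", "backpack"], n),
    "time_of_day": rng.choice(["morning", "afternoon", "evening"], n),
    "activity": rng.choice(["walking", "sitting", "running"], n),
})
discrete_columns = ["device_location", "time_of_day", "activity"]

model = CTGAN(epochs=10, batch_size=50)   # tiny run purely for illustration
model.fit(real, discrete_columns)
synthetic = model.sample(500)             # labeled synthetic windows for augmentation
print(synthetic["activity"].value_counts())
```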