Most mobile health apps employ data visualization to help people view their health and activity data, but these apps provide limited support for visual data exploration. Furthermore, despite its huge potential benefits, mobile visualization research in the personal data context is sparse. This work aims to empower people to easily navigate and compare their personal health data on smartphones by enabling flexible time manipulation with speech. We designed and developed Data@Hand, a mobile app that leverages the synergy of two complementary modalities: speech and touch. Through an exploratory study with 13 long-term Fitbit users, we examined how multimodal interaction helps participants explore their own health data. Participants successfully adopted multimodal interaction (i.e., speech and touch) for convenient and fluid data exploration. Based on the quantitative and qualitative findings, we discuss design implications and opportunities with multimodal interaction for better supporting visual data exploration on mobile devices.
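To make the idea of speech-driven time manipulation more concrete, here is a minimal, hypothetical sketch of resolving a few spoken time expressions into date ranges that a chart could then display. The function name, supported phrases, and rules are illustrative assumptions; they are not Data@Hand's actual speech-processing pipeline.

```python
from datetime import date, timedelta

def resolve_time_phrase(phrase, today=None):
    """Toy resolver: map a spoken time expression to an inclusive (start, end) date range."""
    today = today or date.today()
    phrase = phrase.lower().strip()
    if phrase == "today":
        return today, today
    if phrase == "yesterday":
        d = today - timedelta(days=1)
        return d, d
    if phrase == "last week":
        # Assume weeks start on Monday; return the full previous calendar week.
        this_monday = today - timedelta(days=today.weekday())
        start = this_monday - timedelta(days=7)
        return start, start + timedelta(days=6)
    if phrase == "this month":
        return today.replace(day=1), today
    raise ValueError(f"unrecognized time phrase: {phrase!r}")

# e.g., resolve_time_phrase("last week", today=date(2024, 5, 15))
# -> (date(2024, 5, 6), date(2024, 5, 12))
```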
Multimodal Time-Series Activity Forecasting for Adaptive Lifestyle Intervention Design
Physical activity is a cornerstone of chronic condition management and one of the most critical factors in reducing the risks of cardiovascular diseases, the leading cause of death in the United States. App-based lifestyle interventions have been utilized to promote physical activity in people with or at risk for chronic conditions. However, these mHealth tools have remained largely static and do not adapt to the changing behavior of the user. As a step toward designing adaptive interventions, we propose BeWell24Plus, a framework for monitoring activity and user engagement and for developing computational models for outcome prediction and intervention design. In particular, we focus on devising algorithms that combine data about physical activity and engagement with the app to predict future physical activity performance. Knowing in advance how active a person is going to be on the next day can help with designing adaptive interventions that help individuals achieve their physical activity goals. Our technique combines the recent history of a person's physical activity with app engagement metrics, such as when, how often, and for how long the app was used, to forecast near-future activity. We formulate the problem of multimodal activity forecasting and propose an LSTM-based realization of our model architecture, which estimates physical activity outcomes in advance by examining the user's history of app usage and physical activity. We demonstrate the effectiveness of our forecasting approach using data collected from 58 prediabetic people in a 9-month user study. We show that our multimodal forecasting approach outperforms single-modality forecasting by 2.2% to 11.1% in mean absolute error.
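The abstract above does not fix an exact network configuration, so the following is only a minimal sketch of the forecasting idea under stated assumptions: per-day activity features (e.g., normalized step counts) and app-engagement features (e.g., frequency, duration, and recency of use) are concatenated and fed to an LSTM that regresses the next day's activity. The class name, feature dimensions, and hidden size are hypothetical, not the authors' implementation.

```python
import torch
import torch.nn as nn

class MultimodalActivityForecaster(nn.Module):
    """Illustrative multimodal forecaster: activity + engagement history -> next-day activity."""

    def __init__(self, activity_dim=1, engagement_dim=3, hidden_dim=64):
        super().__init__()
        # One LSTM over the concatenated daily feature vector (a simple early-fusion choice).
        self.lstm = nn.LSTM(activity_dim + engagement_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)  # regression head for next-day activity

    def forward(self, activity, engagement):
        # activity: (batch, days, activity_dim), engagement: (batch, days, engagement_dim)
        x = torch.cat([activity, engagement], dim=-1)
        _, (h_n, _) = self.lstm(x)    # h_n: (num_layers, batch, hidden_dim)
        return self.head(h_n[-1])     # (batch, 1) predicted next-day activity

# Usage sketch with hypothetical shapes: 8 users, a 7-day window,
# 1 activity feature and 3 engagement features per day.
model = MultimodalActivityForecaster()
past_activity = torch.randn(8, 7, 1)
past_engagement = torch.randn(8, 7, 3)
prediction = model(past_activity, past_engagement)  # shape (8, 1)
```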
- PAR ID: 10389477
- Date Published:
- Journal Name: 2022 IEEE-EMBS International Conference on Wearable and Implantable Body Sensor Networks (BSN)
- Page Range / eLocation ID: 1 to 4
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- It is not well understood why people continue to use privacy-invasive apps they consider creepy. We conducted a scenario-based study (n = 751) to investigate how the intention to use an app is influenced by affective perceptions and privacy concerns. We show that creepiness is one facet of affective discomfort, which is becoming normalized in app use. We found that affective discomfort can be negatively associated with the intention to use a privacy-invasive app. However, the influence is mitigated by other factors, including data literacy, views regarding app data practices, and ambiguity of the privacy threat. Our findings motivate a focus on affective discomfort when designing user experiences related to privacy-invasive data practices. Treating affective discomfort as a fundamental aspect of user experience requires scaling beyond the point where the thumb meets the screen and accounting for entrenched data practices and the sociotechnical landscape within which the practices are embedded.
- Chinn, C.; Tan, E.; Chan, C.; Kali, Y. (Eds.) From a design-based research study investigating rural families’ science learning with mobile devices, we share findings related to the intergenerational exploration of geological time concepts at a children’s garden at a university arboretum. The team developed a mobile augmented reality app, Time Explorers, focused on how millions of years of rock-water interactions shaped Appalachia. Data are recorded videos of app usage and interviews from 17 families (51 people); videos were transcribed, coded, and developed into qualitative case studies. We present results related to design elements that supported sensory engagement (e.g., observation, touch) through AR visualizations related to geological history. This analysis contributes to the literature on informal learning environments, theory related to learning-on-the-move, and the role of sensory engagement with AR experiences in outdoor learning.
- This work describes the design of real-time dance-based interaction with a humanoid robot, where the robot seeks to promote physical activity in children by taking on multiple roles as a dance partner. It acts as a leader by initiating dances but can also act as a follower by mimicking a child’s dance movements. Dances in the leader role are produced by a sequence-to-sequence (S2S) Long Short-Term Memory (LSTM) network trained on children’s music videos taken from YouTube. On the other hand, a music orchestration platform is implemented to generate background music in the follower mode as the robot mimics the child’s poses. In doing so, we also incorporated the largely unexplored paradigm of learning-by-teaching by including multiple robot roles that allow the child to both learn from and teach the robot. Our work is among the first to implement a largely autonomous, real-time, full-body dance interaction with a bipedal humanoid robot that also explores the impact of the robot roles on child engagement. Importantly, we also incorporated in our design formal constructs taken from autism therapy, such as the least-to-most prompting hierarchy, reinforcements for positive behaviors, and a time delay to make behavioral observations. We implemented a multimodal child engagement model that encompasses both affective engagement (displayed through eye-gaze focus and facial expressions) and task engagement (determined by the level of physical activity) to determine child engagement states (a loose illustrative sketch of such a model appears after this list). We then conducted a virtual exploratory user study to evaluate the impact of mixed robot roles on user engagement and found no statistically significant difference in the children’s engagement between single-role and multiple-role interactions. While the children were observed to respond positively to both robot behaviors, they preferred the music-driven leader role over the movement-driven follower role, a result that can partly be attributed to the virtual nature of the study. Our findings support the utility of such a platform in practicing physical activity but indicate that further research is necessary to fully explore the impact of each robot role.
- Battery-free sensing devices harvest energy from their surrounding environment to perform sensing, computation, and communication. This enables previously impossible applications in the Internet of Things. A core challenge for these devices is maintaining usefulness despite erratic, random, or irregular energy availability, which causes inconsistent execution, loss of service, and power failures. Adapting execution (degrading or upgrading it) seems promising as a way to stave off power failures, meet deadlines, or increase throughput. However, because of constrained resources and limited local information, it is a challenge to decide when would be the best time to adapt, and how exactly to adapt execution. In this paper, we systematically explore the fundamental mechanisms of energy-aware adaptation and propose heuristic adaptation as a method for modulating the performance of tasks to enable higher sensor coverage, completion rates, or throughput, depending on the application (an illustrative sketch of such a heuristic appears at the end of this list). We build a task-based adaptive runtime system for intermittently powered sensors that embodies this concept. We complement this runtime with a user-facing simulator that lets programmers conceptualize the tradeoffs they make when choosing what tasks to adapt, and how, relative to real-world energy-harvesting traces. While we target battery-free, intermittently powered sensors, we see general application to all energy-harvesting devices. We explore heuristic adaptation with varied energy-harvesting modalities and diverse applications: machine learning, activity recognition, and greenhouse monitoring. We find that the adaptive version of our ML app performs up to 46% more classifications with only a 5% drop in accuracy, that the activity recognition app captures 76% more classifications with only nominal down-sampling, and that heuristic adaptation yields higher throughput than the non-adaptive versions in all cases.