Title: BoostMeUp: Improving Cognitive Performance in the Moment by Unobtrusively Regulating Emotions with a Smartwatch
A person's emotional state can strongly influence their ability to achieve optimal task performance. Aiming to help individuals manage their feelings, different emotion regulation technologies have been proposed. However, despite the well-known influence that emotions have on task performance, no study to date has shown whether an emotion regulation technology can also enhance users' cognitive performance in the moment. In this paper, we present BoostMeUp, a smartwatch intervention designed to improve users' cognitive performance by regulating their emotions unobtrusively. Based on studies showing that people tend to perceive external signals that resemble heart rates as their own, the intervention provides personalized haptic feedback that simulates a different heart rate. Users can focus on their tasks while the intervention acts upon them in parallel, without requiring any additional action. The intervention was evaluated in an experiment with 72 participants, who took math tests under high pressure. Participants exposed to slow haptic feedback during the tests decreased their anxiety, increased their heart rate variability and performed better on the math tests, while fast haptic feedback led to the opposite effects. These results indicate that the BoostMeUp intervention can lead to positive cognitive, physiological and behavioral changes.
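To make the pacing mechanism concrete, the sketch below shows one way slow or fast haptic feedback could be driven from a measured heart rate. It is a minimal illustration only, not the authors' implementation; the vibrate callback, the scaling factors, and the fixed duration are assumptions introduced for the example.

import time

def paced_haptic_feedback(measured_bpm, scale, vibrate, duration_s=60):
    """Emit haptic pulses at a rate offset from the user's measured heart rate.

    measured_bpm -- current heart rate in beats per minute
    scale        -- <1.0 simulates a slower heart rate, >1.0 a faster one
    vibrate      -- callback that triggers one short vibration pulse
                    (stands in for a smartwatch haptic API, assumed here)
    """
    target_bpm = measured_bpm * scale      # simulated heart rate
    interval_s = 60.0 / target_bpm         # seconds between pulses
    stop_at = time.time() + duration_s
    while time.time() < stop_at:
        vibrate()                          # one haptic "beat"
        time.sleep(interval_s)

# Example: pace a 75 bpm resting rate down by 20%, i.e. pulses at 60 bpm
# paced_haptic_feedback(75, scale=0.8, vibrate=lambda: print("buzz"))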
Award ID(s):
1840025
PAR ID:
10603606
Author(s) / Creator(s):
 ;  ;  ;  
Publisher / Repository:
Association for Computing Machinery (ACM)
Date Published:
Journal Name:
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Volume:
3
Issue:
2
ISSN:
2474-9567
Format(s):
Medium: X
Size(s):
p. 1-23
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Emotion regulation can be characterized by different activities that attempt to alter an emotional response, whether behavioral, physiological or neurological. The two most widely adopted strategies, cognitive reappraisal and expressive suppression, are explored in this study, specifically in the context of disgust. Study participants (N = 21) experienced disgust via video exposure and were instructed to either regulate their emotions or express them freely. If regulating, they were required to either cognitively reappraise or suppress their emotional experiences while viewing the videos. Video recordings of the participants' faces were taken during the experiment, and electrocardiogram (ECG), electromyography (EMG), and galvanic skin response (GSR) readings were also collected for further analysis. We compared the participants' behavioral (facial musculature movements) and physiological (GSR and heart rate) responses as they aimed to alter their emotional responses, and computationally determined that when responding to disgust stimuli, the signals recorded during suppression and free expression were very similar, whereas those recorded during cognitive reappraisal were significantly different. Thus, in the context of this study, from a signal analysis perspective, we conclude that emotion regulation via cognitive reappraisal significantly alters participants' physiological responses to disgust, unlike regulation via suppression.
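    As a concrete illustration of the kind of computational signal comparison described above, the sketch below contrasts two equal-length physiological traces using a Pearson correlation and a Euclidean distance. The arrays and values are made up for the example; this is not the study's actual analysis pipeline.

import numpy as np
from scipy.stats import pearsonr

def compare_traces(sig_a, sig_b):
    """Compare two equal-length physiological traces (e.g., GSR or heart rate)
    recorded under different regulation conditions."""
    r, p = pearsonr(sig_a, sig_b)                                        # similarity of signal shape
    dist = float(np.linalg.norm(np.asarray(sig_a) - np.asarray(sig_b)))  # overall difference
    return {"pearson_r": r, "p_value": p, "euclidean_distance": dist}

# Hypothetical condition-averaged GSR traces (arbitrary units)
rng = np.random.default_rng(0)
free_expression = rng.normal(1.0, 0.10, 500)
suppression     = free_expression + rng.normal(0.0, 0.02, 500)   # nearly identical
reappraisal     = rng.normal(0.7, 0.10, 500)                     # clearly different

print(compare_traces(free_expression, suppression))
print(compare_traces(free_expression, reappraisal))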
  2. As the influence of social robots in people's daily lives grows, research on understanding people's perception of robots, including sociability, trust, acceptance, and preference, becomes more pervasive. Research has considered visual, vocal, or tactile cues to express robots' emotions, whereas little research has provided a holistic view of the interactions among the different factors influencing emotion perception. We investigated multiple facets of user perception of robots during a conversational task by varying the robots' voice types, appearances, and emotions. In our experiment, 20 participants interacted with two robots having four different voice types. While participants read fairy tales to the robot, the robot gave vocal feedback with seven emotions, and the participants evaluated the robot's profiles through post-study surveys. The results indicate that (1) the accuracy of emotion perception differed depending on the presented emotion, (2) a regular human voice showed higher user preference and naturalness, (3) a characterized voice was nevertheless more appropriate for expressing emotions, with significantly higher accuracy in emotion perception, and (4) participants showed significantly higher emotion recognition accuracy with the animal robot than with the humanoid robot. A follow-up study ([Formula: see text]) with voice-only conditions confirmed the importance of embodiment. The results from this study could provide the guidelines needed to design social robots that consider emotional aspects in conversations between robots and users.
  3. Traditionally, lust and pride have been considered pleasurable, yet sinful in the West. Conversely, guilt is often considered aversive, yet valuable. These emotions illustrate how evaluations about specific emotions and beliefs about their hedonic properties may often diverge. Evaluations about specific emotions may shape important aspects of emotional life (e.g. in emotion regulation, emotion experience and acquisition of emotion concepts). Yet these evaluations are often understudied in affective neuroscience. Prior work in emotion regulation, affective experience, evaluation/attitudes and decision-making points to anterior prefrontal areas as candidates for supporting evaluative emotion knowledge. Thus, we examined the brain areas associated with evaluative and hedonic emotion knowledge, with a focus on the anterior prefrontal cortex. Participants (N = 25) made evaluative and hedonic ratings about emotion knowledge during functional magnetic resonance imaging (fMRI). We found that greater activity in the medial prefrontal cortex (mPFC), ventromedial PFC (vmPFC) and precuneus was associated with an evaluative (vs hedonic) focus on emotion knowledge. Our results suggest that the mPFC and vmPFC, in particular, may play a role in evaluating discrete emotions.
  4. In this paper, we design and evaluate a novel form of visually-simulated haptic feedback cue for communicating weight in robot teleoperation. We propose that a visuo-proprioceptive cue results from inconsistencies created between the user's visual and proprioceptive senses when the robot's movement differs from the movement of the user's input. In a user study where participants teleoperate a six-DoF robot arm, we demonstrate the feasibility of using such a cue for communicating weight in four telemanipulation tasks to enhance user experience and task performance. 
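    One simple way to induce the visual/proprioceptive mismatch described above is to scale the robot's commanded motion relative to the operator's input in proportion to the simulated payload weight, so that heavier objects make the arm visibly lag the hand. The sketch below only illustrates that idea; the gain mapping and parameter values are assumptions, not the paper's controller.

def weight_cue_gain(payload_kg, max_payload_kg=5.0, min_gain=0.4):
    """Map a simulated payload weight to a motion gain in [min_gain, 1.0].
    Heavier payloads give smaller gains, so the robot's motion lags the
    operator's input and creates a visuo-proprioceptive inconsistency."""
    frac = min(max(payload_kg / max_payload_kg, 0.0), 1.0)
    return 1.0 - frac * (1.0 - min_gain)

def scaled_robot_delta(user_delta_xyz, payload_kg):
    """Scale the operator's commanded Cartesian displacement by the weight gain."""
    gain = weight_cue_gain(payload_kg)
    return [gain * d for d in user_delta_xyz]

# Example: a 10 cm hand motion while "holding" a 4 kg object moves the arm ~5.2 cm
print(scaled_robot_delta([0.10, 0.0, 0.0], payload_kg=4.0))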
  5.
    Background: With nearly 20% of the US adult population using fitness trackers, there is an increasing focus on how physiological data from these devices can provide actionable insights about workplace performance. However, in-the-wild studies that examine how these metrics correlate with cognitive performance measures across a diverse population are lacking, and claims made by device manufacturers are vague. While there has been extensive research leading to a variety of theories on how physiological measures affect cognitive performance, virtually all such studies have been conducted in highly controlled settings, and their validity in the real world is poorly understood.
    Objective: We seek to bridge this gap by evaluating prevailing theories on the effects of a variety of sleep, activity, and heart rate parameters on cognitive performance against data collected in real-world settings.
    Methods: We used a Fitbit Charge 3 and a smartphone app to collect physiological and neurobehavioral task data, respectively, as part of our 6-week-long in-the-wild study. We collected data from 24 participants across multiple population groups (shift workers, regular workers, and graduate students) on different performance measures (vigilant attention and cognitive throughput). Simultaneously, we used the fitness tracker to unobtrusively obtain physiological measures that could influence these performance measures, including over 900 nights of sleep and over 1 million minutes of heart rate and physical activity metrics. We performed a repeated measures correlation (r_rm) analysis to investigate which sleep and physiological markers show association with each performance measure. We also report how our findings relate to existing theories and previous observations from controlled studies.
    Results: Daytime alertness was found to be significantly correlated with total sleep duration on the previous night (r_rm = 0.17, P < .001) as well as the duration of rapid eye movement sleep (r_rm = 0.12, P < .001) and light sleep (r_rm = 0.15, P < .001). Cognitive throughput, by contrast, was not significantly correlated with sleep duration but with sleep timing: a circadian phase shift toward a later sleep time corresponded with lower cognitive throughput on the following day (r_rm = -0.13, P < .001). Both measures show circadian variations, but only alertness showed a decline (r_rm = -0.1, P < .001) as a result of homeostatic pressure. Both heart rate and physical activity correlate positively with alertness as well as cognitive throughput.
    Conclusions: Our findings reveal significant differences in which sleep-related physiological metrics influence each of the 2 performance measures. This makes the case for more targeted in-the-wild studies investigating how physiological measures from self-tracking data influence, or can be used to predict, specific aspects of cognitive performance.
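    A repeated measures correlation of the kind described above can be computed on similar long-format data with the pingouin package; the sketch below is only a usage illustration with made-up participants and values, not the study's dataset or code.

import pandas as pd
import pingouin as pg

# Hypothetical long-format data: one row per participant-day
df = pd.DataFrame({
    "participant":   ["p01"] * 3 + ["p02"] * 3 + ["p03"] * 3,
    "sleep_minutes": [410, 455, 380, 500, 430, 470, 360, 420, 440],
    "alertness":     [0.62, 0.71, 0.55, 0.80, 0.66, 0.74, 0.50, 0.61, 0.65],
})

# Repeated measures correlation between nightly sleep duration and next-day
# alertness, controlling for between-participant differences
result = pg.rm_corr(data=df, x="sleep_minutes", y="alertness", subject="participant")
print(result[["r", "dof", "pval"]])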