Title: Survey on Physiological Computing in Human–Robot Collaboration
Human–robot collaboration has emerged as a prominent research topic in recent years. To enhance collaboration and ensure safety between humans and robots, researchers employ a variety of methods. One such method is physiological computing, which aims to estimate a human's psycho-physiological state by measuring physiological signals such as galvanic skin response (GSR), electrocardiogram (ECG), heart rate variability (HRV), and electroencephalogram (EEG). This information is then used to provide feedback to the robot. In this paper, we present the latest state-of-the-art methods in physiological computing for human–robot collaboration. Our goal is to provide a comprehensive guide for new researchers to understand the commonly used physiological signals, data collection methods, and data labeling techniques. Additionally, we have categorized and tabulated relevant research to further aid in understanding this area of study.
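Of the signals listed above, HRV is derived rather than measured directly: it is computed from the intervals between successive heartbeats (RR intervals) detected in the ECG. The sketch below shows one common time-domain HRV metric, RMSSD, computed in plain Python; the RR-interval values are synthetic and for illustration only.

```python
# Sketch: RMSSD, a common time-domain HRV metric, computed from a
# list of RR intervals in milliseconds. The interval values below
# are synthetic, not taken from any dataset in this survey.
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 805, 830, 815, 800]  # synthetic RR intervals (ms)
print(round(rmssd(rr), 2))  # → 18.89
```

Lower RMSSD is typically associated with higher stress or workload, which is why HRV is a popular input for estimating psycho-physiological state.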
Award ID(s):
2125362
PAR ID:
10421888
Author(s) / Creator(s):
Date Published:
Journal Name:
Machines
Volume:
11
Issue:
5
ISSN:
2075-1702
Page Range / eLocation ID:
536
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    This paper introduces a new ROSbag-based multimodal affective dataset for emotional and cognitive states generated using the Robot Operating System (ROS). We utilized images and sounds from the International Affective Pictures System (IAPS) and the International Affective Digitized Sounds (IADS) to stimulate targeted emotions (happiness, sadness, anger, fear, surprise, disgust, and neutral), and a dual N-back game to stimulate different levels of cognitive workload. Thirty human subjects participated in the user study; their physiological data were collected using the latest commercial wearable sensors, behavioral data were collected using hardware devices such as cameras, and subjective assessments were carried out through questionnaires. All data were stored in single ROSbag files rather than in conventional Comma-Separated Values (CSV) files. This not only ensures synchronization of signals and videos within the dataset, but also allows researchers to easily analyze and verify their algorithms by connecting directly to this dataset through ROS. The generated affective dataset consists of 1,602 ROSbag files, and the size of the dataset is about 787 GB. The dataset is made publicly available. We expect that our dataset can be a great resource for many researchers in the fields of affective computing, Human-Computer Interaction (HCI), and Human-Robot Interaction (HRI).
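The dual N-back task mentioned above induces workload by asking participants to flag, in each of two simultaneous streams (e.g., spatial positions and spoken letters), any stimulus that matches the one presented n steps earlier. A minimal sketch of that matching rule, with illustrative streams that are not taken from the dataset:

```python
# Sketch of the dual N-back matching rule: a target occurs at
# position i of a stream when stream[i] equals stream[i - n].
# Each stream is checked independently; the values are toy data.
def n_back_targets(stream, n):
    """Indices where the stimulus matches the one n steps back."""
    return [i for i in range(n, len(stream)) if stream[i] == stream[i - n]]

positions = [1, 3, 1, 3, 2, 3]              # spatial stream
letters = ["A", "B", "A", "C", "A", "C"]    # auditory stream
print(n_back_targets(positions, 2))  # → [2, 3, 5]
print(n_back_targets(letters, 2))    # → [2, 4, 5]
```

Raising n (1-back, 2-back, 3-back, ...) raises working-memory load, which is how the study varies cognitive workload levels.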
  2.
    Abstract: This study presents robot-based rehabilitation and its assessment. Robotic devices have proven useful in helping therapists deliver training procedures consistently. However, because robotic devices interface with humans, quantifying the interaction and its intended outcomes remains a research challenge. In this study, human–robot interaction during rehabilitation is assessed from measurable interaction forces and human physiological response data, and correlations are established to plan the intervention and effective limb trajectories within the intended rehabilitation and interaction forces. The Universal Robot 5 (UR5) is used to guide and support the arm of a subject over a predefined trajectory while muscle activity is recorded as surface electromyography (sEMG) signals using DELSYS Trigno wireless devices. The interaction force is measured through a force sensor mounted on the robot end-effector. The force signals and the human physiological data are analyzed and classified to infer the related progress. Feature reduction and selection techniques are used to identify redundant inputs and outputs.
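Before classification and feature selection, sEMG is typically reduced to time-domain features computed over short analysis windows. Two standard ones, root mean square (RMS) and mean absolute value (MAV), are sketched below with synthetic sample values; the abstract does not specify which features the study used, so this is illustrative only.

```python
# Sketch of two standard time-domain sEMG features computed over
# one analysis window. Sample values are synthetic.
import math

def rms(window):
    """Root mean square: signal power within the window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def mav(window):
    """Mean absolute value: average rectified amplitude."""
    return sum(abs(x) for x in window) / len(window)

window = [0.02, -0.05, 0.11, -0.08, 0.04]  # synthetic sEMG samples (mV)
print(round(rms(window), 4))  # → 0.0678
print(round(mav(window), 4))  # → 0.06
```

Such per-window features, stacked across channels, form the input vectors that feature-reduction and selection methods then prune.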
  3. We present the Human And Robot Multimodal Observations of Natural Interactive Collaboration (HARMONIC) dataset. This is a large multimodal dataset of human interactions with a robotic arm in a shared autonomy setting designed to imitate assistive eating. The dataset provides human, robot, and environmental data views of 24 different people engaged in an assistive eating task with a 6-degree-of-freedom (6-DOF) robot arm. From each participant, we recorded video of both eyes, egocentric video from a head-mounted camera, joystick commands, electromyography from the forearm used to operate the joystick, third-person stereo video, and the joint positions of the 6-DOF robot arm. Also included are several features that come as a direct result of these recordings, such as eye gaze projected onto the egocentric video, body pose, hand pose, and facial keypoints. These data streams were collected specifically because they have been shown to be closely related to human mental states and intention. This dataset could be of interest to researchers studying intention prediction, human mental state modeling, and shared autonomy. Data streams are provided in a variety of formats such as video and human-readable CSV and YAML files. 
  4. Effective collaboration between humans and robots necessitates that the robotic partner can perceive, learn from, and respond to the human's psycho-physiological conditions. This involves understanding the emotional states of the human collaborator. To explore this, we collected subjective assessments (specifically, feelings of surprise, anxiety, boredom, calmness, and comfort) as well as physiological signals during a dynamic human-robot interaction experiment. The experiment manipulated the robot's behavior to observe these responses. We gathered data from this non-stationary setting and trained an artificial neural network model to predict human emotion from physiological data. We found that training a general model on several subjects' data and then fine-tuning it on the subject of interest performs better than training a model using only that subject's data.
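The pretrain-then-fine-tune protocol described above hinges on how data is split: all other subjects supply the pretraining set, and a small portion of the target subject's data is held out for fine-tuning. A minimal sketch of that split, with the model training itself stubbed out; the subject IDs and the 20% fine-tuning fraction are illustrative assumptions, not values from the paper.

```python
# Sketch of the subject-transfer data split: pretrain a general
# model on all other subjects, fine-tune on a fraction of the
# target subject's data, and test on the remainder.
def subject_transfer_split(data_by_subject, target, finetune_frac=0.2):
    """Return (pretrain, finetune, test) sample lists for one target subject."""
    pretrain = [x for s, xs in data_by_subject.items() if s != target for x in xs]
    target_data = data_by_subject[target]
    k = int(len(target_data) * finetune_frac)
    return pretrain, target_data[:k], target_data[k:]

# Toy example with three hypothetical subjects of 10 samples each.
data = {"s1": list(range(10)), "s2": list(range(10, 20)), "s3": list(range(20, 30))}
pre, ft, te = subject_transfer_split(data, target="s3")
print(len(pre), len(ft), len(te))  # → 20 2 8
```

In practice the fine-tuning set would be ordered chronologically (or stratified by condition) rather than taken as a simple prefix, but the partitioning logic is the same.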
  5. Researchers in human–robot collaboration have extensively studied methods for inferring human intentions and predicting their actions, as this is an important precursor for robots to provide useful assistance. We review contemporary methods for intention inference and human activity prediction. Our survey finds that intentions and goals are often inferred via Bayesian posterior estimation and Markov decision processes that model internal human states as unobserved variables or represent both agents in a shared probabilistic framework. An alternative approach is to use neural networks and other supervised learning approaches to directly map observable outcomes to intentions and to make predictions about future human activity based on past observations. That said, due to the complexity of human intentions, existing work usually reasons about limited domains, makes unrealistic simplifications about intentions, and is mostly constrained to short-term predictions. This state of the art provides opportunities for future research, which could develop more nuanced models of intent, reason over longer horizons, and account for the human tendency to adapt.
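The Bayesian posterior estimation mentioned above reduces, in the discrete-goal case, to repeatedly reweighting a belief over candidate goals by the likelihood of each new observation and renormalizing. A toy sketch, where the goals, likelihood model, and observations are all illustrative assumptions rather than any cited system:

```python
# Sketch of discrete Bayesian goal inference: one Bayes step
# computes posterior(goal) ∝ prior(goal) * P(observation | goal).
def update_belief(belief, likelihood, observation):
    """Return the normalized posterior after one observation."""
    posterior = {g: p * likelihood(observation, g) for g, p in belief.items()}
    z = sum(posterior.values())
    return {g: p / z for g, p in posterior.items()}

# Toy likelihood: observations are noisy labels of the true goal.
def likelihood(obs, goal):
    return 0.8 if obs == goal else 0.2

belief = {"cup": 0.5, "plate": 0.5}   # uniform prior over two goals
for obs in ["cup", "cup"]:            # two observations favoring "cup"
    belief = update_belief(belief, likelihood, obs)
print(round(belief["cup"], 3))  # → 0.941
```

Treating the goal as an unobserved variable and updating it from human actions in this way is the common core of the Bayesian approaches the survey reviews; the MDP-based methods additionally model how actions depend on the goal through a policy.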