Title: A Preliminary Investigation into Learning Behaviors in Complex Environments for Human-in-the-Loop Cyber-Physical Systems
The field of Cyber-Physical Systems (CPS) increasingly recognizes the importance of integrating human factors into Human-in-the-Loop CPS (HiLCPS) development, because the psychological, physiological, and behavioral characteristics of humans can be used to predict human-machine interactions. The goal of this pilot study was to collect initial data to determine whether driving and eye-tracking metrics can provide evidence of learning for a CPS project. Six participants performed a series of 12 repeated obstacle-avoidance tasks in manual driving, and lane deviations and fixation-related eye data were recorded for each trial. Overall, participants displayed either conservative/safe or aggressive/risky behavior in their lateral position with respect to the obstacle during successive trials. Eye-tracking metrics were not significantly affected by trial number, but observational trends suggest their potential for aiding in understanding the adjustments humans make while learning. Results can inform predictive modeling algorithms that anticipate and mitigate potential problems in real time.
Award ID(s): 1836952
PAR ID: 10344018
Journal Name: Proceedings of the Human Factors and Ergonomics Society Annual Meeting
Volume: 65
Issue: 1
ISSN: 2169-5067
Page Range / eLocation ID: 42 to 46
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Advanced systems that require shared control are becoming increasingly pervasive. One advantage of a shared-control approach is that the human and machine work together to accomplish safe operations; however, data about the human are needed to implement successful strategies. The goal of this study was to quantify naturalistic driving by collecting performance and physiological data during manual, open-loop driving. Sixteen participants performed a single drive that included four sudden obstacles of increasing difficulty (road debris, construction, inclement weather, and an animal) and were asked to traverse each obstacle using their own judgment and strategies. Action selection, lane deviation, speed, and heart rate data were recorded. Results showed two distinct driving strategies for avoiding the moving obstacle/animal (left vs. right lane navigation). Maximum speed was affected by obstacle type, but heart rate variability was not. Results can be used to inform shared-control algorithms designed to combat poor driving performance.
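The abstract above reports heart rate variability as an outcome but does not name the specific metric. As a minimal illustrative sketch, RMSSD (root mean square of successive RR-interval differences) is one standard time-domain choice; the RR-interval values below are invented for illustration, not study data.

```python
from math import sqrt

def rmssd(rr_intervals_ms):
    """RMSSD: root mean square of successive RR-interval differences (ms),
    a standard time-domain heart rate variability metric."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical RR intervals (ms) from a short recording window
baseline = [800, 810, 790, 805, 795]
hrv = rmssd(baseline)  # ≈ 14.4 ms
```

Comparing such a value across obstacle encounters is one way a study could test whether obstacle type affects heart rate variability.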
  2. Abstract As technology advances, Human-Robot Interaction (HRI) is boosting overall system efficiency and productivity. However, allowing robots to operate in close proximity to humans inevitably places higher demands on precise human motion tracking and prediction. Datasets in which humans and robots operate in a shared space are receiving growing attention because they can facilitate a variety of robotics and human-systems research, yet datasets that track HRI during daily activities with rich information beyond video images are rare. In this paper, we introduce a novel dataset that focuses on social navigation between humans and robots in a future-oriented Wholesale and Retail Trade (WRT) environment (https://uf-retail-cobot-dataset.github.io/). Eight participants performed tasks commonly undertaken by consumers and retail workers. More than 260 minutes of data were collected, including robot and human trajectories, human full-body motion capture, eye gaze directions, and other contextual information. Comprehensive descriptions of each category of data stream, as well as potential use cases, are included. Furthermore, analysis with multiple data sources and future directions are discussed.
  3. Abstract Effective interactions between humans and robots are vital to achieving shared tasks in collaborative processes. Robots can utilize diverse communication channels to interact with humans, such as hearing, speech, sight, touch, and learning. Our focus, amidst the various means of interaction between humans and robots, is on three emerging frontiers that significantly impact the future directions of human–robot interaction (HRI): (i) human–robot collaboration inspired by human–human collaboration, (ii) brain-computer interfaces, and (iii) emotionally intelligent perception. First, we explore advanced techniques for human–robot collaboration, covering a range of methods from compliance and performance-based approaches to synergistic and learning-based strategies, including learning from demonstration, active learning, and learning from complex tasks. Then, we examine innovative uses of brain-computer interfaces for enhancing HRI, with a focus on applications in rehabilitation, communication, and brain state and emotion recognition. Finally, we investigate emotional intelligence in robotics, focusing on translating human emotions to robots via facial expressions, body gestures, and eye tracking for fluid, natural interactions. Recent developments in these emerging frontiers and their impact on HRI are detailed and discussed. We highlight contemporary trends and emerging advancements in the field. Ultimately, this paper underscores the necessity of a multimodal approach in developing systems capable of adaptive behavior and effective interaction between humans and robots, thus offering a thorough understanding of the diverse modalities essential for maximizing the potential of HRI.
  4. Trust calibration poses a significant challenge in the interaction between drivers and automated vehicles (AVs) in the context of human-automation collaboration. To calibrate trust effectively, it is crucial to measure drivers' trust levels accurately in real time, allowing timely interventions or adjustments in the automated driving. One viable approach is to model dynamic changes in trust using machine learning models and physiological measures. This study introduces a technique that leverages machine learning models to predict drivers' real-time dynamic trust in conditional AVs from physiological measurements. We conducted the study in a driving simulator where participants were requested to take over control from automated driving in three conditions: a control condition, a false alarm condition, and a miss condition. Each condition had eight takeover requests (TORs) in different scenarios. Drivers' physiological measures were recorded during the experiment, including galvanic skin response (GSR), heart rate (HR) indices, and eye-tracking metrics. Among five machine learning models, eXtreme Gradient Boosting (XGBoost) performed best, predicting drivers' trust in real time with an F1-score of 89.1%, compared with 84.5% for a baseline K-nearest neighbor classifier. Our findings have implications for the design of in-vehicle trust monitoring systems that calibrate drivers' trust to facilitate real-time interaction between the driver and the AV.
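The abstract above compares classifiers by F1-score. As a quick reference, the F1-score is the harmonic mean of precision and recall; a minimal sketch follows, with invented labels for illustration (not study data, and not the study's actual model output).

```python
def f1_score(y_true, y_pred, positive=1):
    """F1 = harmonic mean of precision and recall for the positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Hypothetical per-TOR trust labels (1 = high trust) vs. model predictions
y_true = [1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1]
score = f1_score(y_true, y_pred)
```

In practice a library implementation (e.g. scikit-learn's `f1_score`) would be used; the point here is only what the reported 89.1% figure measures.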
  5. This study aims to systematically evaluate the use of social network analysis (SNA) metrics to measure eye-tracking behavior to assess and predict student learning performance. We integrated 11 network metrics from published research and tested them on six eye-tracking datasets. Our preliminary results indicate that no consistent predictor variable can effectively predict student performance across different datasets. The number of nodes, edges, reciprocity, and entropy measures contribute differently to predicting students’ performance. This work deepens our understanding of how different SNA metrics relate to eye-tracking data and advances the methodological framework to predict learning outcomes. 
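The abstract above names node count, edge count, reciprocity, and entropy among its network metrics. A minimal sketch of how such metrics can be computed from eye-tracking data, assuming fixations have already been mapped to areas of interest (AOIs) and consecutive fixations define directed transitions; the AOI names and sequence below are invented for illustration.

```python
from collections import Counter
from math import log2

def transition_network_metrics(aoi_sequence):
    """Build a directed fixation-transition network from an AOI sequence
    and compute simple count-based network metrics."""
    edges = Counter(zip(aoi_sequence, aoi_sequence[1:]))  # directed transitions
    nodes = set(aoi_sequence)
    # Reciprocity: fraction of directed edges whose reverse edge also occurs
    reciprocal = sum(1 for (a, b) in edges if (b, a) in edges)
    reciprocity = reciprocal / len(edges) if edges else 0.0
    # Shannon entropy of the transition frequency distribution
    total = sum(edges.values())
    entropy = -sum((c / total) * log2(c / total) for c in edges.values())
    return {"nodes": len(nodes), "edges": len(edges),
            "reciprocity": reciprocity, "entropy": entropy}

# Hypothetical fixation sequence over three AOIs on a learning task
seq = ["text", "figure", "text", "question", "text", "figure"]
metrics = transition_network_metrics(seq)
```

Metric definitions vary across the SNA literature (e.g. weighted vs. unweighted reciprocity), which may be one reason their predictive contributions differ across datasets.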