Development of responsive automation necessitates a framework for studying human-automation interactions in a broad range of operating conditions. This study uses a novel experiment design involving multiple binary perturbations in different stimuli to elicit measurable changes in cognitive factors that affect human decision-making during conditionally automated (SAE Level 3) driving: trust in automation, mental workload, self-confidence, and risk perception. To infer changes in these factors, psychophysiological metrics such as heart rate variability and galvanic skin response, behavioral metrics such as eye gaze and reliance on automation, and self-reports were collected. Findings from statistical tests revealed significant changes, particularly in psychophysiological and behavioral metrics, for some treatments. However, other treatments did not elicit a significant change, highlighting the complexities of a between-subjects experiment design with variations in multiple independent variables. Findings also underscore the importance of collecting heterogeneous human data to infer changes in cognitive factors during interactions with automation.
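The abstract names heart rate variability (HRV) as one of the psychophysiological metrics, though it does not specify which HRV statistic was used. A minimal sketch of one common time-domain choice, RMSSD computed from RR intervals, is shown below; the function name and the millisecond units are assumptions for illustration, not details from the study.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between consecutive
    RR intervals (ms) -- a standard time-domain HRV metric. Lower RMSSD
    is often associated with higher stress or workload."""
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    # Successive differences between adjacent beats
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

For example, `rmssd([800, 810, 790, 805])` averages the squared successive differences (10, -20, 15) before taking the square root.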
Autonomous systems that can assist humans with increasingly complex tasks are becoming ubiquitous. Moreover, it has been established that a human's decision to rely on such systems is a function of both their trust in the system and their own self-confidence as it relates to executing the task of interest. Given that both under- and over-reliance on automation can pose significant risks to humans, there is motivation for developing autonomous systems that could appropriately calibrate a human's trust or self-confidence to achieve proper reliance behavior. In this article, a computational model of coupled human trust and self-confidence dynamics is proposed. The dynamics are modeled as a partially observable Markov decision process without a reward function (POMDP/R) that leverages behavioral and self-report data as observations for estimation of these cognitive states. The model is trained and validated using data collected from 340 participants. Analysis of the transition probabilities shows that the proposed model captures the probabilistic relationship between trust, self-confidence, and reliance for all discrete combinations of high and low trust and self-confidence. The use of the proposed model to design an optimal policy to facilitate trust and self-confidence calibration is a goal of future work.
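The model described above has four discrete hidden states (high/low trust crossed with high/low self-confidence) and uses reliance behavior as an observation. A minimal sketch of the corresponding belief-filtering step is given below; all probability values are illustrative placeholders, not the transition or observation probabilities fitted in the article.

```python
# Hidden states: (trust, self-confidence), each high or low
STATES = [("low", "low"), ("low", "high"), ("high", "low"), ("high", "high")]
OBS = ["rely", "self"]  # rely on automation vs. perform the task manually

# T[i][j] = P(next state j | current state i); rows sum to 1 (placeholder values)
T = [
    [0.7, 0.1, 0.1, 0.1],
    [0.1, 0.7, 0.1, 0.1],
    [0.1, 0.1, 0.7, 0.1],
    [0.1, 0.1, 0.1, 0.7],
]
# O[j][k] = P(observation k | state j); e.g. high trust + low self-confidence
# makes reliance on automation most likely (placeholder values)
O = [
    [0.5, 0.5],
    [0.2, 0.8],
    [0.9, 0.1],
    [0.6, 0.4],
]

def belief_update(belief, obs_idx):
    """One POMDP filtering step: predict through T, weight by the
    observation likelihood O, then normalize to a distribution."""
    predicted = [sum(belief[i] * T[i][j] for i in range(4)) for j in range(4)]
    unnorm = [O[j][obs_idx] * predicted[j] for j in range(4)]
    z = sum(unnorm)
    return [u / z for u in unnorm]
```

Starting from a uniform belief, observing "rely" shifts probability mass toward the (high trust, low self-confidence) state, mirroring the trust-reliance relationship the abstract describes. Note the model is a POMDP/R, so no reward term appears; only the transition and observation components are used for state estimation.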