

This content will become publicly available on September 18, 2025

Title: The Impact of Cybersecurity Attacks on Human Trust in Autonomous Vehicle Operations
Objective

This study examines the extent to which cybersecurity attacks on autonomous vehicles (AVs) affect human trust dynamics and driver behavior.

Background

Human trust is critical for the adoption and continued use of AVs. A pressing concern in this context is the persistent threat of cyberattacks, which endanger the secure operation of AVs and, consequently, human trust.

Method

A driving simulator experiment was conducted with 40 participants who were randomly assigned to one of two groups: (1) Experience and Feedback and (2) Experience-Only. All participants experienced three drives: Baseline, Attack, and Post-Attack. The Attack Drive prevented participants from properly operating the vehicle on multiple occasions. Only the “Experience and Feedback” group received a security update in the Post-Attack Drive, which addressed the mitigation of the vehicle’s vulnerability. Trust and foot positions were recorded for each drive.

Results

Findings suggest that attacks on AVs significantly degrade human trust, and that trust remains degraded even after an error-free drive. Providing an update about the mitigation of the vulnerability did not significantly affect trust repair.

Conclusion

Trust toward AVs should be analyzed as an emergent and dynamic construct that requires autonomous systems capable of calibrating trust after malicious attacks through appropriate experience and interaction design.

Application

The results of this study can be applied when building driver and situation-adaptive AI systems within AVs.

 
PAR ID: 10542734
Author(s) / Creator(s):
Publisher / Repository: SAGE Publications
Date Published:
Journal Name: Human Factors: The Journal of the Human Factors and Ergonomics Society
ISSN: 0018-7208
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Objective

    This study investigated the impact of driving styles of drivers and automated vehicles (AVs) on drivers’ perception of automated driving maneuvers and quantified the relationships among drivers’ perception of AV maneuvers, driver trust, and acceptance of AVs.

    Background

    Previous studies on automated driving styles focused on the impact of AV’s global driving style on driver’s attitude and driving performance. However, research on drivers’ perception of automated driving maneuvers at the specific driving style level is still lacking.

    Method

    Sixteen aggressive drivers and sixteen defensive drivers were recruited to experience twelve driving scenarios in either an aggressive AV or a defensive AV on the driving simulator. Their perception of AV maneuvers, trust, and acceptance was measured via questionnaires, and driving performance was collected via the driving simulator.

    Results

    Results revealed that drivers’ trust and acceptance of AVs would decrease significantly if they perceived AVs to have a higher speed, larger deceleration, smaller deceleration, or shorter stopping distance than expected. Moreover, defensive drivers perceived significantly greater inappropriateness of these maneuvers from aggressive AVs than from defensive AVs, whereas aggressive drivers’ perceived inappropriateness of these maneuvers did not differ significantly between the two AV driving styles.

    Conclusion

    The driving styles of automated vehicles and drivers influenced drivers’ perception of automated driving maneuvers, which in turn influenced their trust and acceptance of AVs.

    Application

    This study suggested that the design of AVs should consider drivers’ perceptions of automated driving maneuvers to avoid undermining drivers’ trust and acceptance of AVs.

     
  2. Autonomous Vehicle (AV) technology has the potential to significantly improve driver safety. Unfortunately, drivers could be reluctant to ride in AVs due to a lack of trust in and acceptance of AVs’ driving styles. The present study investigated the impact of drivers’ driving styles (aggressive/defensive) and the designed driving styles of AVs (aggressive/defensive) on drivers’ trust, acceptance, and takeover behavior in fully autonomous vehicles. Thirty-two participants were classified into two groups based on their driving styles using the Aggressive Driving Scale and experienced twelve scenarios in either an aggressive AV or a defensive AV. Results revealed that drivers’ trust, acceptance, and takeover frequency were significantly influenced by the interaction effects between the AV’s driving style and the driver’s driving style. The findings implied that drivers’ individual differences should be considered in the design of AVs’ driving styles to enhance drivers’ trust and acceptance of AVs and to reduce undesired takeover behaviors.
  3. Multi-sensor fusion has been widely used by autonomous vehicles (AVs) to integrate the perception results from different sensing modalities, including LiDAR, camera, and radar. Despite the rapid development of multi-sensor fusion systems in autonomous driving, their vulnerability to malicious attacks has not been well studied. Although some prior works have studied attacks against the perception systems of AVs, they only consider a single sensing modality or a camera-LiDAR fusion system, so they cannot attack a sensor fusion system based on LiDAR, camera, and radar. To fill this research gap, in this paper we present the first study on the vulnerability of multi-sensor fusion systems that employ LiDAR, camera, and radar. Specifically, we propose a novel attack method that can simultaneously attack all three types of sensing modalities using a single type of adversarial object. The adversarial object can be easily fabricated at low cost, and the proposed attack can be easily performed with high stealthiness and flexibility in practice. Extensive experiments based on a real-world AV testbed show that the proposed attack can continuously hide a target vehicle from the perception system of a victim AV using only two small adversarial objects.
  4. Objective

    We examine how human operators adjust their trust in automation as a result of their moment-to-moment interaction with automation.

    Background

    Most existing studies measured trust by administering questionnaires at the end of an experiment. Only a limited number of studies viewed trust as a dynamic variable that can strengthen or decay over time.

    Method

    Seventy-five participants took part in an aided memory recognition task. In the task, participants viewed a series of images and later performed 40 trials of a recognition task, identifying a target image presented alongside a distractor. In each trial, participants performed the initial recognition by themselves, received a recommendation from an automated decision aid, and performed the final recognition. After each trial, participants reported their trust on a visual analog scale.

    Results

    Outcome bias and the contrast effect significantly influence human operators’ trust adjustments. An automation failure leads to a larger trust decrement if the final outcome is undesirable, and to a marginally larger trust decrement if the human operator succeeds at the task on their own. An automation success engenders a greater trust increment if the human operator fails the task. Additionally, automation failures have a larger effect on trust adjustment than automation successes.

    Conclusion

    Human operators adjust their trust in automation as a result of their moment-to-moment interaction with automation. Their trust adjustments are significantly influenced by decision-making heuristics/biases.

    Application

    Understanding the trust adjustment process enables accurate prediction of operators’ moment-to-moment trust in automation and informs the design of trust-aware adaptive automation (a minimal illustrative sketch of such an update rule appears after this list).

     
  5. Trust in automation has been studied mainly from the cognitive perspective, though some researchers have shown that trust is also influenced by emotion. It is therefore essential to investigate the relationships between emotions and trust. In this study, we explored the pattern of 19 anticipated emotions associated with two levels of trust (low vs. high) elicited by two levels of autonomous vehicle (AV) performance (failure vs. non-failure) among 105 participants recruited via Amazon Mechanical Turk (AMT). Trust was assessed at three layers: dispositional, initial learned, and situational trust. The study was designed to measure how emotions are affected by low and high levels of trust. Situational trust was significantly correlated with emotions, such that a high level of trust significantly improved participants’ positive emotions, and vice versa. We also identified the underlying factors of emotions associated with situational trust. Our results offer important implications regarding anticipated emotions associated with trust in AVs.
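The trust-adjustment findings summarized in item 4 above describe an asymmetric, outcome-dependent process: automation failures hurt trust more than successes help it, and the size of each adjustment depends on the final task outcome. The Python sketch below is purely illustrative: the function name, default weights, and update form are assumptions chosen for demonstration, not a model or values estimated in any of the studies listed here.

# Illustrative only: a toy moment-to-moment trust-update rule loosely inspired by
# the findings in item 4 (failures reduce trust more than successes raise it, and
# the final task outcome modulates the size of the adjustment). All names and
# weights are assumptions for illustration, not reported values.
def update_trust(trust, aid_correct, final_outcome_good,
                 gain=0.05, loss=0.15, outcome_penalty=0.05):
    """Return updated trust, clipped to [0, 1], after one trial."""
    if aid_correct:
        # Automation success: modest trust increment.
        delta = gain
    else:
        # Automation failure: larger trust decrement (failures weigh more).
        delta = -loss
        if not final_outcome_good:
            # Outcome bias: the decrement grows when the final outcome is bad.
            delta -= outcome_penalty
    return max(0.0, min(1.0, trust + delta))

trust = 0.70
for aid_ok, outcome_ok in [(True, True), (False, False), (False, True)]:
    trust = update_trust(trust, aid_ok, outcome_ok)
    print(f"aid_ok={aid_ok}, outcome_ok={outcome_ok} -> trust={trust:.2f}")

Under these assumptions, a trust-aware adaptive system could, for example, increase transparency or hand control back to the operator whenever the estimated trust drifts outside a calibrated band.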