The Effect of Augmented Reality Cues on Glance Behavior and Driver-Initiated Takeover on SAE Level 2 Automated-Driving
In this paper, we present a user study of an advanced driver-assistance system (ADAS) that uses augmented reality (AR) cues to highlight pedestrians and vehicles when approaching intersections of varying complexity. Our main goal is to understand the relationship between the presence or absence of AR cues, driver-initiated takeover rates, and glance behavior when using an SAE Level 2 automated vehicle. To that end, we carried out a user study with eight participants in a medium-fidelity driving simulator. Overall, we found that AR cues are a promising means of increasing system transparency, drivers’ situation awareness, and trust in the system. However, the dynamic allocation of visual attention during partially automated driving remains challenging for researchers, as much is still to be understood about when AR cues become a distractor rather than an attention guide.
- Award ID(s): 1816721
- PAR ID: 10283681
- Journal Name: Proceedings of the Human Factors and Ergonomics Society Annual Meeting
- ISSN: 2169-5067
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- As the automotive industry progresses toward the car of the future, we have seen increasing interest in using augmented reality (AR) head-up displays (HUDs) in driving. AR HUDs provide a fundamentally new driving experience in which drivers must respond both to the road and to the information presented by the system, creating the perfect atmosphere for potentially unsafe and distracting interfaces. As we begin fielding and designing new AR HUDs, the complexities of interface design and their impact on driver performance must be better understood before AR HUDs can be broadly and safely incorporated into vehicles. Nevertheless, existing methods for assessing the usefulness of computer-based user interfaces may not be sufficiently rich to measure the overall impact of AR HUD interfaces on human performance. Therefore, in my Ph.D. research, I focus on developing and testing methods to evaluate the effects of AR HUDs on driver distraction and performance. My primary goal is to assess drivers' glance allocation and visual capabilities when using AR HUDs and to apply this knowledge to inform new methods of AR HUD assessment that account for inattentional blindness and cognitive tunneling.
- In a future of pervasive augmented reality (AR), AR systems will need to efficiently draw or guide the user's attention to visual points of interest in their physical-virtual environment. Since AR imagery is overlaid on top of the user's view of the physical environment, these attention-guidance techniques must compete not only with other virtual imagery but also with distracting or attention-grabbing features in the user's physical environment. Because of the wide range of physical-virtual environments that pervasive AR users will find themselves in, it is difficult to design visual cues that “pop out” to the user without performing a visual analysis of the user's environment and changing the appearance of the cue to stand out from its surroundings. In this paper, we present an initial investigation into the potential uses of dichoptic visual cues for optical see-through AR displays, specifically cues that involve a difference in hue, saturation, or value between the user's eyes. These types of cues have been shown to be preattentively processed by the user when presented on other stereoscopic displays, and they may also be an effective method of drawing user attention on optical see-through AR displays. We present two user studies: one that evaluates the saliency of dichoptic visual cues on optical see-through displays, and one that evaluates their subjective qualities. Our results suggest that hue-based dichoptic cues, or “Forbidden Colors,” may be particularly effective for these purposes, achieving significantly lower error rates in a pop-out task compared to value-based and saturation-based cues.
- Objective: We controlled participants’ glance behavior while they used head-down displays (HDDs) and head-up displays (HUDs) in order to isolate driving-behavior changes attributable to display type across different driving environments. Background: HUD technology has recently been incorporated into vehicles, allowing drivers, in theory, to gather display information without moving their eyes away from the road. Previous studies comparing the impact of HUDs and traditional displays on human performance show differences in both drivers’ visual attention and driving performance. Yet no studies have isolated glance behavior from driving behavior, which limits our ability to understand the cause of these differences and their impact on display design. Method: We developed a novel method to control visual attention in a driving simulator. Twenty experienced drivers sustained visual attention to in-vehicle HDDs and HUDs while driving in both a simple, straight, empty roadway environment and a more realistic environment that included traffic and turns. Results: In the realistic environment, but not the simpler one, we found evidence of differing driving behaviors between display conditions, even though participants’ glance behavior was similar. Conclusion: The assumption that visual attention can be evaluated in the same way for different types of vehicle displays may therefore be inaccurate. Differences between driving environments call into question the validity of testing HUDs in simplistic driving environments. Application: As we move toward integrating HUD user interfaces into vehicles, it is important to develop new, sensitive assessment methods to ensure that HUD interfaces are indeed safe for driving.
- When navigating by car, developing robust mental representations (spatial knowledge) of the environment is crucial in situations where technology fails or where we need to find locations not included in a navigation system’s database. In this work, we present a study that examines how screen-relative and world-relative augmented reality (AR) head-up display interfaces affect drivers’ glance behavior and spatial knowledge acquisition. Results showed that both AR interfaces had a similar impact on the level of spatial knowledge acquired. However, eye-tracking analyses revealed fundamental differences in the way participants visually interacted with the different AR interfaces, with conformal graphics demanding more visual attention from drivers.