
Title: Autonomous Vehicle Visual Embodiment for Pedestrian Interactions in Crossing Scenarios: Virtual Drivers in AVs for Pedestrian Crossing
This work presents a novel prototype autonomous vehicle (AV) human-machine interface (HMI) in virtual reality (VR) that uses a human-like visual embodiment in the driver's seat of an AV to communicate AV intent to pedestrians in a crosswalk scenario. Despite the demonstrated efficacy of human-like interfaces in improving human-machine relationships, there is currently a gap in understanding the use of virtual humans in AV HMIs for pedestrian crossing. We conduct a 3x2 within-subjects experiment in VR using our prototype to assess the effects of a virtual human visual embodiment AV HMI on pedestrian crossing behavior and experience. In the experiment, participants walked across a virtual crosswalk in front of an AV. We collected how long they took to decide to cross and how long they took to reach the other side, in addition to their subjective preferences and feelings of safety. Of 26 participants, 25 preferred the condition with the most anthropomorphic features. An intermediate condition, in which a human-like virtual driver was present but did not exhibit any behaviors, was least preferred and also had a significant effect on time to decide. This work contributes the first empirical study of human-like visual embodiments for AV HMIs.
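The two objective measures can be stated concretely. Below is a minimal sketch assuming per-trial event timestamps; the field names and anchor events (e.g., measuring both durations from the moment the AV yields) are illustrative assumptions, not the paper's published logging scheme.

```python
from dataclasses import dataclass

@dataclass
class CrossingTrial:
    """Timestamps (seconds) for one crossing trial; names and anchors are assumed."""
    av_stopped: float        # AV yields at the crosswalk
    first_step: float        # participant commits to crossing
    reached_far_side: float  # participant arrives at the opposite curb

    @property
    def time_to_decide(self) -> float:
        # Latency from the AV yielding to the participant's first step.
        return self.first_step - self.av_stopped

    @property
    def crossing_time(self) -> float:
        # Total time until the participant reaches the other side.
        return self.reached_far_side - self.av_stopped

trial = CrossingTrial(av_stopped=3.0, first_step=5.5, reached_far_side=12.0)
print(trial.time_to_decide, trial.crossing_time)  # 2.5 9.0 (example values)
```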
Authors:
Award ID(s):
1800961
Publication Date:
NSF-PAR ID:
10275677
Journal Name:
Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems
Page Range or eLocation-ID:
1 to 7
Sponsoring Org:
National Science Foundation
More Like this
  1. In this work, we investigate the influence that audio and visual feedback have on a manipulation task in virtual reality (VR). Without the tactile feedback of a controller, grasping virtual objects using one’s hands can result in slower interactions because it may be unclear to the user that a grasp has occurred. Providing alternative feedback, such as visual or audio cues, may lead to faster and more precise interactions, but might also affect user preference and perceived ownership of the virtual hands. In this study, we test four feedback conditions for virtual grasping. Three of the conditions provide feedback for when a grasp or release occurs, either visual, audio, or both, and one provides no feedback for these occurrences. We analyze the effect each feedback condition has on interaction performance, measure their effect on the perceived ownership of the virtual hands, and gauge user preference. In an experiment, users perform a pick-and-place task with each feedback condition. We found that audio feedback for grasping is preferred over visual feedback even though it seems to decrease grasping performance, and found that there were little to no differences in ownership between our conditions.
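A minimal sketch of how the four feedback conditions might be dispatched on grasp and release events is shown below; the Feedback enum and the flash_hand/play_click placeholders are hypothetical names, not the study's actual implementation.

```python
from enum import Enum, auto

class Feedback(Enum):
    NONE = auto()
    VISUAL = auto()
    AUDIO = auto()
    BOTH = auto()

def on_grasp_event(condition: Feedback, event: str) -> list[str]:
    """Dispatch cues for a 'grasp' or 'release' event under one condition.

    flash_hand and play_click stand in for whatever engine calls would
    render a brief hand highlight or a confirmation sound; both are
    placeholders, not real API calls.
    """
    cues = []
    if condition in (Feedback.VISUAL, Feedback.BOTH):
        cues.append(f"flash_hand({event})")  # e.g., briefly tint the virtual hand
    if condition in (Feedback.AUDIO, Feedback.BOTH):
        cues.append(f"play_click({event})")  # e.g., a short grasp/release click
    return cues

print(on_grasp_event(Feedback.BOTH, "grasp"))  # ['flash_hand(grasp)', 'play_click(grasp)']
```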
  2. Accessible pedestrian signals were proposed as a means to achieve, for the visually impaired, the same level of service set forth by the Americans with Disabilities Act. One of the major issues of existing accessible pedestrian signals is the failure to deliver adequate crossing information to the visually impaired. This article presents a mobile-based accessible pedestrian signal application, namely, Virtual Guide Dog. Integrating intersection information and the onboard sensors (e.g., GPS, compass, accelerometer, and gyroscope) of modern smartphones, the Virtual Guide Dog application can notify the visually impaired of (1) the close proximity of an intersection and (2) the street information for crossing. By employing a screen-tapping interface, Virtual Guide Dog can remotely place a pedestrian crossing call to the controller, without the need to use a pushbutton. In addition, Virtual Guide Dog informs visually impaired users of the start of a crossing phase using text-to-speech technology. The proof-of-concept test shows that Virtual Guide Dog keeps users informed about the remaining distance as they approach the intersection. It was also found that the GPS-only mode is accompanied by greater distance deviation compared to the mode jointly operating with both GPS and cellular positioning.
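The proximity notification can be sketched as a great-circle distance check between the phone's GPS fix and a known intersection. This is a minimal sketch; the 30 m announcement radius, the function names, and the example coordinates are assumptions for illustration, not Virtual Guide Dog's actual parameters.

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two GPS fixes."""
    R = 6371000.0  # mean Earth radius, meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

ANNOUNCE_RADIUS_M = 30.0  # hypothetical threshold for "close proximity"

def proximity_update(user_fix: tuple, intersection: dict) -> str | None:
    """Return a text-to-speech-ready message once the user is near the intersection."""
    d = haversine_m(user_fix[0], user_fix[1], intersection["lat"], intersection["lon"])
    if d <= ANNOUNCE_RADIUS_M:
        return f"Approaching {intersection['name']}: {d:.0f} meters remaining."
    return None  # too far away; stay silent

# Example: a fix roughly 20 m south of a hypothetical intersection.
print(proximity_update((40.74436, -73.99032),
                       {"name": "5th Ave and W 23rd St", "lat": 40.74454, "lon": -73.99032}))
```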
  3. Microassembly systems utilizing precision robotics have long been used for realizing three-dimensional microstructures such as microsystems and microrobots. Prior to assembly, microscale components are fabricated using micro-electromechanical-system (MEMS) technology. The microassembly system then directs a microgripper through a series of automated or human-controlled pick-and-place operations. In this paper, we describe a novel custom microassembly system, named NEXUS, that can be used to prototype MEMS microrobots. The NEXUS integrates multi-degrees-of-freedom (DOF) precision positioners, microscope computer vision, and microscale process tools such as a microgripper and vacuum tip. A semi-autonomous human–machine interface (HMI) was programmed to allow the operator to interact with the microassembly system. The NEXUS human–machine interface includes multiple functions, such as positioning, target detection, visual servoing, and inspection. The microassembly system's HMI was used by operators to assemble various three-dimensional microrobots such as the Solarpede, a novel light-powered stick-and-slip mobile microcrawler. Experimental results are reported in this paper to evaluate the system's semi-autonomous capabilities in terms of assembly rate and yield and compare them to purely teleoperated assembly performance. Results show that the semi-automated capabilities of the microassembly system's HMI offer a more consistent assembly rate of microrobot components and are less reliant on the operator's experience and skill.
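Visual servoing in such an HMI closes the loop between a vision-detected feature and a commanded stage motion. Here is a minimal proportional sketch; the gain, the pixel-to-micron calibration, and all names are illustrative assumptions rather than NEXUS's published controller.

```python
import numpy as np

def visual_servo_step(target_px, feature_px, pixels_per_micron: float, gain: float = 0.5):
    """One proportional visual-servoing update.

    Computes the pixel-space error between where the microgripper should be
    (target_px) and where computer vision detected it (feature_px), and
    converts it to a stage correction in microns. All constants are assumed.
    """
    error_px = np.asarray(target_px, dtype=float) - np.asarray(feature_px, dtype=float)
    # Scale the error down (gain < 1 damps overshoot) and convert px -> microns.
    return gain * error_px / pixels_per_micron

# Example: gripper detected 12 px right of and 8 px below the target,
# with an assumed calibration of 2 px per micron.
print(visual_servo_step((320, 240), (332, 248), pixels_per_micron=2.0))  # [-3. -2.]
```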
  4. Redirected and amplified head movements have the potential to provide more natural interaction with virtual environments (VEs) than using controller-based input, which causes large discrepancies between visual and vestibular self-motion cues and leads to increased VR sickness. However, such amplified head movements may also exacerbate VR sickness symptoms over no amplification. Several general methods have been introduced to reduce VR sickness for controller-based input inside a VE, including a popular vignetting method that gradually reduces the field of view. In this paper, we investigate the use of vignetting to reduce VR sickness when using amplified head rotations instead of controller-based input. We also investigate whether the induced VR sickness is a result of the user’s head acceleration or velocity by introducing two different modes of vignetting, one triggered by acceleration and the other by velocity. Our dependent measures were pre- and post-VR-sickness questionnaires as well as estimated discomfort levels that were assessed each minute of the experiment. Our results show interesting effects between a baseline condition without vignetting and the two vignetting methods, generally indicating that the vignetting methods did not succeed in reducing VR sickness for most of the participants and, instead, led to a significant increase. We discuss the results and potential explanations of our findings.
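Both trigger modes can be expressed as a single mapping from a head-motion signal to a field-of-view restriction. The sketch below is one plausible formulation under assumed thresholds and gains; the abstract does not specify these constants.

```python
def vignette_fov(base_fov_deg: float, signal: float, threshold: float,
                 max_restriction_deg: float, gain: float) -> float:
    """Restrict the field of view once a head-motion signal exceeds a threshold.

    `signal` is either angular velocity (deg/s) or angular acceleration
    (deg/s^2), matching the two trigger modes described above; the threshold,
    gain, and clamp values here are illustrative assumptions.
    """
    excess = max(0.0, abs(signal) - threshold)          # motion beyond the comfort band
    restriction = min(max_restriction_deg, gain * excess)  # clamp how far the FOV closes
    return base_fov_deg - restriction

# Velocity-triggered mode: 90 deg base FOV, vignetting kicks in above 30 deg/s.
print(vignette_fov(90.0, signal=75.0, threshold=30.0,
                   max_restriction_deg=40.0, gain=0.5))  # 67.5
```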
  5. Virtual reality games have grown rapidly in popularity since the first consumer VR head-mounted displays were released in 2016; however, comparatively little research has explored how this new medium impacts the experience of players. In this paper, we present a study exploring how user experience changes when playing Minecraft on the desktop and in immersive virtual reality. Fourteen players completed six 45-minute sessions: three played on the desktop and three in VR. The Gaming Experience Questionnaire, the i-Group presence questionnaire, and the Simulator Sickness Questionnaire were administered after each session, and players were interviewed at the end of the experiment. Participants strongly preferred playing Minecraft in VR, despite frustrations with using teleportation as a travel technique and feelings of simulator sickness. Players enjoyed using motion controls, but still continued to use indirect input under certain circumstances. This did not appear to negatively impact feelings of presence. We conclude with four lessons for game developers interested in porting their games to virtual reality.