In the future, roads will host a complex mix of automated and manually operated vehicles, along with vulnerable road users. However, most automotive user interfaces and human factors research focus on single-agent studies, where one human interacts with one vehicle. Only a few studies incorporate multi-agent setups. This workshop aims to (1) examine the current state of multi-agent research in the automotive domain, (2) serve as a platform for discussion toward more realistic multi-agent setups, and (3) discuss methods and practices to conduct such multi-agent research. The goal is to synthesize the insights from the AutoUI community, creating the foundation for advancing multi-agent traffic interaction research.
Exploring Holistic HMI Design for Automated Vehicles: Insights from a Participatory Workshop to Bridge In-Vehicle and External Communication
Human–Machine Interfaces (HMIs) for automated vehicles (AVs) are typically divided into two categories: internal HMIs for interactions within the vehicle, and external HMIs for communication with other road users. In this work, we examine the prospects of bridging these two seemingly distinct domains. Through a participatory workshop with automotive user interface researchers and practitioners, we facilitated a critical exploration of holistic HMI design by having workshop participants collaboratively develop interaction scenarios involving AVs, in-vehicle users, and external road users. The discussion offers insights into the escalation of interface elements as an HMI design strategy, the direct interactions between different users, and an expanded understanding of holistic HMI design. This work reflects a collaborative effort to understand the practical aspects of this holistic design approach, offering new perspectives and encouraging further investigation into this underexplored aspect of automotive user interfaces.
- Award ID(s):
- 2212431
- PAR ID:
- 10656925
- Publisher / Repository:
- ACM
- Date Published:
- Page Range / eLocation ID:
- 1 to 9
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Self-driving vehicles are the latest innovation in improving personal mobility and road safety by removing arguably error-prone humans from driving-related tasks. Such advances can prove especially beneficial for people who are blind or have low vision and cannot legally operate conventional motor vehicles. Missing from the related literature, we argue, are studies that describe strategies for vehicle design for these persons. We present a case study of the participatory design of a prototype for a self-driving vehicle human-machine interface (HMI) for a graduate-level course on inclusive design and accessible technology. We reflect on the process of working alongside a co-designer, a person with a visual disability, to identify user needs, define design ideas, and produce a low-fidelity prototype for the HMI. This paper may benefit researchers interested in using a similar approach for designing accessible autonomous vehicle technology.

INTRODUCTION: The rise of autonomous vehicles (AVs) may prove to be one of the most significant innovations in personal mobility of the past century. Advances in automated vehicle technology, and advanced driver assistance systems (ADAS) specifically, may have a significant impact on road safety and a reduction in vehicle accidents (Brinkley et al., 2017; Dearen, 2018). According to the Department of Transportation (DoT), automated vehicles could help reduce road accidents caused by human error by as much as 94% (SAE International, n.d.). In addition to reducing traffic accidents and saving lives and property, autonomous vehicles may also prove to be of significant value to persons who cannot otherwise operate conventional motor vehicles. AVs may provide the necessary mobility, for instance, to help create new employment opportunities for nearly 40 million Americans with disabilities (Claypool et al., 2017; Guiding Eyes for the Blind, 2019). Advocates for the visually impaired specifically have expressed how "transformative" this technology can be for those who are blind or have significant low vision (Winter, 2015); persons who cannot otherwise legally operate a motor vehicle. While autonomous vehicles have the potential to break down transportation […]
-
The awareness of individuals’ biological status is critical for creating interactive and adaptive environments that can actively assist the users to achieve optimal outcomes. Accordingly, specialized human–machine interfaces—equipped with bioperception and interpretation capabilities—are required. To this end, we devised a multimodal cryptographic bio-human–machine interface (CB-HMI), which seamlessly translates the user’s touch-based entries into encrypted biochemical, biophysical, and biometric indices. As its central component, the CB-HMI features thin hydrogel-coated chemical sensors and inference algorithms to noninvasively and inconspicuously acquire biochemical indices such as circulating molecules that partition onto the skin (here, ethanol and acetaminophen). Additionally, the CB-HMI hosts physical sensors and associated algorithms to simultaneously acquire the user’s heart rate, blood oxygen level, and fingerprint minutiae pattern. Supported by human subject studies, we demonstrated the CB-HMI’s capability in terms of acquiring physiologically relevant readouts of target bioindices, as well as user-identifying and biometrically encrypting/decrypting these indices in situ (leveraging the fingerprint feature). By upgrading the common surrounding objects with the CB-HMI, we created interactive solutions for driving safety and medication use. Specifically, we demonstrated a vehicle-activation system and a medication-dispensing system, where the integrated CB-HMI uniquely enabled user bioauthentication (on the basis of the user’s biological state and identity) prior to rendering the intended services. Harnessing the levels of bioperception achieved by the CB-HMI and other intelligent HMIs, we can equip our surroundings with a comprehensive and deep awareness of individuals’ psychophysiological state and needs. (A minimal code sketch of the fingerprint-keyed encryption idea appears after this list.)
-
This work presents a novel prototype autonomous vehicle (AV) human-machine interface (HMI) in virtual reality (VR) that utilizes a human-like visual embodiment in the driver’s seat of an AV to communicate AV intent to pedestrians in a crosswalk scenario. There is currently a gap in understanding the use of virtual humans in AV HMIs for pedestrian crossing despite the demonstrated efficacy of human-like interfaces in improving human-machine relationships. We conduct a 3×2 within-subjects experiment in VR using our prototype to assess the effects of a virtual human visual embodiment AV HMI on pedestrian crossing behavior and experience. In the experiment participants walk across a virtual crosswalk in front of an AV. How long they took to decide to cross and how long it took for them to reach the other side were collected, in addition to their subjective preferences and feelings of safety. Of 26 participants, 25 preferred the condition with the most anthropomorphic features. An intermediate condition where a human-like virtual driver was present but did not exhibit any behaviors was least preferred and also had a significant effect on time to decide. This work contributes the first empirical work on using human-like visual embodiments for AV HMIs. (A sketch of per-condition aggregation for this kind of design appears after this list.)
-
Medical robotics has revolutionized healthcare by enhancing precision, adaptability, and clinical outcomes. This field has further evolved with the advent of human–machine interfaces (HMIs), which facilitate seamless interactions between users and robotic systems. However, traditional HMIs rely on rigid sensing components and bulky wiring, causing mechanical mismatches that limit user comfort, accuracy, and wearability. Flexible sensors offer a transformative solution by enabling the integration of adaptable sensing technology into HMIs, enhancing overall system functionality. Further integrating artificial intelligence (AI) into these systems addresses key limitations of conventional HMI, including challenges in complex data interpretations and multimodal sensing integration. In this review, we systematically explore the convergence of flexible sensor-based HMIs and AI for medical robotics. Specifically, we analyze core flexible sensing mechanisms, AI-driven advancements in healthcare, and applications in prosthetics, exoskeletons, and surgical robotics. By bridging the gap between flexible sensing technologies and AI-driven intelligence, this review presents a roadmap for developing next-generation smart medical robotic systems, advancing personalized healthcare and adaptive human-robot interactions.
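The CB-HMI abstract above describes a pipeline in which bioindices acquired at the moment of touch are biometrically encrypted using the fingerprint minutiae read during the same touch. As a rough illustration of that idea only (not the paper's actual implementation), the Python sketch below derives a symmetric key from a fingerprint template and encrypts a set of readings with it. The `cryptography` package, the plain-hash key derivation, and every name in the example are assumptions made to keep the sketch self-contained; a deployed system would need a fuzzy extractor to obtain a repeatable key from noisy minutiae.

```python
# Illustrative sketch (not the CB-HMI paper's implementation): encrypting
# touch-acquired bioindices under a key derived from a fingerprint template.
# Requires the third-party `cryptography` package. All names are hypothetical.
import base64
import hashlib
import json

from cryptography.fernet import Fernet


def key_from_minutiae(template: bytes) -> bytes:
    """Derive a Fernet key from a (stable) minutiae template.

    Real minutiae readings are noisy, so a deployed system would need a
    fuzzy extractor or secure sketch to get a repeatable key; a plain
    SHA-256 hash is used here only to keep the example self-contained.
    """
    digest = hashlib.sha256(template).digest()   # 32 bytes
    return base64.urlsafe_b64encode(digest)      # valid Fernet key format


def encrypt_bioindices(template: bytes, readings: dict) -> bytes:
    """Encrypt one touch event's readings under the user's fingerprint key."""
    return Fernet(key_from_minutiae(template)).encrypt(
        json.dumps(readings).encode())


def decrypt_bioindices(template: bytes, token: bytes) -> dict:
    """Decryption succeeds only with the same user's template."""
    return json.loads(
        Fernet(key_from_minutiae(template)).decrypt(token).decode())


if __name__ == "__main__":
    template = b"example-minutiae-template"      # hypothetical stable template
    readings = {"ethanol_mM": 0.4, "acetaminophen_uM": 12.0,
                "heart_rate_bpm": 72, "spo2_pct": 98}
    token = encrypt_bioindices(template, readings)
    print(decrypt_bioindices(template, token))
```

A wrong template yields a different key, so decryption fails, which mirrors the abstract's point that the indices are both user-identifying and only readable by their owner.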
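The VR crosswalk study above collects time-to-decide and time-to-cross per trial in a 3×2 within-subjects design. As a hypothetical illustration of how such trial logs might be aggregated, here is a minimal Python sketch; the condition labels and numbers are invented for the example, and a real analysis of a within-subjects design would use a repeated-measures test rather than the simple per-condition means shown here.

```python
# Hypothetical sketch: aggregating time-to-decide by condition from the trial
# logs of a 3x2 within-subjects pedestrian-crossing study. The condition
# names and values are invented, not taken from the paper.
from collections import defaultdict
from statistics import mean, stdev

# Each trial: (participant, hmi_condition, crossing_direction,
#              time_to_decide_s, time_to_cross_s)
trials = [
    ("p01", "behaving_driver", "near_to_far", 2.1, 6.8),
    ("p01", "static_driver",   "near_to_far", 3.4, 7.1),
    ("p01", "no_driver",       "near_to_far", 2.9, 6.9),
    ("p02", "behaving_driver", "far_to_near", 1.8, 6.5),
    ("p02", "static_driver",   "far_to_near", 3.1, 7.0),
    ("p02", "no_driver",       "far_to_near", 2.6, 6.7),
]

# Group time-to-decide by HMI condition, collapsing over crossing direction.
by_condition = defaultdict(list)
for _, condition, _, decide_s, _ in trials:
    by_condition[condition].append(decide_s)

for condition, values in sorted(by_condition.items()):
    print(f"{condition}: mean={mean(values):.2f}s  sd={stdev(values):.2f}s")
```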