Human–Machine Interfaces (HMIs) for automated vehicles (AVs) are typically divided into two categories: internal HMIs for interactions within the vehicle, and external HMIs for communication with other road users. In this work, we examine the prospects of bridging these two seemingly distinct domains. Through a participatory workshop with automotive user interface researchers and practitioners, we facilitated a critical exploration of holistic HMI design by having workshop participants collaboratively develop interaction scenarios involving AVs, in-vehicle users, and external road users. The discussion offers insights into the escalation of interface elements as an HMI design strategy, the direct interactions between different users, and an expanded understanding of holistic HMI design. This work reflects a collaborative effort to understand the practical aspects of this holistic design approach, offering new perspectives and encouraging further investigation into this underexplored aspect of automotive user interfaces.
Inclusion in human–machine interactions
Human–machine interactions (HMIs) describe how humans engage with various systems, including those that are smart, autonomous, or both. Most HMIs allow the human to control the machine (an instrument panel), allow the machine to obtain data about the human (a heart monitor), or do both (a virtual reality setup). HMIs may be placed in three broad classes. In one class, the individual is active in the interaction—that is, the individual is the user or purchaser of a technology such as an automobile. In another class, the user is passive but consenting in the interaction—that is, the interaction occurs with their consent, such as the use of devices for medical diagnosis. There is also a class in which the user is passive and nonconsenting in the interaction, such as the use of facial recognition for law enforcement purposes.
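The passage defines two orthogonal distinctions: the direction of data flow between human and machine, and the user's role in the interaction. As an illustrative aid only, the minimal sketch below encodes that taxonomy in Python; all type and field names are ours, not the article's.

```python
from dataclasses import dataclass
from enum import Enum, auto

class DataFlow(Enum):
    HUMAN_CONTROLS_MACHINE = auto()   # e.g., an instrument panel
    MACHINE_SENSES_HUMAN = auto()     # e.g., a heart monitor
    BIDIRECTIONAL = auto()            # e.g., a virtual reality setup

class UserRole(Enum):
    ACTIVE = auto()                   # user/purchaser of the technology
    PASSIVE_CONSENTING = auto()       # e.g., medical diagnosis devices
    PASSIVE_NONCONSENTING = auto()    # e.g., law-enforcement face recognition

@dataclass
class HMIInteraction:
    name: str
    flow: DataFlow
    role: UserRole

# Examples drawn from the passage above
examples = [
    HMIInteraction("car instrument panel", DataFlow.HUMAN_CONTROLS_MACHINE, UserRole.ACTIVE),
    HMIInteraction("hospital heart monitor", DataFlow.MACHINE_SENSES_HUMAN, UserRole.PASSIVE_CONSENTING),
    HMIInteraction("street facial recognition", DataFlow.MACHINE_SENSES_HUMAN, UserRole.PASSIVE_NONCONSENTING),
]
for ex in examples:
    print(f"{ex.name}: {ex.flow.name}, {ex.role.name}")
```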
- Award ID(s): 1836952
- PAR ID: 10344295
- Date Published:
- Journal Name: Science
- Volume: 375
- Issue: 6577
- ISSN: 0036-8075
- Page Range / eLocation ID: 149–150
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Abstract: Innovative human–machine interfaces (HMIs) have attracted increasing attention in the field of system control and assistive devices for disabled people. Conventional HMIs designed around physical movement or language communication are not effective for, or applicable to, severely disabled users. Here, a breath-driven triboelectric sensor is reported, consisting of a soft fixator and two circular-shaped triboelectric nanogenerators (TENGs), for self-powered respiratory monitoring and smart system control. The sensor device effectively detects breath variation and generates responsive electrical signals for different breath patterns without affecting normal respiration. A breathing-driven HMI system is demonstrated that enables severely disabled people to control electrical household appliances, together with an intelligent respiration monitoring system for emergency alarms. The new system provides high sensitivity, good stability, low cost, and ease of use. This work not only expands the development of TENGs in self-powered sensors but also opens a new avenue for developing assistive devices for disabled people through innovation in advanced HMIs.
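The abstract does not specify how breath patterns are classified, so the following is only a rough sketch of one plausible approach: assuming the TENG produces one voltage peak per breath, patterns can be separated by breathing rate. The function names and thresholds are hypothetical, and a real system would be calibrated per user.

```python
import numpy as np

def detect_breath_peaks(voltage, threshold=0.5):
    """Indices where the (hypothetical) TENG voltage crosses the
    threshold upward; assumes one upward crossing per breath."""
    above = voltage > threshold
    return np.flatnonzero(~above[:-1] & above[1:]) + 1  # rising edges

def classify_breathing(voltage, fs):
    """Coarse pattern label from breathing rate (illustrative thresholds)."""
    peaks = detect_breath_peaks(voltage)
    rate_bpm = 60.0 * len(peaks) / (len(voltage) / fs)
    if rate_bpm < 6:
        return "apnea/alarm"   # too few breaths: raise an alert
    if rate_bpm > 25:
        return "rapid/alarm"   # abnormally fast breathing
    return "normal"

# usage: 10 s of synthetic signal at 100 Hz, 0.25 Hz breathing (~15 bpm)
fs = 100
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 0.25 * t)
print(classify_breathing(signal, fs))  # -> "normal"
```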
-
Medical robotics has revolutionized healthcare by enhancing precision, adaptability, and clinical outcomes. The field has evolved further with the advent of human–machine interfaces (HMIs), which facilitate seamless interaction between users and robotic systems. However, traditional HMIs rely on rigid sensing components and bulky wiring, causing mechanical mismatches that limit user comfort, accuracy, and wearability. Flexible sensors offer a transformative solution by enabling the integration of adaptable sensing technology into HMIs, enhancing overall system functionality. Further integrating artificial intelligence (AI) into these systems addresses key limitations of conventional HMIs, including challenges in complex data interpretation and multimodal sensing integration. In this review, we systematically explore the convergence of flexible sensor-based HMIs and AI for medical robotics. Specifically, we analyze core flexible sensing mechanisms, AI-driven advancements in healthcare, and applications in prosthetics, exoskeletons, and surgical robotics. By bridging the gap between flexible sensing technologies and AI-driven intelligence, this review presents a roadmap for developing next-generation smart medical robotic systems, advancing personalized healthcare and adaptive human–robot interactions.
-
Abstract: Wearable electronics revolutionize human–machine interfaces (HMIs) for robotic or prosthetic control. Yet the challenge lies in eliminating conventional rigid and impermeable electronic components, such as batteries, while preserving the comfort and usability of HMIs over prolonged periods. Herein, a self-powered, flexible, and breathable HMI is developed based on piezoelectric sensors. The interface is designed to accurately monitor subtle changes in body and muscle movements, facilitating effective communication with and control of robotic prosthetic hands for various applications. Utilizing engineered porous structures within the polymeric material, the piezoelectric sensor demonstrates significantly enhanced sensitivity, flexibility, and permeability, underscoring its suitability for HMI applications. Furthermore, the developed control algorithm enables a single sensor to comprehensively control robotic hands. By translating piezoelectric signals generated from bicep muscle movements into Morse code, this HMI also serves as an efficient communication device. The process is demonstrated by executing the daily task of "drinking a cup of water" using the developed HMI to control a human-interactive robotic prosthetic hand through detection of bicep muscle movements. Such HMIs pave the way toward self-powered and comfortable biomimetic systems, making a significant contribution to the future evolution of prosthetics.
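The paper's Morse-code pipeline is not detailed in the abstract; the sketch below is a minimal illustration of the general idea, assuming a front end has already segmented the piezoelectric signal into muscle-activation pulse durations. The 0.3 s dot/dash cutoff and the letter-gap rule are our assumptions, not values from the paper.

```python
# Hedged sketch: decoding segmented muscle-activation pulses into text.
MORSE = {".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
         "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
         "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
         ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
         "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
         "--..": "Z"}

def decode_pulses(pulses, dot_max=0.3, letter_gap=0.7):
    """pulses: list of (duration_s, gap_after_s) tuples from the sensor.
    Short pulses become dots, long ones dashes; a long pause ends a letter.
    Thresholds are illustrative, not taken from the paper."""
    text, symbol = [], ""
    for duration, gap in pulses:
        symbol += "." if duration <= dot_max else "-"
        if gap >= letter_gap:
            text.append(MORSE.get(symbol, "?"))
            symbol = ""
    if symbol:
        text.append(MORSE.get(symbol, "?"))
    return "".join(text)

# usage: four short pulses ("H"), then two short pulses ("I")
pulses = [(0.1, 0.2), (0.1, 0.2), (0.1, 0.2), (0.1, 0.8),
          (0.1, 0.2), (0.1, 0.8)]
print(decode_pulses(pulses))  # -> "HI"
```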
-
This work presents a novel prototype autonomous vehicle (AV) human–machine interface (HMI) in virtual reality (VR) that places a human-like visual embodiment in the driver's seat of an AV to communicate AV intent to pedestrians in a crosswalk scenario. There is currently a gap in understanding the use of virtual humans in AV HMIs for pedestrian crossing, despite the demonstrated efficacy of human-like interfaces in improving human–machine relationships. We conduct a 3×2 within-subjects experiment in VR using our prototype to assess the effects of a virtual-human visual embodiment AV HMI on pedestrian crossing behavior and experience. In the experiment, participants walk across a virtual crosswalk in front of an AV; we collect how long they take to decide to cross and how long they take to reach the other side, in addition to their subjective preferences and feelings of safety. Of 26 participants, 25 preferred the condition with the most anthropomorphic features. An intermediate condition, in which a human-like virtual driver was present but did not exhibit any behaviors, was least preferred and also had a significant effect on time to decide. This work contributes the first empirical work on using human-like visual embodiments for AV HMIs.