Title: Inclusion in human–machine interactions
Human–machine interactions (HMIs) describe how humans engage with various systems, including those that are smart, autonomous, or both. Most HMIs allow the human to control the machine (an instrument panel), allow the machine to obtain data (a heart monitor), or do both (a virtual reality setup). HMIs may be placed in three broad classes. In one class, the individual is active in the interaction—that is, the individual is the user or purchaser of a technology such as an automobile. In another class, the user is passive but consenting in the interaction—that is, the interaction occurs with their consent, such as the use of devices for medical diagnosis. There is also a class in which the user is passive and nonconsenting in the interaction, such as the use of facial recognition for law enforcement purposes.
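The three classes described above can be sketched as a small taxonomy. This is only an illustrative encoding of the article's classification; the identifier names and example mappings are ours, not terminology from the article.

```python
from enum import Enum

class HMIClass(Enum):
    """Three broad classes of human-machine interaction described in the article."""
    ACTIVE = "individual actively uses or purchases the technology"
    PASSIVE_CONSENTING = "individual is passive but consents to the interaction"
    PASSIVE_NONCONSENTING = "individual is passive and has not consented"

# Illustrative mappings drawn from the article's own examples.
EXAMPLES = {
    "automobile": HMIClass.ACTIVE,
    "medical diagnostic device": HMIClass.PASSIVE_CONSENTING,
    "facial recognition for law enforcement": HMIClass.PASSIVE_NONCONSENTING,
}
```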
Award ID(s): 1836952
PAR ID: 10344295
Journal Name: Science
Volume: 375
Issue: 6577
ISSN: 0036-8075
Page Range / eLocation ID: 149 to 150
Sponsoring Org: National Science Foundation
More Like this
  1. Innovative human–machine interfaces (HMIs) have attracted increasing attention in the fields of system control and assistive devices for disabled people. Conventional HMIs designed around physical movement or spoken communication are neither effective for nor applicable to severely disabled users. Here, a breath-driven triboelectric sensor is reported, consisting of a soft fixator and two circular-shaped triboelectric nanogenerators (TENGs), for self-powered respiratory monitoring and smart system control. The device effectively detects breath variations and generates responsive electrical signals for different breath patterns without affecting normal respiration. A breathing-driven HMI system is demonstrated that lets severely disabled people control household electrical appliances, along with an intelligent respiration-monitoring system for emergency alarms. The new system offers high sensitivity, good stability, low cost, and ease of use. This work not only expands the use of TENGs in self-powered sensors but also opens a new avenue for developing assistive devices for disabled people through innovation in advanced HMIs.
  2.
    This work presents a novel prototype autonomous vehicle (AV) human-machine interface (HMI) in virtual reality (VR) that places a human-like visual embodiment in the driver's seat of an AV to communicate AV intent to pedestrians in a crosswalk scenario. There is currently a gap in understanding the use of virtual humans in AV HMIs for pedestrian crossing, despite the demonstrated efficacy of human-like interfaces in improving human-machine relationships. We conduct a 3x2 within-subjects experiment in VR using our prototype to assess the effects of a virtual-human visual embodiment AV HMI on pedestrian crossing behavior and experience. In the experiment, participants walk across a virtual crosswalk in front of an AV. We collected how long they took to decide to cross and how long it took them to reach the other side, in addition to their subjective preferences and feelings of safety. Of 26 participants, 25 preferred the condition with the most anthropomorphic features. An intermediate condition, in which a human-like virtual driver was present but did not exhibit any behaviors, was least preferred and also had a significant effect on time to decide. This work contributes the first empirical work on using human-like visual embodiments for AV HMIs.
  3. The awareness of individuals' biological status is critical for creating interactive and adaptive environments that can actively assist users in achieving optimal outcomes. Accordingly, specialized human–machine interfaces—equipped with bioperception and interpretation capabilities—are required. To this end, we devised a multimodal cryptographic bio-human–machine interface (CB-HMI), which seamlessly translates the user's touch-based entries into encrypted biochemical, biophysical, and biometric indices. As its central component, the CB-HMI features thin hydrogel-coated chemical sensors and inference algorithms to noninvasively and inconspicuously acquire biochemical indices such as circulating molecules that partition onto the skin (here, ethanol and acetaminophen). Additionally, the CB-HMI hosts physical sensors and associated algorithms to simultaneously acquire the user's heart rate, blood oxygen level, and fingerprint minutiae pattern. Supported by human subject studies, we demonstrated the CB-HMI's ability to acquire physiologically relevant readouts of the target bioindices and to identify the user and biometrically encrypt/decrypt those indices in situ (leveraging the fingerprint feature). By upgrading common surrounding objects with the CB-HMI, we created interactive solutions for driving safety and medication use. Specifically, we demonstrated a vehicle-activation system and a medication-dispensing system, where the integrated CB-HMI uniquely enabled user bioauthentication (on the basis of the user's biological state and identity) prior to rendering the intended services. Harnessing the levels of bioperception achieved by the CB-HMI and other intelligent HMIs, we can equip our surroundings with a comprehensive and deep awareness of individuals' psychophysiological state and needs.
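The abstract does not specify the encryption scheme, so the following is only a minimal stand-in sketch of the idea of fingerprint-keyed symmetric encryption of sensor indices: a key is derived by hashing a hypothetical minutiae template, and readings are XORed with a hash-based keystream. This is an illustrative assumption, not the CB-HMI's actual method; a real system would use an authenticated cipher and fuzzy extraction of stable keys from noisy fingerprint scans.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Produce n pseudorandom bytes by hashing key || counter blocks."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def crypt_indices(minutiae: bytes, data: bytes) -> bytes:
    """XOR data with a keystream derived from the fingerprint template.

    XOR is its own inverse, so the same call encrypts and decrypts.
    """
    key = hashlib.sha256(minutiae).digest()
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Hypothetical template and readout, for illustration only.
template = b"example-minutiae-template"
reading = b"ethanol=0.00;hr=62;spo2=98"
ciphertext = crypt_indices(template, reading)
recovered = crypt_indices(template, ciphertext)
```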
  4. Background: Human–human (HH) interaction mediated by machines (e.g., robots or passive sensorized devices), which we call human–machine–human (HMH) interaction, has been studied with increasing interest in the last decade. The use of machines allows the implementation of different forms of audiovisual and/or physical interaction in dyadic tasks. HMH interaction between two partners can improve the dyad's ability to accomplish a joint motor task (task performance) beyond either partner's ability to perform the task solo. It can also be used to more efficiently train an individual to improve their solo task performance (individual motor learning). We review recent research on the impact of HMH interaction on task performance and individual motor learning in the context of motor control and rehabilitation, and we propose future research directions in this area.
Methods: A systematic search was performed on the Scopus, IEEE Xplore, and PubMed databases. The search query was designed to find studies that involve HMH interaction in motor control and rehabilitation settings. Studies that do not investigate the effect of changing the interaction conditions were filtered out. Thirty-one studies met our inclusion criteria and were used in the qualitative synthesis.
Results: Studies are analyzed based on their results related to the effects of interaction type (e.g., audiovisual communication and/or physical interaction), interaction mode (collaborative, cooperative, co-active, and competitive), and partner characteristics. Visuo-physical interaction generally results in better dyadic task performance than visual interaction alone. In cases where the physical interaction between humans is described by a spring, there are conflicting results as to the effect of the stiffness of the spring. In terms of partner characteristics, having a more skilled partner improves dyadic task performance more than having a less skilled partner. However, conflicting results were observed in terms of individual motor learning.
Conclusions: Although it is difficult to draw clear conclusions as to which interaction type, mode, or partner characteristic may lead to optimal task performance or individual motor learning, these results show the possibility of improved outcomes through HMH interaction. Future work that focuses on selecting optimal, personalized interaction conditions and exploring their impact in rehabilitation settings may facilitate the transition of HMH training protocols to clinical implementations.
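Where the review describes the physical channel between partners as a spring, the coupling can be sketched as a virtual spring pulling each partner's cursor toward the other's, with the stiffness parameter being the quantity the cited studies vary. The function and variable names below are ours, for illustration under that assumption.

```python
def spring_coupling_force(x_self: float, x_partner: float, stiffness: float) -> float:
    """Force (N) rendered to one partner by a virtual spring linking the two
    partners' positions (m). A stiffer spring transmits the partner's motion
    more strongly; stiffness = 0 decouples the dyad entirely."""
    return stiffness * (x_partner - x_self)

# Example: partner's cursor is 0.02 m ahead; with stiffness 100 N/m the
# rendered force pulls this partner toward the partner's position.
force = spring_coupling_force(x_self=0.10, x_partner=0.12, stiffness=100.0)
```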
  5. Conventional bulky and rigid electronics prevent compliant interfacing with soft human skin for health monitoring and human-machine interaction because of incompatible mechanical characteristics. To overcome these limitations, soft skin-mountable electronics with superior mechanical softness, flexibility, and stretchability provide an effective platform for intimate interaction with humans. In addition, soft electronics offer comfort when worn on the soft, curvilinear, and dynamic human skin. In this review, recent advances in soft electronics as health monitors and human-machine interfaces (HMIs) are briefly discussed. Strategies to achieve softness, including structural designs and material innovations, and approaches to optimize the interface between human skin and soft electronics are briefly reviewed. The characteristics and performance of soft electronic devices for health monitoring, including temperature sensors, pressure sensors for pulse monitoring, pulse oximeters, electrophysiological sensors, and sweat sensors, exemplify their wide range of utility. Furthermore, we review soft devices for controlling prosthetic limbs, household objects, mobile machines, and virtual objects to highlight current and potential implementations of soft electronics across a broad range of HMI applications. This review concludes with a discussion of the current limitations and future opportunities of soft skin-mountable electronics.