
Title: A touch-based multimodal and cryptographic bio-human–machine interface
The awareness of individuals’ biological status is critical for creating interactive and adaptive environments that can actively assist the users to achieve optimal outcomes. Accordingly, specialized human–machine interfaces—equipped with bioperception and interpretation capabilities—are required. To this end, we devised a multimodal cryptographic bio-human–machine interface (CB-HMI), which seamlessly translates the user’s touch-based entries into encrypted biochemical, biophysical, and biometric indices. As its central component, the CB-HMI features thin hydrogel-coated chemical sensors and inference algorithms to noninvasively and inconspicuously acquire biochemical indices such as circulating molecules that partition onto the skin (here, ethanol and acetaminophen). Additionally, the CB-HMI hosts physical sensors and associated algorithms to simultaneously acquire the user’s heart rate, blood oxygen level, and fingerprint minutiae pattern. Supported by human subject studies, we demonstrated the CB-HMI’s capability in terms of acquiring physiologically relevant readouts of target bioindices, as well as user-identifying and biometrically encrypting/decrypting these indices in situ (leveraging the fingerprint feature). By upgrading the common surrounding objects with the CB-HMI, we created interactive solutions for driving safety and medication use. Specifically, we demonstrated a vehicle-activation system and a medication-dispensing system, where the integrated CB-HMI uniquely enabled user bioauthentication (on the basis of the user’s biological state and identity) prior to rendering the intended services. Harnessing the levels of bioperception achieved by the CB-HMI and other intelligent HMIs, we can equip our surroundings with a comprehensive and deep awareness of individuals’ psychophysiological state and needs.
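The abstract does not spell out how the in-situ biometric encryption is implemented, so the following is only a minimal sketch of the general idea: derive a symmetric key from the user's fingerprint-minutiae template and use it to encrypt the touch-acquired indices, so that the same finger is needed both to decrypt them and to pass the bioauthentication gate. All names, thresholds, and the PBKDF2/Fernet choices are assumptions, and a practical system would additionally need a fuzzy extractor so that repeated fingerprint scans reproduce the same key.

# Hypothetical sketch (not the paper's implementation): biometrically keyed
# encryption of touch-acquired bioindices, plus a simple bioauthentication gate.
import base64
import hashlib
import json

from cryptography.fernet import Fernet  # third-party: pip install cryptography


def key_from_template(minutiae_template: bytes, salt: bytes) -> bytes:
    # Derive a 32-byte key from a (stabilized) minutiae template; base64 for Fernet.
    raw = hashlib.pbkdf2_hmac("sha256", minutiae_template, salt, 200_000, dklen=32)
    return base64.urlsafe_b64encode(raw)


def encrypt_readouts(readouts: dict, template: bytes, salt: bytes) -> bytes:
    # Encrypt the sensed indices (e.g., ethanol, acetaminophen, HR, SpO2) in situ.
    return Fernet(key_from_template(template, salt)).encrypt(json.dumps(readouts).encode())


def decrypt_readouts(token: bytes, template: bytes, salt: bytes) -> dict:
    # Decryption succeeds only with a key derived from a matching fingerprint.
    return json.loads(Fernet(key_from_template(template, salt)).decrypt(token))


# Toy usage: gate a service (e.g., vehicle activation) on the decrypted state.
salt = b"per-user-enrollment-salt"              # assumed stored at enrollment
template = b"stand-in minutiae feature bytes"   # assumed output of the fingerprint sensor
token = encrypt_readouts({"ethanol_mM": 0.4, "hr_bpm": 72, "spo2_pct": 98}, template, salt)
state = decrypt_readouts(token, template, salt)
if state["ethanol_mM"] < 1.0:                   # hypothetical safety threshold
    print("bioauthentication passed: enable ignition")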
Award ID(s): 1722972
PAR ID: 10351133
Author(s) / Creator(s): ; ; ; ; ; ; ; ; ; ; ; ;
Date Published:
Journal Name: Proceedings of the National Academy of Sciences
Volume: 119
Issue: 15
ISSN: 0027-8424
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Wearable electronics are revolutionizing human–machine interfaces (HMIs) for robotic and prosthetic control. Yet the challenge lies in eliminating conventional rigid and impermeable electronic components, such as batteries, while preserving the comfort and usability of HMIs over prolonged periods. Herein, a self-powered, flexible, and breathable HMI is developed based on piezoelectric sensors. This interface is designed to accurately monitor subtle changes in body and muscle movements, facilitating effective communication with and control of robotic prosthetic hands for various applications. Utilizing engineered porous structures within the polymeric material, the piezoelectric sensor demonstrates significantly enhanced sensitivity, flexibility, and permeability, highlighting its suitability for HMI applications. Furthermore, the developed control algorithm enables a single sensor to comprehensively control a robotic hand. By translating piezoelectric signals generated by biceps muscle movements into Morse code, this HMI serves as an efficient communication device. The approach is further demonstrated by executing the daily task of “drinking a cup of water,” in which the HMI controls a human-interactive robotic prosthetic hand through detection of biceps muscle movements. Such HMIs pave the way toward self-powered and comfortable biomimetic systems, making a significant contribution to the future evolution of prosthetics.
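A rough sketch of the single-sensor Morse-code idea described above: threshold the piezoelectric trace and map short and long biceps contractions to dots and dashes. The threshold, timing constants, and abbreviated code table are illustrative assumptions, not the paper's decoding algorithm.

# Hypothetical sketch: decoding one piezoelectric channel into Morse code by
# thresholding contraction bursts and classifying their durations. Values are toy.
MORSE = {".": "E", "-": "T", "...": "S", "---": "O", ".-": "A", "-...": "B"}  # abbreviated table


def burst_durations(signal, fs_hz, threshold=0.5):
    # Yield the duration (in seconds) of each supra-threshold burst in the trace.
    start = None
    for i, v in enumerate(signal):
        if v >= threshold and start is None:
            start = i
        elif v < threshold and start is not None:
            yield (i - start) / fs_hz
            start = None
    if start is not None:                      # burst running to the end of the trace
        yield (len(signal) - start) / fs_hz


def decode_letter(signal, fs_hz, dash_min_s=0.4):
    # Short bursts become dots, long bursts become dashes; look up the letter.
    symbols = "".join("." if d < dash_min_s else "-" for d in burst_durations(signal, fs_hz))
    return MORSE.get(symbols, "?")


# Toy usage: three 0.2 s contractions sampled at 100 Hz decode to "S".
trace = ([1.0] * 20 + [0.0] * 30) * 3
print(decode_letter(trace, fs_hz=100))         # -> S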
  2. Human–machine interfaces (HMIs) are a rapidly expanding area of research. Interestingly, the human user does not readily observe the interface between humans and machines. Instead, interactions between the machine and electrical signals from the user's body are obscured by complex control algorithms. The result is effectively a one-way street, wherein data is only transmitted from human to machine. Thus, a gap remains in the literature: how can information be effectively conveyed to the user to enable mutual understanding between humans and machines? This paper reviews recent advancements in biosignal-integrated wearable robotics, with a particular emphasis on “visualization”: the presentation of relevant data, statistics, and visual feedback to the user. The review covers various signals of interest, such as electroencephalograms and electromyograms, and explores novel sensor architectures and key materials. Recent developments in wearable robotics are examined from control and mechanical design perspectives. Additionally, current visualization methods are discussed and the field's future direction is outlined. While much of the HMI field focuses on biomedical and healthcare applications, such as rehabilitation of patients with spinal cord injury and stroke, this paper also covers less common applications in manufacturing, defense, and other domains.
  3. Identification is at the core of any authentication protocol, since the purpose of authentication is to verify the user’s identity. Efficiently establishing and verifying identity remains a major challenge. Recently, biometrics-based identification algorithms have gained popularity as a means of identifying individuals by their unique biological characteristics. In this paper, we propose a novel and efficient identification framework, ActID, which can identify a user based on his/her hand motion while walking. ActID not only selects a set of high-quality features using Optimal Feature Evaluation and Selection and Correlation-based Feature Selection algorithms but also includes a novel sliding-window-based voting classifier. It therefore achieves several important design goals for gait authentication on resource-constrained devices: lightweight, real-time classification; high identification accuracy; a minimum number of sensors; and a minimum amount of collected data. Performance evaluation shows that ActID is cost-effective and easily deployable, satisfies real-time requirements, and achieves a high identification accuracy of 100%.
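To make the sliding-window voting idea above concrete, the sketch below classifies each overlapping window of motion data and takes a majority vote over the per-window predictions. The window size, the toy per-window feature (a per-axis mean), and the kNN base classifier are illustrative assumptions, not the features or classifier used by ActID.

# Hypothetical sketch: sliding-window voting identification from motion data.
from collections import Counter

import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def window_features(stream, size=128, step=64):
    # Slide a fixed-size window (50% overlap) over the motion time series and
    # reduce each window to a toy feature vector (per-axis mean).
    for start in range(0, len(stream) - size + 1, step):
        yield stream[start:start + size].mean(axis=0)


def identify(clf, stream):
    # Majority vote over per-window predictions decides the walker's identity.
    votes = [clf.predict(w.reshape(1, -1))[0] for w in window_features(stream)]
    return Counter(votes).most_common(1)[0][0]


# Toy usage with synthetic 3-axis motion data for two enrolled users.
rng = np.random.default_rng(0)
enrolled = {"user_a": rng.normal(0.0, 1.0, (1000, 3)), "user_b": rng.normal(2.0, 1.0, (1000, 3))}
X = np.vstack([w for s in enrolled.values() for w in window_features(s)])
y = [name for name, s in enrolled.items() for _ in window_features(s)]
clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(identify(clf, rng.normal(2.0, 1.0, (600, 3))))   # expected: user_b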
  4. A low-cost remote supervisory control capability is added to a packaging process, in which a low-voltage signal is used to communicate between a distant HMI control panel and a PLC network using the AC power line as the communication medium. The network has a star topology and uses a master-slave protocol. Remote supervisory control is achieved using a user-defined toolbox of control functions. In this system, a Programmable Logic Controller (PLC) controls the process and interfaces with the operator through a Human-Machine Interface (HMI) panel. A star-topology Ethernet network connects the PLCs and the HMI panel.
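The snippet below is only a generic sketch of the master-slave polling pattern mentioned above; the frame layout, addresses, opcode, and the send/receive callbacks that would wrap the power-line (or Ethernet) link are all invented for illustration and are not the system's actual protocol.

# Hypothetical sketch: a supervisory (master) node polling PLC slaves in turn.
# Only the addressed slave replies, so the shared medium stays collision-free.
import time

SLAVE_ADDRESSES = [0x01, 0x02, 0x03]        # assumed: three PLC drops on the network


def build_poll_frame(addr: int) -> bytes:
    # Minimal request frame: address byte, a made-up "read status" opcode, checksum.
    body = bytes([addr, 0x10])
    return body + bytes([sum(body) & 0xFF])


def supervisory_scan(send, receive, cycles=10, period_s=0.5):
    # Round-robin poll; `send` and `receive` wrap whatever physical link is used.
    for _ in range(cycles):
        for addr in SLAVE_ADDRESSES:
            send(build_poll_frame(addr))
            reply = receive(timeout_s=0.1)  # slave status bytes, or None on timeout
            if reply is not None:
                print(f"PLC 0x{addr:02X} status: {reply.hex()}")
        time.sleep(period_s)


# Toy usage with stub link callbacks (no real hardware attached).
supervisory_scan(send=lambda frame: None, receive=lambda timeout_s: b"\x00", cycles=1, period_s=0.0)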
  5. We present a shared control paradigm that improves a user’s ability to operate complex, dynamic systems in potentially dangerous environments without a priori knowledge of the user’s objective. In this paradigm, the role of the autonomous partner is to improve the general safety of the system without constraining the user’s ability to achieve unspecified behaviors. Our approach relies on a data-driven, model-based representation of the joint human-machine system to evaluate, in parallel, a significant number of potential inputs that the user may wish to provide. These samples are used to (1) predict the safety of the system over a receding horizon, and (2) minimize the influence of the autonomous partner. The resulting shared control algorithm maximizes the authority allocated to the human partner to improve their sense of agency, while improving safety. We evaluate the efficacy of our shared control algorithm with a human subjects study (n = 20) conducted in two simulated environments: a balance bot and a race car. During the experiment, users are free to operate each system however they would like (i.e., there is no specified task) and are only asked to try to avoid unsafe regions of the state space. Using modern computational resources (i.e., GPUs), our approach is able to consider more than 10,000 potential trajectories at each time step in a control loop running at 100 Hz for the balance bot and 60 Hz for the race car. The results of the study show that our shared control paradigm improves system safety without knowledge of the user’s goal, while maintaining high levels of user satisfaction and low levels of frustration. Our code is available online at https://github.com/asbroad/mpmisharedcontrol.
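As a highly simplified sketch of the sampling-based filter described above, the snippet below rolls out many candidate inputs through a toy model over a short horizon, discards the ones predicted to leave a safe set, and applies the surviving candidate closest to the user's command (minimal intervention). The double-integrator dynamics, bounds, horizon, and sample count are stand-ins, not the authors' model or implementation; see their repository for the real code.

# Hypothetical sketch: sampling-based, minimal-intervention shared control.
import numpy as np


def stays_safe(x0, u, horizon=50, dt=0.01, bound=1.0):
    # Toy double-integrator rollout; "safe" means |position| never exceeds the bound.
    pos, vel = x0
    for _ in range(horizon):
        vel += u * dt
        pos += vel * dt
        if abs(pos) > bound:
            return False
    return True


def shared_control(x0, u_user, n_samples=10_000, u_max=5.0, seed=0):
    # Sample candidate inputs, keep the ones predicted safe over the horizon,
    # and return the safe candidate closest to what the human asked for.
    rng = np.random.default_rng(seed)
    candidates = rng.uniform(-u_max, u_max, n_samples)
    safe = [u for u in candidates if stays_safe(x0, u)]
    if not safe:
        return 0.0                      # fallback when no sampled input is safe
    return min(safe, key=lambda u: abs(u - u_user))


# Toy usage: near the boundary and drifting toward it, the user's push (u = 4.0)
# is replaced by the closest input predicted to keep the system in bounds.
print(shared_control(x0=(0.9, 0.5), u_user=4.0))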