Title: The Design of Mid-Air Ultrasonic Haptic Interfaces Based on the Perception of Lines
Mid-air ultrasonic feedback is a new form of haptic stimulation supporting mid-air, touch-free user interfaces. Functional implementation of ultrasonic haptic (UH) interfaces depends upon the ability to accurately distinguish between the intensity, shape, orientation, and movement of a signal. This user study (N = 15) investigates the ability to non-visually perceive two ultrasonic lines with varying lengths (3, 5, and 7 cm) and orientations (vertical and horizontal) using the palm of the hand. Key results showed that: (1) the orientation of the lines had no effect on a user’s accuracy when determining their relative lengths, (2) line length distinction significantly improved when the length difference was at least 4 cm, and (3) a clear learning curve was evident when evaluating a new user’s ability to perceive ultrasonic signals. The capabilities of UH technology identified and discussed within this study will help engineer user-friendly and functional mid-air haptic interfaces for future applications.
Award ID(s):
1910603
PAR ID:
10474937
Author(s) / Creator(s):
; ; ; ; ;
Publisher / Repository:
AHFE International
Date Published:
Volume:
84
Page Range / eLocation ID:
18-26
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Ultrasonic haptic (UH) feedback employs mid-air ultrasound waves detectable by the palm of the hand. This interface demonstrates a novel opportunity to utilize non-visual input and output (I/O) functionalities in interactive applications, such as vehicle controls that allow the user to keep their eyes on the road. However, more work is needed to evaluate the usability of such an interface. In this study, 16 blindfolded participants completed tasks involving finding and counting UH buttons, associating buttons with audio cues, learning spatial arrangements, and determining button states. Results showed that users were generally successful with 2–4 arranged buttons and could associate them with audio cues with an average accuracy of 77.1%. Participants were also able to comprehend button spatial arrangements with 77.8% accuracy and complete reconstruction tasks that demonstrated their understanding. These results signify the capability of UH feedback to provide real-world I/O functionality and serve to guide future exploration in this area.
  2. Abstract: This paper aims to present a potential cybersecurity risk existing in mixed reality (MR)-based smart manufacturing applications that decipher digital passwords through a single RGB camera to capture the user’s mid-air gestures. We first created a test bed, which is an MR-based smart factory management system consisting of mid-air gesture-based user interfaces (UIs) on a video see-through MR head-mounted display. To interact with UIs and input information, the user’s hand movements and gestures are tracked by the MR system. We set up the experiment to be the estimation of the password input by users through mid-air hand gestures on a virtual numeric keypad. To achieve this goal, we developed a lightweight machine learning-based hand position tracking and gesture recognition method. This method takes either video streaming or recorded video clips (taken by a single RGB camera in front of the user) as input, where the videos record the users’ hand movements and gestures but not the virtual UIs. With the assumption of the known size, position, and layout of the keypad, the machine learning method estimates the password through hand gesture recognition and finger position detection. The evaluation result indicates the effectiveness of the proposed method, with a high accuracy of 97.03%, 94.06%, and 83.83% for 2-digit, 4-digit, and 6-digit passwords, respectively, using real-time video streaming as input under the known-length condition. Under the unknown-length condition, the proposed method reaches 85.50%, 76.15%, and 77.89% accuracy for 2-digit, 4-digit, and 6-digit passwords, respectively.
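The key assumption in the attack above is that the keypad's size, position, and layout are known, so a tracked fingertip coordinate can be mapped geometrically to a key. As a minimal sketch of that geometric step only (the names, keypad geometry, and coordinate convention here are hypothetical, and the paper's actual hand-tracking and gesture-recognition models are not reproduced):

```python
def keypad_digit(x, y, origin, key_size, layout=("123", "456", "789", " 0 ")):
    """Map a fingertip position (e.g. normalized image coordinates) to a key
    on a numeric keypad whose top-left corner and per-key size are known.
    Returns the digit as a string, or None if the point is off the keypad."""
    col = int((x - origin[0]) // key_size[0])
    row = int((y - origin[1]) // key_size[1])
    if 0 <= row < len(layout) and 0 <= col < len(layout[0]):
        ch = layout[row][col]
        return ch if ch != " " else None  # blanks flank the "0" key
    return None

# Hypothetical keypad: top-left at (0.2, 0.1), keys 0.2 x 0.2 units
digit = keypad_digit(0.25, 0.15, (0.2, 0.1), (0.2, 0.2))
```

A sequence of such lookups, one per detected "press" gesture, would reconstruct a candidate password; the hard part the paper addresses is reliably recovering the fingertip position and press events from monocular video.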
  3. Regular user interface screens can display dense and detailed information to human users but miss out on providing somatosensory stimuli that take full advantage of human spatial cognition. Therefore, the development of new haptic displays can strengthen human-machine communication by augmenting visual communication with tactile stimulation needed to transform information from digital to spatial/physical environments. Shape-changing interfaces, such as pin arrays and robotic surfaces, are one method for providing this spatial dimension of feedback; however, these displays are often either limited in maximum extension or require bulky mechanical components. In this paper, we present a compact pneumatically actuated soft growing pin for inflatable haptic interfaces. Each pin consists of a rigid, air-tight chamber, an inflatable fabric pin, and a passive spring-actuated reel mechanism. The device behavior was experimentally characterized, showing extension to 18.5 cm with relatively low pressure input (1.75 psi, 12.01 kPa), and the behavior was compared to the mathematical model of soft growing robots. The results showed that the extension of the soft pin can be accurately modeled and controlled using pressure as input. Finally, we demonstrate the feasibility of implementing individually actuated soft growing pins to create an inflatable haptic surface. 
  4. Despite non-co-location, haptic stimulation at the wrist can potentially provide feedback regarding interactions at the fingertips without encumbering the user’s hand. Here we investigate how two types of skin deformation at the wrist (normal and shear) relate to the perception of the mechanical properties of virtual objects. We hypothesized that a congruent mapping (i.e., when the most relevant interaction forces during a virtual interaction spatially match the haptic feedback at the wrist) would result in better perception than other mappings. We performed an experiment where haptic devices at the wrist rendered either normal or shear feedback during manipulation of virtual objects with varying stiffness, mass, or friction properties. Perception of mechanical properties was more accurate with congruent skin stimulation than noncongruent. In addition, discrimination performance and subjective reports were positively influenced by congruence. This study demonstrates that users can perceive mechanical properties via haptic feedback provided at the wrist with a consistent mapping between haptic feedback and interaction forces at the fingertips, regardless of congruence.
  5. This work reports a platform based on ultrasound for mid-air particle manipulations using a 2×2 piezoelectric micromachined ultrasonic transducer (pMUT) array. Three achievements have been demonstrated as compared to the state of the art: (1) a high sound pressure level (SPL) of 120 dB at a distance of 12 mm from an individual lithium-niobate pMUT; (2) a numerically simulated and experimentally demonstrated 2D focal point control scheme that adjusts the phase delay of individual pMUTs; and (3) the experimental demonstration of moving a 0.7 mg foam plastic particle, located 12 mm away in mid-air, by ~1.8 mm. As such, this work shows the potential for practical applications in the broad fields of non-contact actuation, including particle manipulation in microfluidics, touchless haptic sensations, etc.
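The focal point control in item 5 rests on standard delay-and-sum phased-array focusing: each transducer is delayed so that all emissions arrive at the focal point in phase. Below is a minimal sketch of that textbook computation, not the authors' implementation; the 2×2 geometry, 5 mm pitch, and 40 kHz drive frequency are illustrative assumptions (the paper does not specify them here):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def focusing_delays(elements, focus, freq_hz):
    """Per-element time delays (s) and phase delays (rad) so that all
    emissions arrive in phase at the focal point: the farthest element
    fires first (zero delay), nearer elements wait out the path difference."""
    dists = [math.dist(p, focus) for p in elements]
    d_max = max(dists)
    delays = [(d_max - d) / SPEED_OF_SOUND for d in dists]
    phases = [(2 * math.pi * freq_hz * t) % (2 * math.pi) for t in delays]
    return delays, phases

# Hypothetical 2x2 array with 5 mm pitch in the z = 0 plane,
# focusing on a point 12 mm above and 5 mm to the side of center.
elems = [(x, y, 0.0) for x in (-0.0025, 0.0025) for y in (-0.0025, 0.0025)]
delays, phases = focusing_delays(elems, (0.005, 0.0, 0.012), 40e3)
```

Steering the focal point in 2D, as the paper demonstrates, then amounts to recomputing these per-element phase delays for each new target position.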