
Title: Does it Press? Investigating the Efficacy of an Ultrasonic Haptic Button Interface for Non-Visual Driving Applications
Ultrasonic haptic (UH) feedback employs mid-air ultrasound waves detectable by the palm of the hand. This interface presents a novel opportunity to provide non-visual input and output (I/O) functionality in interactive applications, such as vehicle controls that allow the user to keep their eyes on the road. However, more work is needed to evaluate the usability of such an interface. In this study, 16 blindfolded participants completed tasks involving finding and counting UH buttons, associating buttons with audio cues, learning spatial arrangements, and determining button states. Results showed that users were generally successful with 2–4 arranged buttons and could associate them with audio cues with an average accuracy of 77.1%. Participants were also able to comprehend button spatial arrangements with 77.8% accuracy and complete reconstruction tasks that demonstrated their understanding. These results signify the capability of UH feedback to support real-world I/O functionality and serve to guide future exploration in this area.
Award ID(s):
1910603
PAR ID:
10474938
Author(s) / Creator(s):
Publisher / Repository:
AHFE International
Date Published:
Volume:
95
Page Range / eLocation ID:
343-543
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We explore Spatial Augmented Reality (SAR) precues (predictive cues) for procedural tasks within and between workspaces and for visualizing multiple upcoming steps in advance. We designed precues based on several factors: cue type, color transparency, and the number of precues shown at once (multi-level). Precues were evaluated in a procedural task requiring the user to press buttons in three surrounding workspaces. Participants performed fastest in conditions where tasks were linked with line cues with different levels of color transparency. Precue performance was also affected by whether the next task was in the same workspace or a different one.
  2. Haptic feedback can provide operators of hand-held robots with active guidance during challenging tasks and with critical information on environment interactions. Yet for such haptic feedback to be effective, it must be lightweight, capable of integration into a hand-held form factor, and capable of displaying easily discernible cues. We present the design and evaluation of HaPPArray — a haptic pneumatic pouch array — where the pneumatic pouches can be actuated alone or in sequence to provide information to the user. A 3x3 array of pouches was integrated into a handle, representative of an interface for a hand-held robot. When pouches were actuated individually, users were able to correctly identify the actuated pouch with 86% accuracy, and when pouches were actuated in sequence, users were able to correctly identify the associated direction cue with 89% accuracy. These results, along with a demonstration of how the direction cues can be used for haptic guidance of a medical robot, suggest that HaPPArray can be an effective approach for providing haptic feedback for hand-held robots.
  3. Social VR has increased in popularity due to its affordances for rich, embodied, and nonverbal communication. However, nonverbal communication remains inaccessible for blind and low vision people in social VR. We designed accessible cues with audio and haptics to represent three nonverbal behaviors: eye contact, head shaking, and head nodding. We evaluated these cues in real-time conversation tasks where 16 blind and low vision participants conversed with two other users in VR. We found that the cues were effective in supporting conversations in VR. Participants had statistically significantly higher scores for accuracy and confidence in detecting attention during conversations with the cues than without. We also found that participants had a range of preferences and uses for the cues, such as learning social norms. We present design implications for handling additional cues in the future, such as the challenges of incorporating AI. Through this work, we take a step towards making interpersonal embodied interactions in VR fully accessible for blind and low vision people. 
  4. Virtual environments (VEs) can be infinitely large, but movement of the virtual reality (VR) user is constrained by the surrounding real environment. Teleporting has become a popular locomotion interface to allow complete exploration of the VE. To teleport, the user selects the intended position (and sometimes orientation) before being instantly transported to that location. However, locomotion interfaces such as teleporting can cause disorientation. This experiment explored whether practice and feedback when using the teleporting interface can reduce disorientation. VR headset owners participated remotely. On each trial of a triangle completion task, the participant traveled along two path legs through a VE before attempting to point to the path origin. Travel was completed with one of two teleporting interfaces that differed in the availability of rotational self-motion cues. Participants in the feedback condition received feedback about their pointing accuracy. For both teleporting interfaces tested, feedback caused significant improvement in pointing performance, and practice alone caused only marginal improvement. These results suggest that disorientation in VR can be reduced through feedback-based training. 
  5. Following tetraplegia, independence for completing essential daily tasks, such as opening doors and eating, significantly declines. Assistive robotic manipulators (ARMs) could restore independence, but typically input devices for these manipulators require functional use of the hands. We created and validated a hands-free multimodal input system for controlling an ARM in virtual reality using combinations of a gyroscope, eye-tracking, and heterologous surface electromyography (sEMG). These input modalities are mapped to ARM functions based on the user’s preferences and to maximize the utility of their residual volitional capabilities following tetraplegia. The two participants in this study with tetraplegia preferred to use the control mapping with sEMG button functions and disliked winking commands. Non-disabled participants were more varied in their preferences and performance, further suggesting that customizability is an advantageous component of the control system. Replacing buttons from a traditional handheld controller with sEMG did not substantively reduce performance. The system provided adequate control to all participants to complete functional tasks in virtual reality such as opening door handles, turning stove dials, eating, and drinking, all of which enable independence and improved quality of life for these individuals. 