Title: The HapBack: Evaluation of Absolute and Relative Distance Encoding to Enhance Spatial Awareness in a Wearable Tactile Device
For the significant global population of individuals who are blind or visually impaired, spatial awareness during navigation remains a challenge. Tactile Electronic Travel Aids have been designed to assist with the provision of spatiotemporal information, but an intuitive method for mapping this information to patterns on a vibrotactile display remains to be determined. This paper explores the encoding of the distance from a navigator to an object using two strategies: absolute and relative. A wearable prototype, the HapBack, is presented with two straps of vertically aligned vibrotactile motors, with each of five distances mapped to a row on the display. Absolute patterns emit a single vibration at the row corresponding to a distance, while relative patterns emit a sequence of vibrations starting from the bottom row and ending at the row mapped to that distance. The two encoding strategies are comparatively evaluated for identification accuracy and perceived intuitiveness of the mapping among ten adult participants who are blind or visually impaired. No significant difference was found between the two encodings on these metrics, and each showed promising results for application during navigation tasks.
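The two encoding strategies described above can be sketched as follows. This is a minimal illustration only: the row indices, distance bands, and function names are assumptions for the sketch, not the HapBack's actual parameters.

```python
# Sketch of the absolute and relative distance encodings.
# Five vertically aligned motor rows; row 1 is the bottom row.
# Distance bands are illustrative assumptions.

ROWS = 5

def distance_to_row(distance_m, bands=(1.0, 2.0, 3.0, 4.0, 5.0)):
    """Map a distance to one of five rows (nearer object -> lower row)."""
    for row, limit in enumerate(bands, start=1):
        if distance_m <= limit:
            return row
    return ROWS  # beyond the last band: farthest row

def absolute_pattern(distance_m):
    """Absolute encoding: a single vibration at the mapped row."""
    return [distance_to_row(distance_m)]

def relative_pattern(distance_m):
    """Relative encoding: a sequence from the bottom row up to the mapped row."""
    return list(range(1, distance_to_row(distance_m) + 1))
```

The trade-off the study evaluates is visible here: the absolute pattern is shorter to deliver, while the relative pattern conveys the distance through the length of the sweep as well as its endpoint.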
Award ID(s):
1828010
NSF-PAR ID:
10277360
Author(s) / Creator(s):
Date Published:
Journal Name:
Lecture notes in computer science
Volume:
12426 LNCS
ISSN:
1611-3349
Page Range / eLocation ID:
251-266
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Given that most cues exchanged during a social interaction are nonverbal (e.g., facial expressions, hand gestures, body language), individuals who are blind are at a social disadvantage compared to their sighted peers. Very little work has explored sensory augmentation in the context of social assistive aids for individuals who are blind. The purpose of this study is to explore the following questions related to visual-to-vibrotactile mapping of facial action units (the building blocks of facial expressions): (1) How well can individuals who are blind recognize tactile facial action units compared to those who are sighted? (2) How well can individuals who are blind recognize emotions from tactile facial action units compared to those who are sighted? These questions are explored in a preliminary pilot test using absolute identification tasks in which participants learn and recognize vibrotactile stimulations presented through the Haptic Chair, a custom vibrotactile display embedded on the back of a chair. Study results show that individuals who are blind are able to recognize tactile facial action units as well as those who are sighted. These results hint at the potential for tactile facial action units to augment and expand access to social interactions for individuals who are blind. 
  2.
    iASSIST is an iPhone-based assistive sensor solution for independent and safe travel for people who are blind or visually impaired, or for anyone who faces challenges navigating an unfamiliar indoor environment. The solution integrates information from Bluetooth beacons, data connectivity, visual models, and user preferences. During a modeling stage, hybrid models of interiors are built from these multimodal data, which are collected and mapped to the floor plan as the modeler walks through the building. A client-server architecture allows scaling to large areas by lazy-loading models according to beacon signals and/or adjacent-region proximity. During the navigation stage, the app localizes the user within the floor plan using visual, connectivity, and user preference data, and guides them along an optimal route to their destination. User interfaces for both modeling and navigation use multimedia channels, including visual, audio, and haptic feedback, for the targeted users. The design of human subject experiments is also described, along with some preliminary experimental results.
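The lazy-loading pattern mentioned above, fetching a region's model only when a beacon from that region is first heard, can be sketched as below. The class and function names are hypothetical illustrations, not the iASSIST API.

```python
# Hypothetical sketch of lazy-loading region models by beacon proximity.
# `load_fn` stands in for a server fetch; `region_of` maps a beacon ID
# to its region. Both are assumptions for illustration.

class ModelCache:
    def __init__(self, load_fn):
        self._load = load_fn   # fetches one region's model (e.g., from a server)
        self._cache = {}       # region -> loaded model

    def model_for_beacon(self, beacon_id, region_of):
        """Return the model for the beacon's region, loading it at most once."""
        region = region_of(beacon_id)
        if region not in self._cache:
            self._cache[region] = self._load(region)
        return self._cache[region]
```

Keeping only the regions a user actually walks through in memory is what lets the client scale to large buildings without downloading the whole model up front.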
  3.
    Though virtual reality (VR) has reached a certain level of maturity in recent years, the general public, especially the population of the blind and visually impaired (BVI), still cannot enjoy the benefits VR provides. Current VR accessibility applications have been developed either on expensive head-mounted displays or with extra accessories and mechanisms, which are either inaccessible or inconvenient for BVI individuals. In this paper, we present a mobile VR app that enables BVI users to access a virtual environment on an iPhone in order to build their skills of perception and recognition of the virtual environment and the virtual objects in it. The app uses the iPhone on a selfie stick to simulate a long cane in VR, and applies Augmented Reality (AR) techniques to track the iPhone's real-time poses in an empty space of the real world, which are then synchronized to the long cane in the VR environment. Because of this use of mixed reality (the integration of VR and AR), we call it the Mixed Reality Cane (MR Cane); it provides BVI users with auditory and vibrotactile feedback whenever the virtual cane comes in contact with objects in VR. Thus, the MR Cane allows BVI individuals to interact with virtual objects and identify their approximate sizes and locations in the virtual environment. We performed preliminary user studies with blindfolded participants to investigate the effectiveness of the proposed mobile approach, and the results indicate that the MR Cane can help BVI individuals understand interaction with virtual objects and explore 3D virtual environments. The MR Cane concept can be extended to new navigation, training, and entertainment applications for BVI individuals without significant additional effort.
  4. Blind & visually impaired individuals often face challenges in wayfinding in unfamiliar environments. Thus, an accessible indoor positioning and navigation system that safely and accurately positions and guides such individuals would be welcome. In indoor positioning, both Bluetooth Low Energy (BLE) beacons and Google Tango have their individual strengths but also have weaknesses that can affect the overall usability of a system that solely relies on either component. We propose a hybrid positioning and navigation system that combines both BLE beacons and Google Tango in order to tap into their strengths while minimizing their individual weaknesses. In this paper, we will discuss the approach and implementation of a BLE- and Tango-based hybrid system. The results of pilot tests on the individual components and a human subject test on the full BLE and hybrid systems are also presented. In addition, we have explored the use of vibrotactile devices to provide additional information to a user about their surroundings. 
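One simple way such a hybrid could combine its two sources is to prefer the visual (Tango-style) pose when tracking is reliable and fall back to the nearest BLE beacon otherwise. This is a hedged sketch of that idea, not necessarily the fusion method the paper implements; all names and thresholds are illustrative assumptions.

```python
# Hypothetical BLE/visual-tracking fusion sketch.
# tango_pose: (x, y) from visual tracking, or None if tracking is lost.
# beacon_readings: list of {"rssi": dBm, "position": (x, y)} dicts.

def fuse_position(tango_pose, tango_confidence, beacon_readings,
                  confidence_threshold=0.8):
    """Return an (x, y) estimate from the more trustworthy source."""
    if tango_pose is not None and tango_confidence >= confidence_threshold:
        return tango_pose  # visual tracking: precise over short spans
    # Fall back to the strongest (roughly nearest) beacon's known position.
    strongest = max(beacon_readings, key=lambda b: b["rssi"])
    return strongest["position"]
```

This mirrors the stated design goal: each component covers the other's weakness, with coarse but robust beacon positions backing up precise but occasionally lost visual tracking.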
  5. People with disabilities stand to benefit greatly from electronic assistance, and a number of assistive devices already make their daily lives easier. In this paper, we describe the design and implementation of a personal assistant robot for blind people. Visually impaired people need such personal assistant devices because they provide real-time help with the main problems blind people face: navigating indoors, identifying surrounding objects without having to touch them, and sensing the environment along with the distances to multiple objects. Our paper discusses application-targeted features such as using LIDAR for local mapping and a 3D camera for estimating the depth of the surroundings, so that the user can understand the distance and other properties of nearby objects. The design has been experimentally validated, and the relevant observations are reported in this paper.