Title: Virtual Guide Dog: Next-generation pedestrian signal for the visually impaired
Accessible pedestrian signals were proposed as a means to provide the visually impaired with the level of service set forth by the Americans with Disabilities Act. A major shortcoming of existing accessible pedestrian signals is their failure to deliver adequate crossing information to the visually impaired. This article presents a mobile accessible pedestrian signal application named Virtual Guide Dog. By integrating intersection information with the onboard sensors (e.g. GPS, compass, accelerometer, and gyroscope) of modern smartphones, the Virtual Guide Dog application can notify visually impaired users of (1) their proximity to an intersection and (2) the street information needed for crossing. Through a screen-tapping interface, Virtual Guide Dog can remotely place a pedestrian crossing call to the signal controller without the need for a physical pushbutton. In addition, Virtual Guide Dog announces the start of a crossing phase using text-to-speech technology. A proof-of-concept test shows that Virtual Guide Dog keeps users informed of the remaining distance as they approach an intersection. It was also found that the GPS-only mode exhibits greater distance deviation than the mode operating jointly with GPS and cellular positioning.
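The proximity-notification behavior described in the abstract can be viewed as a periodic check of the GPS distance between the user and a known intersection location. A minimal sketch follows; the function names, the announcement radius, and the message format are illustrative assumptions, not details from the paper:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def proximity_message(user_fix, intersection_fix, announce_radius_m=30.0):
    """Return a spoken-text update once the user is within the announce radius,
    suitable for handing to a text-to-speech engine; None otherwise."""
    d = haversine_m(*user_fix, *intersection_fix)
    if d <= announce_radius_m:
        return f"Approaching intersection, {d:.0f} meters remaining"
    return None
```

In a real app this check would run on each location update from the phone's positioning service, and the returned string would be spoken aloud via the platform's text-to-speech API.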
Award ID(s):
1844238
PAR ID:
10574613
Author(s) / Creator(s):
Publisher / Repository:
SAGE Publications
Date Published:
Journal Name:
Advances in Mechanical Engineering
Volume:
12
Issue:
3
ISSN:
1687-8140
Format(s):
Medium: X Size: Article No. 168781401988309
Sponsoring Org:
National Science Foundation
More Like this
  1.
    For the significant global population of individuals who are blind or visually impaired, spatial awareness during navigation remains a challenge. Tactile Electronic Travel Aids have been designed to assist with the provision of spatiotemporal information, but an intuitive method for mapping this information to patterns on a vibrotactile display remains to be determined. This paper explores the encoding of distance from a navigator to an object using two strategies: absolute and relative. A wearable prototype, the HapBack, is presented with two straps of vertically aligned vibrotactile motors mapped to five distances, with each distance mapped to a row on the display. Absolute patterns emit a single vibration at the row corresponding to a distance, while relative patterns emit a sequence of vibrations starting from the bottom row and ending at the row mapped to that distance. These two encoding strategies are comparatively evaluated for identification accuracy and perceived intuitiveness of mapping among ten adult participants who are blind or visually impaired. No significant difference was found between the intuitiveness of the two encodings based on these metrics, with each showing promising results for application during navigation tasks. 
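The two distance-encoding strategies compared above have a compact algorithmic form. In the sketch below, rows are indexed from 0 at the bottom of the display, and the function names and five-row default are illustrative assumptions rather than the HapBack's actual implementation:

```python
def absolute_pattern(row, num_rows=5):
    """Absolute encoding: a single vibration at the row mapped to the distance."""
    assert 0 <= row < num_rows
    return [row]

def relative_pattern(row, num_rows=5):
    """Relative encoding: a sequence of vibrations starting from the bottom row
    and ending at the row mapped to the distance."""
    assert 0 <= row < num_rows
    return list(range(row + 1))

# For the middle of five rows:
# absolute_pattern(2) -> [2]
# relative_pattern(2) -> [0, 1, 2]
```

Each returned list would then be played back on the vertically aligned motors of one strap, one vibration per entry.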
  2. W4A '23: Proceedings of the 20th International Web for All Conference, April 2023, pp. 32–43
    Despite growing interest in accessible texting for people who are blind or visually impaired (BVI), little is known about the practice of texting while on the move, especially when using assistive technologies. To address this gap, we conducted an interview-based study with 20 BVI people who text while travelling. Our findings revealed that participants text outside their home in four recurrent situations: walking to a destination, waiting for public transportation, riding in a vehicle, or approaching a point of interest. Moreover, to send a text safely, participants expressed the need for a range of information about their surroundings, including the distance to the destination, upcoming obstacles, traffic jams, and weather conditions. Based on these findings, we examine three modes of situational feedback cues to integrate with messaging applications: text-based, sound effects, and tactile. Our work discusses design directions to enhance the texting experience in nomadic contexts for people who are blind or visually impaired.
  3. GPS accuracy is poor in indoor environments and around buildings, so reading and following signs remains the most common way to provide and receive wayfinding information in such spaces. This puts individuals who are blind or visually impaired (BVI) at a great disadvantage. This work designs, implements, and evaluates a wayfinding system and smartphone application, CityGuide, that BVI individuals can use to navigate their surroundings beyond what is possible with a GPS-based system alone. CityGuide enables an individual to query and receive turn-by-turn shortest-route directions from an indoor location to an outdoor location. It leverages recently developed indoor wayfinding solutions in conjunction with GPS signals to provide a seamless indoor-outdoor navigation and wayfinding system that guides a BVI individual to their desired destination along the shortest route. Evaluations of CityGuide with BVI human subjects navigating from an indoor starting point to an outdoor destination on an unfamiliar university campus showed it to be effective in reducing end-to-end navigation times and distances for almost all participants.
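Seamless indoor-outdoor routing of the kind CityGuide describes can be modeled as shortest-path search over one merged graph whose nodes span indoor waypoints and outdoor landmarks. A minimal Dijkstra sketch follows; the node names, graph format, and edge weights are illustrative assumptions, not CityGuide's actual data model:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's shortest path over a unified indoor-outdoor graph.
    graph maps each node to a list of (neighbor, distance_in_meters) pairs."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    visited = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        if u == goal:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if goal not in dist:
        return None  # no indoor-outdoor path connects the two nodes
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

# Illustrative merged graph: indoor nodes flow into outdoor ones (names hypothetical).
campus = {
    "room_101": [("hallway", 10.0)],
    "hallway": [("building_exit", 25.0)],
    "building_exit": [("bus_stop", 120.0)],
}
# shortest_route(campus, "room_101", "bus_stop")
# -> ["room_101", "hallway", "building_exit", "bus_stop"]
```

The turn-by-turn directions would then be generated from consecutive node pairs along the returned path.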
  4.
    This work presents a novel prototype autonomous vehicle (AV) human-machine interface (HMI) in virtual reality (VR) that utilizes a human-like visual embodiment in the driver’s seat of an AV to communicate AV intent to pedestrians in a crosswalk scenario. There is currently a gap in understanding the use of virtual humans in AV HMIs for pedestrian crossing, despite the demonstrated efficacy of human-like interfaces in improving human-machine relationships. We conduct a 3x2 within-subjects experiment in VR using our prototype to assess the effects of a virtual human visual embodiment AV HMI on pedestrian crossing behavior and experience. In the experiment, participants walk across a virtual crosswalk in front of an AV. We collected how long participants took to decide to cross and how long they took to reach the other side, in addition to their subjective preferences and feelings of safety. Of 26 participants, 25 preferred the condition with the most anthropomorphic features. An intermediate condition, in which a human-like virtual driver was present but did not exhibit any behaviors, was least preferred and also had a significant effect on time to decide. This work contributes the first empirical work on using human-like visual embodiments for AV HMIs.
  5. Millions of children around the world learn to code by creating with Scratch and other block-based programming languages. However, these programming environments typically are not accessible for blind and visually impaired children to tinker, create, and learn alongside their sighted peers. This paper discusses the ongoing development of the OctoStudio coding app to support accessibility and tinkerability for blind and visually impaired learners. We discuss how we have applied core principles of tinkerability to create an accessible, mainstream app for use on mobile phones and tablets. We describe our iterative development process in collaboration with educators who specialize in the design and testing of accessible technologies for children. We conclude with suggestions for how the core principles of designing for tinkerability can be expanded to support accessibility and engagement of blind and visually impaired learners internationally. 