StreetNav: Leveraging street cameras to support precise outdoor navigation for blind pedestrians

Blind and low-vision (BLV) people rely on GPS-based systems for outdoor navigation. GPS's inaccuracy, however, causes them to veer off track, run into obstacles, and struggle to reach precise destinations. While prior work has made precise navigation possible indoors via hardware installations, enabling this outdoors remains a challenge. Interestingly, many outdoor environments are already instrumented with hardware such as street cameras. In this work, we explore the idea of repurposing *existing* street cameras for outdoor navigation. Our community-driven approach considers both technical and sociotechnical concerns through engagements with various stakeholders: BLV users, residents, business owners, and Community Board leadership. The resulting system, StreetNav, processes a camera's video feed using computer vision and gives BLV pedestrians real-time navigation assistance. Our evaluations show that StreetNav guides users more precisely than GPS, but its technical performance is sensitive to environmental occlusions and distance from the camera. We discuss future implications for deploying such systems at scale.
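The abstract does not detail StreetNav's vision pipeline, but the core idea of turning a fixed street camera into a positioning sensor can be sketched with a planar homography: given a few surveyed correspondences between image pixels and ground-plane coordinates, a detected pedestrian's pixel location can be projected to a position on the sidewalk. The correspondence values below are hypothetical, and the paper's actual method may differ.

```python
import numpy as np

def fit_homography(pixels, ground):
    """Fit a 3x3 planar homography from >= 4 pixel->ground correspondences
    via the Direct Linear Transform (two equations per point)."""
    A = []
    for (x, y), (X, Y) in zip(pixels, ground):
        A.append([-x, -y, -1, 0, 0, 0, X * x, X * y, X])
        A.append([0, 0, 0, -x, -y, -1, Y * x, Y * y, Y])
    # The homography is the null vector of A (last right-singular vector).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize scale

def pixel_to_ground(H, x, y):
    """Project an image pixel to ground-plane coordinates (meters)."""
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]

# Example: four surveyed pixel<->ground correspondences (hypothetical values)
H = fit_homography([(100, 100), (500, 100), (500, 400), (100, 400)],
                   [(0, 0), (10, 0), (10, 8), (0, 8)])
x_m, y_m = pixel_to_ground(H, 300, 250)  # pedestrian detected at pixel (300, 250)
```

In practice a deployed system would refit the homography whenever the camera moves and combine this projection with a person detector and tracker; the sketch covers only the geometric step.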
- Award ID(s): 2038984
- PAR ID: 10545232
- Publisher / Repository: arXiv:2310.00491 [cs.HC]
- Date Published: Oct. 2023
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
A wide variety of mobile phone emergency response applications exist for both indoor and outdoor environments. However, outdoor applications mostly provide accident and navigation information to users, and indoor applications are limited by the unavailability of GPS positioning and WiFi access. This paper describes the proposed mobile augmented reality system (MARS), which allows both outdoor and indoor users to retrieve and manage emergency response and navigation information that is spatially registered with the real world. The proposed MARS utilizes feature extraction for location sensing in indoor environments, as GPS and WiFi systems might not work during emergencies. This paper describes the implementation of this MARS, deployed on tablets and smartphones for building evacuation purposes. The MARS delivers critical evacuation information to smartphone users in indoor environments and navigation information in outdoor environments. A limited user study was conducted to test the effectiveness of the proposed MARS using the mobile phone usability questionnaire (MPUQ) framework. The results show that AR features were well integrated into the MARS and that it will help identify the nearest exit in the building during emergency evacuation.
-
This paper proposes an AR-based real-time mobile system for assistive indoor navigation with target segmentation (ARMSAINTS) for both sighted and blind or low-vision (BLV) users to safely explore and navigate an indoor environment. The solution comprises four major components: graph construction, hybrid modeling, real-time navigation, and target segmentation. The system utilizes an automatic graph construction method to generate a graph from a 2D floorplan and a Delaunay triangulation-based localization method to provide precise localization with negligible error. The 3D obstacle detection method integrates the existing capability of AR with a 2D object detector and a semantic target segmentation model to detect and track 3D bounding boxes of obstacles and people, increasing BLV users' safety and understanding when traveling in an indoor environment. The entire system does not require the installation and maintenance of expensive infrastructure, runs in real time on a smartphone, and can easily adapt to environmental changes.
-
Social interactions often rely on the interpretation of visual cues. The inaccessibility of these nonverbal signals can relegate blind and low vision (BLV) people to the social periphery and restrict their independence in forming connections with others. Many assistive technologies for BLV people focus on spatial navigation or object recognition, but there is a gap in supporting “social wayfinding”: the capacity to perceive and navigate human interaction dynamics. To address this, we explore the design space of a wearable system that provides BLV users with real-time details about the social environment. The system, SocialCue, serves as a technology probe to explore BLV people’s preferences for social navigation assistance. We conducted a two-phase formative study which identified four social attributes that the system should communicate: identity, social availability, facial expression, and physical descriptions including clothing and hairstyle. We describe the implementation of SocialCue and close by discussing our future evaluation plans.
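Several of the related works above rest on navigation-graph techniques; for instance, the ARMSAINTS abstract mentions automatically constructing a graph from a 2D floorplan and routing over it. A minimal, hypothetical sketch of that general idea, assuming a simple occupancy grid (0 = free, 1 = wall) rather than the paper's actual (unspecified) construction method, might look like:

```python
from collections import deque

def grid_to_graph(floorplan):
    """Build an adjacency dict from a 2D occupancy grid (0 = free, 1 = wall)."""
    rows, cols = len(floorplan), len(floorplan[0])
    graph = {}
    for r in range(rows):
        for c in range(cols):
            if floorplan[r][c] == 0:
                neighbors = []
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and floorplan[nr][nc] == 0:
                        neighbors.append((nr, nc))
                graph[(r, c)] = neighbors
    return graph

def shortest_route(graph, start, goal):
    """Breadth-first search for a shortest path over the navigation graph."""
    frontier, came_from = deque([start]), {start: None}
    while frontier:
        node = frontier.popleft()
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for nxt in graph[node]:
            if nxt not in came_from:
                came_from[nxt] = node
                frontier.append(nxt)
    return None  # goal unreachable
```

A real system would weight edges by distance and attach semantic labels (doors, landmarks, obstacles) to nodes; this sketch shows only the graph-and-routing skeleton shared by such approaches.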