
Search for: All records

Award ID contains: 1137172


  1. Brain damage or disruption to the primary visual cortex sometimes produces blindsight, a striking condition in which patients lose the ability to consciously detect visual information yet retain the ability to discriminate some attributes without awareness. Although there have been a few demonstrations of somatosensory equivalents of blindsight, the lesions that produce “numbsense,” in which patients can make accurate guesses about tactile information without awareness, have been rare and localized to different regions of the brain. Despite transient loss of tactile awareness in the contralateral hand after transcranial magnetic stimulation (TMS) of the primary somatosensory cortex but not TMS of a control site, 12 participants (six female) reliably performed at above-chance levels on a localization task. These results demonstrating TMS-induced numbsense implicate a parallel somatosensory pathway that processes the location of touch in the absence of awareness and highlight the importance of primary sensory cortices for conscious perception.
  2. This paper describes the interface and testing of an indoor navigation app - ASSIST - that guides blind & visually impaired (BVI) individuals through an indoor environment with high accuracy while augmenting their understanding of the surrounding environment. ASSIST features personalized interfaces by considering the unique experiences that BVI individuals have in indoor wayfinding and offers multiple levels of multimodal feedback. After an overview of the technical approach and implementation of the first prototype of the ASSIST system, the results of two pilot studies performed with BVI individuals are presented. Our studies show that ASSIST is useful in providing users with navigational guidance, improving their efficiency and (more significantly) their safety and accuracy in wayfinding indoors.
  3. This paper describes the interface and testing of an indoor navigation app - ASSIST - that guides blind & visually impaired (BVI) individuals through an indoor environment with high accuracy while augmenting their understanding of the surrounding environment. ASSIST features personalized interfaces by considering the unique experiences that BVI individuals have in indoor wayfinding and offers multiple levels of multimodal feedback. After an overview of the technical approach and implementation of the first prototype of the ASSIST system, the results of two pilot studies performed with BVI individuals are presented – a performance study to collect data on mobility (walking speed, collisions, and navigation errors) while using the app, and a usability study to collect user evaluation data on the perceived helpfulness, safety, ease-of-use, and overall experience while using the app. Our studies show that ASSIST is useful in providing users with navigational guidance, improving their efficiency and (more significantly) their safety and accuracy in wayfinding indoors. Findings and user feedback from the studies confirm some of the previous results, while also providing some new insights into the creation of such an app, including the use of customized user interfaces and expanding the types of information provided.
  4. Blind & visually impaired (BVI) individuals and those with Autism Spectrum Disorder (ASD) each face unique challenges in navigating unfamiliar indoor environments. In this paper, we propose an indoor positioning and navigation system that guides a user from point A to point B indoors with high accuracy while augmenting their situational awareness. This system has three major components: location recognition (a hybrid indoor localization app that uses Bluetooth Low Energy beacons and Google Tango to provide high accuracy), object recognition (a body-mounted camera to provide the user momentary situational awareness of objects and people), and semantic recognition (map-based annotations to alert the user of static environmental characteristics). This system also features personalized interfaces built upon the unique experiences that both BVI and ASD individuals have in indoor wayfinding and tailors its multimodal feedback to their needs. Here, the technical approach and implementation of this system are discussed, and the results of human subject tests with both BVI and ASD individuals are presented. In addition, we discuss and show the system’s user-centric interface and present points for future work and expansion.
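The three components described above (location, object, and semantic recognition) each produce a different kind of output that must be combined into one multimodal cue for the user. As an illustrative sketch only (not the authors' implementation; the `Observation` type and `compose_feedback` function are hypothetical names), merging the three channels into a single spoken message might look like this:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Observation:
    location: str          # from the hybrid BLE/Tango localizer
    obstacles: List[str]   # from the body-mounted camera's object recognizer
    annotations: List[str] # from the semantic (map-based) annotations

def compose_feedback(obs: Observation) -> str:
    """Merge the three recognition channels into one feedback message."""
    parts = [f"You are at {obs.location}."]
    if obs.obstacles:
        parts.append("Ahead: " + ", ".join(obs.obstacles) + ".")
    if obs.annotations:
        parts.append("Note: " + ", ".join(obs.annotations) + ".")
    return " ".join(parts)
```

A string like this could then be rendered through whichever modality the user's personalized interface selects (speech, vibration patterns, or on-screen text).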
  5. In this work, we use a generative adversarial network (GAN) to train crowd counting networks using minimal data. We describe how GAN objectives can be modified to allow for the use of unlabeled data to benefit inference training in semi-supervised learning. More generally, we explain how these same methods can be used in more generic multiple regression target semi-supervised learning, with crowd counting being a demonstrative example. Given a convolutional neural network (CNN) with capabilities equivalent to the discriminator in the GAN, we provide experimental results which show that our GAN is able to outperform the CNN even when the CNN has access to significantly more labeled data. This presents the potential of training such networks to high accuracy with little data. Our primary goal is not to outperform the state-of-the-art using an improved method on the entire dataset, but instead we work to show that through semi-supervised learning we can reduce the data required to train an inference network to a given accuracy. To this end, systematic experiments are performed with various numbers of images and cameras to show under which situations the semi-supervised GANs can improve results.
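To illustrate the kind of objective involved in such semi-supervised training (a hedged sketch, not the paper's actual loss; the function name, epsilon, and weighting constant are made up), one can combine a supervised regression loss on the labeled crowd counts with a discriminator-style adversarial term on unlabeled images:

```python
import math

def semi_supervised_loss(preds, counts, d_real, d_fake, w_adv=0.1):
    """Illustrative combined objective for semi-supervised regression.

    preds/counts: predicted and true crowd counts for labeled images.
    d_real/d_fake: discriminator scores on real (unlabeled) and generated images.
    """
    # Supervised mean-squared error on the labeled crowd counts.
    sup = sum((p - c) ** 2 for p, c in zip(preds, counts)) / len(counts)
    # GAN-style adversarial term, which needs no count labels at all.
    eps = 1e-8
    adv = -sum(math.log(r + eps) + math.log(1.0 - f + eps)
               for r, f in zip(d_real, d_fake)) / len(d_real)
    return sup + w_adv * adv
```

The point of the sketch is the structure, not the numbers: the adversarial term lets unlabeled images contribute gradient signal, which is how the GAN can match or beat a plain CNN trained on more labeled data.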
  6. Blind & visually impaired individuals often face challenges in wayfinding in unfamiliar environments. Thus, an accessible indoor positioning and navigation system that safely and accurately positions and guides such individuals would be welcome. In indoor positioning, both Bluetooth Low Energy (BLE) beacons and Google Tango have their individual strengths but also have weaknesses that can affect the overall usability of a system that solely relies on either component. We propose a hybrid positioning and navigation system that combines both BLE beacons and Google Tango in order to tap into their strengths while minimizing their individual weaknesses. In this paper, we will discuss the approach and implementation of a BLE- and Tango-based hybrid system. The results of pilot tests on the individual components and a human subject test on the full BLE and hybrid systems are also presented. In addition, we have explored the use of vibrotactile devices to provide additional information to a user about their surroundings.
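One simple way such a hybrid could weight its two position sources (an illustrative sketch under assumed error variances, not the system's actual fusion rule) is inverse-variance averaging of the BLE and Tango estimates, so the currently more reliable component dominates:

```python
def fuse_positions(ble_xy, tango_xy, ble_var, tango_var):
    """Inverse-variance weighted average of two 2-D position estimates."""
    # Weight each source by the inverse of its error variance:
    # the lower-variance (more trustworthy) source gets the larger weight.
    w_ble = (1.0 / ble_var) / (1.0 / ble_var + 1.0 / tango_var)
    return tuple(w_ble * b + (1.0 - w_ble) * t
                 for b, t in zip(ble_xy, tango_xy))
```

For example, when Tango tracking is confident (low variance) its estimate dominates, while in conditions where Tango drifts, the BLE beacons pull the fused position back toward their coarser but stable fix.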
  7. Large transportation hubs are difficult to navigate, especially for people with special needs such as those with visual impairment, Autism Spectrum Disorder (ASD), or simply those with navigation challenges. The primary objective of this research is to design and develop a novel cyber-physical infrastructure that can effectively and efficiently transform existing transportation hubs into smart facilities capable of providing better location-aware services. We investigated the integration of a number of Internet of Things (IoT) elements, including video analytics, Bluetooth beacons, mobile computing, and facility semantic models, to provide reliable indoor navigation services to people with special needs, yet requiring minimum infrastructure changes. Our pilot tests with people with special needs at a multi-floor building in New York City have demonstrated the effectiveness of our proposed framework.