Title: Navigation Training for Persons With Visual Disability Through Multisensory Assistive Technology: Mixed Methods Experimental Study
Background: Visual disability is a growing problem for many middle-aged and older adults. Conventional mobility aids, such as white canes and guide dogs, have notable limitations that have led to increasing interest in electronic travel aids (ETAs). Despite remarkable progress, current ETAs lack empirical evidence and realistic testing environments and often focus on the substitution or augmentation of a single sense.

Objective: This study aims to (1) establish a novel virtual reality (VR) environment to test the efficacy of ETAs in complex urban environments under a simulated visual impairment (VI) and (2) evaluate the impact of haptic and audio feedback, individually and combined, on navigation performance, movement behavior, and perception. Through this study, we aim to address these gaps and advance the pragmatic development of assistive technologies (ATs) for persons with VI.

Methods: The VR platform was designed to resemble a subway station environment featuring the most common challenges faced by persons with VI during navigation. This environment was used to test our multisensory, AT-integrated VR platform among 72 healthy participants performing an obstacle avoidance task while experiencing simulated symptoms of VI. Each participant performed the task 4 times: once with haptic feedback, once with audio feedback, once with both feedback types, and once without any feedback. Data analysis encompassed metrics such as completion time, head and body orientation, and trajectory length and smoothness. To evaluate the effectiveness and interaction of the 2 feedback modalities, we conducted a 2-way repeated measures ANOVA on continuous metrics and a Scheirer-Ray-Hare test on discrete ones. We also conducted a descriptive statistical analysis of participants' answers to a questionnaire assessing their experience and preference for feedback modalities.

Results: Haptic feedback significantly reduced collisions (P=.05) and the variability of the pitch angle of the head (P=.02). Audio feedback improved trajectory smoothness (P=.006) and mitigated the increase in trajectory length observed with haptic feedback alone (P=.04). Participants reported a high level of engagement during the experiment (52/72, 72%) and found it interesting (42/72, 58%). However, fewer than half of the participants (29/72, 40%) favored combined feedback modalities, indicating that a majority preferred a dedicated single modality over combined ones.

Conclusions: AT is crucial for individuals with VI; however, it often lacks user-centered design principles. Research should prioritize consumer-oriented methodologies, testing devices in a staged manner that progresses toward more realistic, ecologically valid settings to ensure safety. Our multisensory, AT-integrated VR system takes a holistic approach, offering a first step toward enhancing users' spatial awareness and promoting safer mobility, and it holds potential for applications in medical treatment, training, and rehabilitation. Technological advancements can further refine such devices, significantly improving independence and quality of life for those with VI.
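A minimal sketch, not the authors' code, of how two of the reported navigation metrics could be computed and how the 2-way repeated measures ANOVA named in the Methods could be run in Python with statsmodels. The column names, the smoothness proxy, and the placeholder data are assumptions; the Scheirer-Ray-Hare test applied to discrete metrics is the analogous rank-based two-way analysis and is not shown here.

    # Illustrative sketch: trajectory metrics plus a 2 x 2 repeated measures ANOVA
    # with haptic and audio feedback as within-subject factors.
    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    def path_length(xyz: np.ndarray) -> float:
        """Total length of a walked trajectory given as an (N, 3) array of positions."""
        return float(np.linalg.norm(np.diff(xyz, axis=0), axis=1).sum())

    def straightness(xyz: np.ndarray) -> float:
        """One possible smoothness proxy: straight-line distance divided by path length
        (1.0 = perfectly straight); the paper may define smoothness differently."""
        total = path_length(xyz)
        return float(np.linalg.norm(xyz[-1] - xyz[0])) / total if total > 0 else 1.0

    # Hypothetical long-format table: one row per participant x feedback condition.
    rng = np.random.default_rng(0)
    trials = pd.DataFrame({
        "subject": np.repeat(np.arange(72), 4),
        "haptic":  np.tile(["off", "off", "on", "on"], 72),
        "audio":   np.tile(["off", "on", "off", "on"], 72),
        "traj_length": rng.normal(25.0, 3.0, 72 * 4),  # placeholder data, not study results
    })

    # Main effects of each modality plus their interaction on a continuous metric.
    fit = AnovaRM(trials, depvar="traj_length", subject="subject",
                  within=["haptic", "audio"]).fit()
    print(fit.anova_table)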
Award ID(s): 2345139, 2236097
PAR ID: 10559615
Author(s) / Creator(s): ; ; ; ;
Publisher / Repository: JMIR Publications
Date Published:
Journal Name: JMIR Rehabilitation and Assistive Technologies
Volume: 11
ISSN: 2369-2529
Page Range / eLocation ID: e55776
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1.
    Technological advancements and increased access have prompted the adoption of head-mounted display based virtual reality (VR) for neuroscientific research, manual skill training, and neurological rehabilitation. Applications that focus on manual interaction within the virtual environment (VE), especially haptic-free VR, critically depend on virtual hand-object collision detection. Knowledge about how multisensory integration related to hand-object collisions affects perception-action dynamics and reach-to-grasp coordination is needed to enhance the immersiveness of interactive VR. Here, we explored whether and to what extent sensory substitution for haptic feedback of hand-object collision (visual, audio, or audiovisual) and collider size (size of spherical pointers representing the fingertips) influences reach-to-grasp kinematics. In Study 1, visual, auditory, or combined feedback were compared as sensory substitutes to indicate the successful grasp of a virtual object during reach-to-grasp actions. In Study 2, participants reached to grasp virtual objects using spherical colliders of different diameters to test if virtual collider size impacts reach-to-grasp. Our data indicate that collider size but not sensory feedback modality significantly affected the kinematics of grasping. Larger colliders led to a smaller size-normalized peak aperture. We discuss this finding in the context of a possible influence of spherical collider size on the perception of the virtual object's size and hence effects on motor planning of reach-to-grasp. Critically, reach-to-grasp spatiotemporal coordination patterns were robust to manipulations of sensory feedback modality and spherical collider size, suggesting that the nervous system adjusted the reach (transport) component commensurately to the changes in the grasp (aperture) component. These results have important implications for research, commercial, industrial, and clinical applications of VR.
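    A minimal sketch, assumed rather than taken from either study: a sphere-overlap test of the kind used for fingertip-collider/object contact, together with a size-normalized grip aperture. The shapes, radii, and normalization rule are illustrative.

        import numpy as np

        def spheres_overlap(p1, r1, p2, r2) -> bool:
            """True if two spheres (e.g., a fingertip collider and a spherical target) touch."""
            return np.linalg.norm(np.asarray(p1) - np.asarray(p2)) <= (r1 + r2)

        def grip_aperture(thumb_pos, index_pos) -> float:
            """Instantaneous aperture: distance between thumb and index fingertip colliders."""
            return float(np.linalg.norm(np.asarray(thumb_pos) - np.asarray(index_pos)))

        # Example: a larger collider registers contact at fingertip positions where a
        # smaller one does not, which is one way collider size can shift when a
        # virtual grasp is detected.
        obj_center, obj_radius = np.zeros(3), 0.03                 # 3 cm virtual sphere
        thumb, index = np.array([0.0, -0.05, 0.0]), np.array([0.0, 0.05, 0.0])
        for collider_radius in (0.005, 0.02):                      # small vs. large collider
            grasp = (spheres_overlap(thumb, collider_radius, obj_center, obj_radius) and
                     spheres_overlap(index, collider_radius, obj_center, obj_radius))
            norm_aperture = grip_aperture(thumb, index) / (2 * obj_radius)  # size-normalized
            print(f"collider r={collider_radius}: grasp={grasp}, aperture={norm_aperture:.2f}")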
  2. Nakayama, Luis Filipe (Ed.)
    Visual impairment represents a significant health and economic burden, affecting 596 million people globally. The incidence of visual impairment is expected to double by 2050 as our population ages. Independent navigation is challenging for persons with visual impairment, as they often rely on non-visual sensory signals to find the optimal route. In this context, electronic travel aids are promising solutions that can be used for obstacle detection and/or route guidance. However, electronic travel aids have limitations, such as low uptake and limited training, that restrict their widespread use. Here, we present a virtual reality platform for testing, refining, and training with electronic travel aids. We demonstrate its viability on an electronic travel aid developed in-house, consisting of a wearable haptic feedback device. We designed an experiment in which participants donned the electronic travel aid and performed a virtual task while experiencing a simulation of three different visual impairments: age-related macular degeneration, diabetic retinopathy, and glaucoma. Our experiments indicate that our electronic travel aid significantly improves completion time for all three visual impairments and reduces the number of collisions for diabetic retinopathy and glaucoma. Overall, the combination of virtual reality and electronic travel aids may play a beneficial role in the mobility rehabilitation of persons with visual impairment by allowing early-phase testing of electronic travel aid prototypes in safe, realistic, and controllable settings.
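    An illustrative sketch only; the paper does not describe its device's control law. This shows one plausible way an obstacle-detection ETA could map the distance to the nearest obstacle onto a vibration intensity, with the range thresholds and linear ramp as assumptions.

        def vibration_intensity(distance_m: float,
                                max_range_m: float = 2.0,
                                min_range_m: float = 0.3) -> float:
            """Motor drive level in [0, 1]: 0 beyond max range, 1 at or inside min range."""
            if distance_m >= max_range_m:
                return 0.0
            if distance_m <= min_range_m:
                return 1.0
            return (max_range_m - distance_m) / (max_range_m - min_range_m)

        # Nearer obstacles produce stronger haptic feedback.
        for d in (2.5, 1.5, 0.6, 0.2):
            print(f"{d:.1f} m -> intensity {vibration_intensity(d):.2f}")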
  3. Haptic devices are in general more adept at mimicking the bulk properties of materials than they are at mimicking the surface properties. Herein, a haptic glove is described which is capable of producing sensations reminiscent of three types of near‐surface properties: hardness, temperature, and roughness. To accomplish this mixed mode of stimulation, three types of haptic actuators are combined: vibrotactile motors, thermoelectric devices, and electrotactile electrodes made from a stretchable conductive polymer synthesized in the laboratory. This polymer consists of a stretchable polyanion which serves as a scaffold for the polymerization of poly(3,4‐ethylenedioxythiophene). The scaffold is synthesized using controlled radical polymerization to afford material of low dispersity, relatively high conductivity, and low impedance relative to metals. The glove is equipped with flex sensors to make it possible to control a robotic hand and a hand in virtual reality (VR). In psychophysical experiments, human participants are able to discern combinations of electrotactile, vibrotactile, and thermal stimulation in VR. Participants trained to associate these sensations with roughness, hardness, and temperature have an overall accuracy of 98%, whereas untrained participants have an accuracy of 85%. Sensations can similarly be conveyed using a robotic hand equipped with sensors for pressure and temperature. 
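    A conceptual sketch, assumed rather than taken from the paper, of how a virtual material's near-surface properties might be routed to the glove's three actuator types (electrotactile, vibrotactile, thermoelectric). The value ranges and safety clamp are illustrative.

        from dataclasses import dataclass

        @dataclass
        class VirtualMaterial:
            roughness: float        # 0 (smooth) .. 1 (rough)
            hardness: float         # 0 (soft)   .. 1 (hard)
            temperature_c: float    # apparent surface temperature

        def actuator_commands(m: VirtualMaterial) -> dict:
            """Translate material properties into per-channel drive levels."""
            return {
                "electrotactile_amplitude": m.roughness,            # roughness -> electrodes
                "vibrotactile_duty_cycle": m.hardness,              # hardness  -> motors
                "thermoelectric_setpoint_c": min(45.0, max(5.0, m.temperature_c)),  # clamped
            }

        print(actuator_commands(VirtualMaterial(roughness=0.8, hardness=0.3, temperature_c=12.0)))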
  4. Navigation assistive technologies have been designed to support individuals with visual impairments during independent mobility by providing sensory augmentation and contextual awareness of their surroundings. Such information is typically provided through predefined audio-haptic interaction paradigms. However, the individual capabilities, preferences, and behavior of people with visual impairments are heterogeneous and may change with experience, context, and necessity. Therefore, the circumstances and modalities for providing navigation assistance need to be personalized to different users, and over time for each user. We conduct a study with 13 blind participants to explore how the desirability of messages provided during assisted navigation varies based on users' navigation preferences and expertise. The participants are guided through two different routes, one without prior knowledge and one previously studied and traversed. The guidance is provided through turn-by-turn instructions, enriched with contextual information about the environment. During navigation and follow-up interviews, we uncover that participants have diversified needs for navigation instructions based on their abilities and preferences. Our study motivates the design of future navigation systems capable of verbosity-level personalization in order to keep users engaged in the current situational context while minimizing distractions.
  5. ROV operations are mainly performed via a traditional control kiosk with limited data feedback methods, such as joysticks and camera-view displays on a surface vessel. This traditional setup requires significant personnel on board (POB) time and imposes high requirements for personnel training. This paper proposes a virtual reality (VR) based haptic-visual ROV teleoperation system that can substantially simplify ROV teleoperation and enhance the remote operator's situational awareness. This study leverages recent developments in Mixed Reality (MR) technologies, sensory augmentation, sensing technologies, and closed-loop control to visualize and render complex underwater environmental data in an intuitive and immersive way. The raw sensor data will be processed with physics engine systems and rendered as a high-fidelity digital twin model in game engines. Certain features will be visualized and displayed via the VR headset, whereas others will be manifested as haptic and tactile cues via our haptic feedback systems. We applied a simulation approach to test the developed system. With our developed system, a high-fidelity subsea environment is reconstructed based on the sensor data collected from an ROV, including bathymetric, hydrodynamic, visual, and vehicle navigational measurements. Specifically, the vehicle is equipped with a navigation sensor system for real-time state estimation, an acoustic Doppler current profiler for far-field flow measurement, and a bio-inspired artificial lateral-line hydrodynamic sensor system for near-field small-scale hydrodynamics. Optimized game engine rendering algorithms then visualize key environmental features as augmented user interface elements in a VR headset, such as color-coded vectors, to indicate the environmental impact on the performance and function of the ROV. In addition, augmented environmental feedback, such as hydrodynamic forces, is translated into patterned haptic stimuli via a haptic suit to indicate drift-inducing flows in the near field. A pilot case study was performed to verify the feasibility and effectiveness of the system design in a series of simulated ROV operation tasks. ROVs are widely used in subsea exploration and intervention tasks, playing a critical role in offshore inspection, installation, and maintenance activities. The innovative ROV teleoperation feedback and control system will lower the barrier for ROV pilot jobs.
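    A hedged sketch of the kind of mapping described: a near-field flow vector rendered both as a color-coded VR overlay and as a patterned haptic-suit cue. The color scale, side selection, and pulse rule are assumptions, not the paper's implementation.

        import math

        def flow_to_color(vx: float, vy: float, vz: float) -> str:
            """Map flow speed (m/s) to a simple green/yellow/red overlay color."""
            speed = math.sqrt(vx * vx + vy * vy + vz * vz)
            if speed < 0.25:
                return "green"
            return "yellow" if speed < 0.75 else "red"

        def flow_to_haptic_pattern(vx: float, vy: float, vz: float) -> dict:
            """Map flow direction and strength to a patterned haptic-suit stimulus."""
            speed = math.sqrt(vx * vx + vy * vy + vz * vz)
            return {
                "side": "port" if vy > 0 else "starboard",    # which side of the suit pulses
                "pulse_hz": min(10.0, 2.0 + 8.0 * speed),     # faster pulses for stronger flow
            }

        v = (0.4, -0.2, 0.0)   # example current vector measured near the vehicle
        print(flow_to_color(*v), flow_to_haptic_pattern(*v))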