This content will become publicly available on June 26, 2026

Title: An Overview of Smart Wheelchair Technologies
Many individuals who need mobility assistance do not have access to a wheelchair suited to their type of mobility disability. There is growing research toward creating smart wheelchairs using a variety of methods, such as biopotential signals or eye tracking for input, and LiDAR, ultrasonic sensors, or cameras for mapping and position tracking. Other input methods, such as voice control, sip-and-puff, and hand gestures, have also been explored, but each has disadvantages that can limit its usefulness. Smart wheelchairs should not only avoid collisions but also emphasize the safety and comfort of the user. In this paper, we review and classify state-of-the-art research in smart wheelchairs. Machine learning models are used for various parts of these systems, from mapping and signal processing to input classification. Smart wheelchairs rely on a variety of hardware devices, such as eye trackers, electrode caps, EMG armbands, RPLidar, RGB cameras, and ultrasonic sensors. Some hybrid systems combine multiple methods to compensate for their individual limitations, and some research has explored training games to help teach users to operate the wheelchair. Future work should include improving classification methods for the various input signals and making the technology more accessible.
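As a rough illustration of the input-classification step surveyed above, the sketch below maps the RMS amplitude of a single EMG window to a motion command. The thresholds and command names here are hypothetical, chosen only for illustration; the systems reviewed train multi-channel machine learning classifiers rather than applying a one-channel amplitude threshold.

```python
def classify_emg_window(samples, rest_rms=0.05, strong_rms=0.4):
    """Map the RMS amplitude of one EMG window to a wheelchair command.

    `rest_rms` and `strong_rms` are illustrative thresholds, not values
    taken from any system reviewed in the paper.
    """
    rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
    if rms < rest_rms:
        return "stop"          # muscle at rest: no motion command
    return "fast" if rms > strong_rms else "slow"
```

For example, a near-zero window yields "stop", while a strong contraction yields "fast"; a real controller would add debouncing and per-user calibration on top of the classifier.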
Award ID(s):
2150484
PAR ID:
10617807
Author(s) / Creator(s):
Editor(s):
Arabnia, Hamid; Deligiannidis, Leonidas; Tinetti, Fernando; Tran, Quoc-Nam
Publisher / Repository:
Springer Nature
Date Published:
ISSN:
1865-0937
ISBN:
1-60132-520-7
Subject(s) / Keyword(s):
Smart wheelchair, assistive technologies, object detection, autonomous navigation, EEG
Format(s):
Medium: X
Location:
USA
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Many elderly individuals have physical restrictions that require the use of a walker to maintain stability while walking. Many of these individuals also have age-related visual impairments that make it difficult to avoid obstacles in unfamiliar environments. To help such users navigate their environment faster, more safely, and more easily, we propose a smart walker augmented with a collection of ultrasonic sensors as well as a camera. The data collected by the two kinds of sensors are processed using echo-location-based obstacle detection algorithms and deep-neural-network-based object detection algorithms, respectively. The system alerts the user to obstacles and guides her on a safe path through audio and haptic signals.
  2. Ronchieri, Elisabetta; Carbone, John; Then, Patrick; Juric, Radmila (Ed.)
    The concept of smart glasses has evolved since the introduction of early prototypes in the 1990s, only recently gaining wide acceptance. The global smart glasses market has experienced significant growth and is predicted to expand further. Smart glasses can be tailored to many types of consumer classes and industries, with several benefits. Like other IoT devices, they may be embedded with sensors to gather data that is then shared with other devices, and machine learning algorithms can be used for processing, sensor fusion, or classification and decision-making. Smart glasses can also enhance productivity for medical practitioners, allowing them to view patient records or assisting them during surgical procedures. New applications of such eyewear can also serve as assistive technology, enhancing the quality of life for people with disabilities. This paper reviews recent research on the applications of smart glasses, addresses critical research challenges, and surveys the status of current smart glasses on the market. It also highlights key research issues that should be addressed in the short and long term to bring these powerful tools into mainstream usage.
  3.
    Autonomous vehicles (AVs), equipped with numerous sensors such as cameras, LiDAR, radar, and ultrasonic sensors, are revolutionizing the transportation industry. These sensors are expected to sense reliable information from the physical environment, facilitating the critical decision-making process of the AVs. Ultrasonic sensors, which detect obstacles at short range, play an important role in assisted parking and blind spot detection. However, due to their weak security level, ultrasonic sensors are particularly vulnerable to signal injection attacks, in which attackers inject malicious acoustic signals to create fake obstacles and intentionally mislead the vehicle into making wrong decisions with potentially disastrous consequences. In this paper, we systematically analyze the model of signal injection attacks against moving vehicles. Considering these threats, we propose SoundFence, a physical-layer defense system which leverages the sensors' signal processing capability without requiring any additional equipment. SoundFence verifies benign measurement results and detects signal injection attacks by analyzing sensor readings and the physical-layer signatures of ultrasonic signals. Our experiments with commercial sensors show that SoundFence detects most (more than 95%) abnormal sensor readings with very few false alarms, and it can also accurately distinguish real echoes from injected signals to identify injection attacks.
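One simple sanity check in this spirit is to flag distance readings that change faster than any real obstacle could approach. This is only a crude physical-plausibility filter sketched for illustration; SoundFence itself relies on physical-layer signatures of the ultrasonic echoes, not this heuristic, and the sample period and speed bound below are assumptions.

```python
def flag_implausible_readings(readings_m, dt_s=0.1, max_speed_m_s=15.0):
    """Flag readings whose jump between consecutive samples exceeds
    what relative motion at `max_speed_m_s` could produce in `dt_s`.

    A sudden, physically impossible jump in measured distance is one
    symptom of an injected (fake) echo.
    """
    flags = [False]  # the first sample has no predecessor to compare against
    for prev, cur in zip(readings_m, readings_m[1:]):
        flags.append(abs(cur - prev) > max_speed_m_s * dt_s)
    return flags
```

With the defaults, any jump larger than 1.5 m between 100 ms samples is flagged; a deployed defense would combine such checks with echo-signature analysis as the paper describes.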
  4. We describe the design and performance of a high-fidelity wearable head-, body-, and eye-tracking system that offers significant improvement over previous such devices. This device’s sensors include a binocular eye tracker, an RGB-D scene camera, a high-frame-rate scene camera, and two visual odometry sensors, for a total of ten cameras, which we synchronize and record from with a data rate of over 700 MB/s. The sensors are operated by a mini-PC optimized for fast data collection, and powered by a small battery pack. The device records a subject’s eye, head, and body positions, simultaneously with RGB and depth data from the subject’s visual environment, measured with high spatial and temporal resolution. The headset weighs only 1.4 kg, and the backpack with batteries 3.9 kg. The device can be comfortably worn by the subject, allowing a high degree of mobility. Together, this system overcomes many limitations of previous such systems, allowing high-fidelity characterization of the dynamics of natural vision. 
  5. Abstract Rabbits have been widely used for studying ocular physiology and pathology due to their relatively large eye size and similar structures with human eyes. Various rabbit ocular disease models, such as dry eye, age-related macular degeneration, and glaucoma, have been established. Despite the growing application of proteomics in vision research using rabbit ocular models, there is no spectral assay library for rabbit eye proteome publicly available. Here, we generated spectral assay libraries for rabbit eye compartments, including conjunctiva, cornea, iris, retina, sclera, vitreous humor, and tears using fractionated samples and ion mobility separation enabling deep proteome coverage. The rabbit eye spectral assay library includes 9,830 protein groups and 113,593 peptides. We present the data as a freely available community resource for proteomic studies in the vision field. Instrument data and spectral libraries are available via ProteomeXchange with identifier PXD031194. 