For people who have experienced a spinal cord injury or an amputation, recovery of sensation and motor control often remains incomplete despite noteworthy advances in invasive neural interfaces. Our objective is to explore the feasibility of a novel biohybrid robotic hand model as a pre-clinical research platform for investigating tactile sensation and sensorimotor integration. The biohybrid model couples an artificial hand with biological neural networks (BNN) cultured on a multichannel microelectrode array (MEA). We decoded neural activity to control a finger of the artificial hand, which was outfitted with a tactile sensor. Fingertip sensations were encoded as rapidly adapting (RA) or slowly adapting (SA) mechanoreceptor firing patterns, which were used to electrically stimulate the BNN. We classified the coherence between afferent and efferent electrodes in the MEA with a convolutional neural network (CNN) using a transfer learning approach. The BNN exhibited a capacity for functional specialization to the RA and SA patterns, reflected in significantly different robotic behavior of the biohybrid hand with respect to the tactile encoding method. Furthermore, the CNN distinguished between RA and SA encoding methods with 97.84% ± 0.65% accuracy when the BNN was provided tactile feedback, averaged across three days in vitro (DIV). This novel biohybrid research platform demonstrates that BNNs are sensitive to the tactile encoding method and can integrate robotic tactile sensations with the motor control of an artificial hand. This opens the possibility of using biohybrid research platforms in the future to study aspects of neural interfaces with minimal human risk.
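The abstract does not give the encoding equations, but the RA/SA distinction it relies on is standard: slowly adapting mechanoreceptors fire throughout a sustained press in proportion to pressure, while rapidly adapting ones fire mainly at the onset and offset, where pressure changes. A minimal sketch of the two encoders, with hypothetical function names and rate gains chosen only for illustration:

```python
import numpy as np

def encode_sa(pressure, rate_gain=100.0, dt=0.001, rng=None):
    """Slowly adapting (SA) model: firing probability tracks sustained pressure."""
    rng = rng or np.random.default_rng(0)
    rate = rate_gain * pressure                      # spikes/s, proportional to pressure
    return rng.random(pressure.size) < np.clip(rate * dt, 0, 1)  # Poisson-like spike train

def encode_ra(pressure, rate_gain=500.0, dt=0.001, rng=None):
    """Rapidly adapting (RA) model: firing probability tracks |d(pressure)/dt|."""
    rng = rng or np.random.default_rng(0)
    rate = rate_gain * np.abs(np.gradient(pressure, dt))  # spikes/s, driven by change
    return rng.random(pressure.size) < np.clip(rate * dt, 0, 1)

# A 1 s press-hold-release stimulus sampled at 1 kHz:
t = np.arange(0.0, 1.0, 0.001)
pressure = np.clip(np.minimum(t, 1.0 - t) * 10.0, 0.0, 1.0)  # ramp up, hold, ramp down
```

Applied to the same stimulus, `encode_sa` yields spikes throughout the hold phase, while `encode_ra` yields spikes only during the pressure ramps, giving the BNN two clearly distinguishable stimulation patterns for the same contact event.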
Tactile Sensing at Cryogenic Temperatures Using MichTac Sensors Based on GaN Nanopillar LEDs
Experiments established the feasibility of a nanopillar-LED-based tactile sensor, demonstrating tactile perception at extremely cold temperatures.
- Award ID(s): 2317047
- PAR ID: 10561422
- Publisher / Repository: Optica Publishing Group
- Date Published:
- ISBN: 978-1-957171-39-5
- Page Range / eLocation ID: SF1A.3
- Format(s): Medium: X
- Location: Charlotte, North Carolina
- Sponsoring Org: National Science Foundation
More Like this
- The connection between visual input and tactile sensing is critical for object manipulation tasks such as grasping and pushing. In this work, we introduce the challenging task of estimating a set of tactile physical properties from visual information. We aim to build a model that learns the complex mapping between visual information and tactile physical properties. We construct a first-of-its-kind image-tactile dataset with over 400 multiview image sequences and the corresponding tactile properties. A total of fifteen tactile physical properties across categories including friction, compliance, adhesion, texture, and thermal conductance are measured and then estimated by our models. We develop a cross-modal framework comprised of an adversarial objective and a novel visuo-tactile joint classification loss. Additionally, we introduce a neural architecture search framework capable of selecting optimal combinations of viewing angles for estimating a given physical property.
- Tactile graphics are a common way to present information to people with vision impairments. Tactile graphics can be used to explore a broad range of static visual content but aren't well suited to representing animation or interactivity. We introduce a new approach to creating dynamic tactile graphics that combines a touch screen tablet, static tactile overlays, and small mobile robots. We introduce a prototype system called RoboGraphics and several proof-of-concept applications. We evaluated our prototype with seven participants with varying levels of vision, comparing the RoboGraphics approach to a flat screen, audio-tactile interface. Our results show that dynamic tactile graphics can help visually impaired participants explore data quickly and accurately.
- A tactile imaging sensor determines a tumor's mechanical properties, such as size, depth, and Young's modulus, based on the principle of total internal reflection of light. To improve the classification accuracy of the tactile imaging sensor, we introduce ultrasound signals and estimate the resulting differences in the tactile images of the tumor. We developed a vibro-acoustic tactile imaging sensor to classify benign and malignant tumors and tested the system on breast tumor phantoms. The vibro-acoustic tactile images are analyzed to improve the overall performance of tumor detection.
- UniT is an approach to tactile representation learning that uses a VQGAN to learn a compact latent space serving as the tactile representation. It trains on tactile images obtained from a single simple object, yet the learned representation generalizes and can be zero-shot transferred to various downstream tasks, including perception tasks and manipulation policy learning. Our benchmarks on in-hand 3D pose and 6D pose estimation tasks and a tactile classification task show that UniT outperforms existing visual and tactile representation learning methods. Additionally, UniT's effectiveness in policy learning is demonstrated across three real-world tasks involving diverse manipulated objects and complex robot-object-environment interactions. Through extensive experimentation, UniT is shown to be a simple-to-train, plug-and-play, yet widely effective method for tactile representation learning. For more details, please refer to our open-source repository https://github.com/ZhengtongXu/UniT and the project website https://zhengtongxu.github.io/unit-website/.