People with blindness have limited access to the high-resolution graphical data and imagery of science. Here, a lithophane codex is reported. Its pages display tactile and optical readouts for universal visualization of data by persons with or without eyesight. Prototype codices illustrated microscopy of butterfly chitin—from N-acetylglucosamine monomer to fibril, scale, and whole insect—and were given to high schoolers from the Texas School for the Blind and Visually Impaired. Lithophane graphics of Fischer-Spier esterification reactions and electron micrographs of biological cells were also 3D-printed, along with x-ray structures of proteins (as millimeter-scale 3D models). Students with blindness could visualize (describe, recall, distinguish) these systems—for the first time—at the same resolution as sighted peers (average accuracy = 88%). Tactile visualization occurred alongside laboratory training, synthesis, and mentoring by chemists with blindness, resulting in increased student interest and sense of belonging in science.
Visualizing 3D imagery by mouth using candy-like models
Handheld models help students visualize three-dimensional (3D) objects, especially students with blindness, who use large 3D models to visualize imagery by hand. The mouth has finer tactile sensors than the hands, which could improve visualization using miniature models that are portable, inexpensive, and disposable; yet the mouth remains unused in tactile learning. Here, we created bite-size 3D models of protein molecules from "gummy bear" gelatin or nontoxic resin. Models were made as small as a grain of rice and could be coded with flavor and packaged like candy. Mouth, hands, and eyesight were tested at identifying specific structures. Students recognized structures by mouth at 85.59% accuracy, similar to recognition by eyesight using computer animation. Recall accuracy of structures was higher by mouth than by hand for 40.91% of students, equal for 31.82%, and lower for 27.27%. The convenient use of entire packs of tiny, cheap, portable models can make 3D imagery more accessible to students.
- Award ID(s): 1856449
- PAR ID: 10287707
- Date Published:
- Journal Name: Science Advances
- Volume: 7
- Issue: 22
- ISSN: 2375-2548
- Page Range / eLocation ID: eabh0691
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Teachers of the visually impaired (TVIs) regularly present tactile materials (tactile graphics, 3D models, and real objects) to students with vision impairments. Researchers have been increasingly interested in designing tools to support the use of tactile materials, but we still lack an in-depth understanding of how tactile materials are created and used in practice today. To address this gap, we conducted interviews with 21 TVIs and a 3-week diary study with eight of them. We found that tactile materials were regularly used for academic as well as non-academic concepts like tactile literacy, motor ability, and spatial awareness. Real objects and 3D models served as "stepping stones" to tactile graphics, and our participants preferred to teach with 3D models despite finding them difficult to create, obtain, and modify. Use of certain materials also carried social implications; participants selected materials that fostered student independence and allowed classroom inclusion. We contribute design considerations, encouraging future work on tactile materials to enable student and TVI co-creation, facilitate rapid prototyping, and promote movement and spatial awareness. To support future research in this area, our paper provides a fundamental understanding of current practices. We bridge these practices to established pedagogical approaches and highlight opportunities for growth regarding this important genre of educational materials.
Multifunctional flexible tactile sensors could be useful to improve the control of prosthetic hands. To that end, highly stretchable liquid metal tactile sensors (LMS) were designed, manufactured via photolithography, and incorporated into the fingertips of a prosthetic hand. Three novel contributions were made with the LMS. First, individual fingertips were used to distinguish between different speeds of sliding contact with different surfaces. Second, differences in surface textures were reliably detected during sliding contact. Third, the capacity for hierarchical tactile sensor integration was demonstrated by using four LMS signals simultaneously to distinguish between ten complex multi-textured surfaces. Four different machine learning algorithms were compared for their classification capabilities: K-nearest neighbor (KNN), support vector machine (SVM), random forest (RF), and neural network (NN). The time-frequency features of the LMSs were extracted to train and test the machine learning algorithms. The NN generally performed best at speed and texture detection with a single finger and had 99.2 ± 0.8% accuracy in distinguishing between ten different multi-textured surfaces using four LMSs from four fingers simultaneously. This capability for hierarchical multi-finger tactile sensation integration could be useful to provide a higher level of intelligence for artificial hands.
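The four-classifier comparison described above can be sketched in a few lines of scikit-learn. This is an illustrative stand-in only: the synthetic Gaussian features, feature dimensions, and hyperparameters below are assumptions, not the authors' time-frequency pipeline or data.

```python
# Hypothetical sketch: comparing the four classifiers named in the abstract
# (KNN, SVM, RF, NN) on synthetic stand-ins for time-frequency features
# from four sensor channels. Data and parameters are illustrative only.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_per_class, n_classes, n_features = 40, 10, 32  # ten "multi-textured surfaces"
X = np.vstack([rng.normal(loc=c, scale=2.0, size=(n_per_class, n_features))
               for c in range(n_classes)])      # class-dependent feature means
y = np.repeat(np.arange(n_classes), n_per_class)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "NN": MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0),
}
scores = {name: accuracy_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

With real sensor data, the features would instead come from a time-frequency transform (e.g., a spectrogram) of each LMS signal, concatenated across the four fingers.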
For people who have experienced a spinal cord injury or an amputation, the recovery of sensation and motor control could be incomplete despite noteworthy advances with invasive neural interfaces. Our objective is to explore the feasibility of a novel biohybrid robotic hand model to investigate aspects of tactile sensation and sensorimotor integration with a pre-clinical research platform. Our new biohybrid model couples an artificial hand with biological neural networks (BNN) cultured in a multichannel microelectrode array (MEA). We decoded neural activity to control a finger of the artificial hand that was outfitted with a tactile sensor. The fingertip sensations were encoded into rapidly adapting (RA) or slowly adapting (SA) mechanoreceptor firing patterns that were used to electrically stimulate the BNN. We classified the coherence between afferent and efferent electrodes in the MEA with a convolutional neural network (CNN) using a transfer learning approach. The BNN exhibited the capacity for functional specialization with the RA and SA patterns, represented by significantly different robotic behavior of the biohybrid hand with respect to the tactile encoding method. Furthermore, the CNN was able to distinguish between RA and SA encoding methods with 97.84% ± 0.65% accuracy when the BNN was provided tactile feedback, averaged across three days in vitro (DIV). This novel biohybrid research platform demonstrates that BNNs are sensitive to tactile encoding methods and can integrate robotic tactile sensations with the motor control of an artificial hand. This opens the possibility of using biohybrid research platforms in the future to study aspects of neural interfaces with minimal human risk.
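The RA/SA distinction above has a simple intuition: rapidly adapting mechanoreceptors fire when pressure changes (contact onset and offset), while slowly adapting ones fire throughout sustained contact. A minimal sketch of that idea, assuming made-up gains, thresholds, and a Poisson-like spike-sampling scheme (none of which are taken from the paper's encoder):

```python
# Illustrative RA/SA encoding sketch, not the authors' method: RA rate is
# driven by the rate of change of pressure, SA rate by its magnitude.
# Gains (Hz) and the trapezoidal touch profile are assumed values.
import numpy as np

def encode(pressure, dt=0.001, ra_gain=50.0, sa_gain=20.0):
    """Return boolean RA and SA spike trains for a pressure trace."""
    rng = np.random.default_rng(1)
    d_pressure = np.abs(np.gradient(pressure, dt))              # |dP/dt|
    ra_rate = ra_gain * d_pressure / (d_pressure.max() + 1e-9)  # change-driven
    sa_rate = sa_gain * pressure / (pressure.max() + 1e-9)      # magnitude-driven
    ra_spikes = rng.random(pressure.size) < ra_rate * dt        # Poisson-like
    sa_spikes = rng.random(pressure.size) < sa_rate * dt
    return ra_spikes, sa_spikes

# A one-second touch: ramp up (0-0.1 s), hold, ramp down (0.9-1.0 s).
t = np.arange(0.0, 1.0, 0.001)
p = np.clip(np.minimum(t, 1.0 - t) * 10.0, 0.0, 1.0)
ra, sa = encode(p)
print("RA spikes:", int(ra.sum()), "SA spikes:", int(sa.sum()))
```

Running this, RA spikes cluster at the onset and offset ramps (where pressure changes) while SA spikes are spread across the whole hold period, which is the behavioral difference the BNN was stimulated with.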
Astronomy combines the richness and complexity of science and mathematics and captivates the public imagination. While much of astronomy is presented as imagery, many individuals can benefit from alternative methods of presentation such as tactile resources. Within the suite of methods and technologies we employ, we explored the utility of 3D printing to translate astronomical research data and models into tactile, textured forms. As groundwork for our program, the STEM Career Exploration Lab (STEM-CEL), we extensively tested the 3D designs, developed unique templates for 3D prints, and subsequently incorporated these materials into publicly accessible programs and, more formally, into summer camps for students including those with blindness and visual impairment (B/VI) and their educators. This paper traces the important steps of public testing to ensure our 3D prints are robust, understandable, and represent the scientific research data and models with integrity. Our initial testbed program also included a STEM camp project where we assessed students' and educators' interactions with the materials. We determined that the 3D prints do stimulate interest in science as well as in 3D printing technology. The successful pilot testing outcome was integrated into our strategy for our more ambitious program, the STEM-CEL. In this paper, we also briefly discuss the results of the initial testing as well as some specific results from the STEM-CEL regarding our 3D prints for star clusters and galaxies. We used pre- and post-intervention surveys, astronomy assessments, and student and educator interviews, resulting in what is likely the largest research study on astronomy education for students with B/VI. We found that the experience of holding a planet, the Sun, a star cluster, or a model of a galaxy resonates well with even the most casual interest in astronomy.

