We present a multimodal physics simulation, including visual and auditory (description, sound effects, and sonification) modalities to support the diverse needs of learners. We describe design challenges and solutions, and findings from final simulation evaluations with learners with and without visual impairments. We also share insights from conducting research with members of diverse learner groups (N = 52). This work presents approaches for designing and evaluating accessible interactive simulations for learners with diverse needs.
Auditory Display in Interactive Science Simulations: Description and Sonification Support Interaction and Enhance Opportunities for Learning
Science simulations are widely used in classrooms to support inquiry-based learning of complex science concepts. These tools typically rely on interactive visual displays to convey relationships. Auditory displays, including verbal description and sonification (non-speech audio), combined with alternative input capabilities, may provide an enhanced experience for learners, particularly learners with visual impairments. We conducted semi-structured interviews and usability testing of two audio-enhanced simulations with eight adult learners with visual impairments. We analyzed trends and edge cases in participants' interaction patterns, interpretations, and preferences. Findings include common interaction patterns across simulation use, increased efficiency with second use, and the complementary roles that description and sonification play in supporting learning opportunities. We discuss how these control and display layers work together to encourage exploration and engagement with science simulations. We conclude with general and specific design takeaways to support the implementation of auditory displays for accessible simulations.
- Award ID(s): 1621363
- PAR ID: 10216337
- Date Published:
- Journal Name: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
- Page Range / eLocation ID: 1 - 12
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Effective human-robot interaction is increasingly vital across various domains, including assistive robotics, emotional communication, entertainment, and industrial automation. Visual feedback, a common feature of current interfaces, may not be suitable for all environments. Audio feedback serves as a critical supplementary communication layer in settings where visibility is low or where robotic operations generate extensive data. Sonification, which transforms a robot's trajectory, motion, and environmental signals into sound, enhances users' comprehension of robot behavior. This improvement in understanding fosters more effective, safe, and reliable Human-Robot Interaction (HRI). The benefits of auditory data sonification are evident in real-world applications such as industrial assembly, robot-assisted rehabilitation, and interactive robotic exhibitions, where it promotes cooperation, boosts performance, and heightens engagement. Beyond conventional HRI environments, auditory data sonification shows substantial potential in managing complex robotic systems and intricate structures, such as hyper-redundant robots and robotic teams. These systems often challenge operators with complex joint monitoring, mathematical kinematic modeling, and visual behavior verification. This dissertation explores the sonification of motion in hyper-redundant robots and teams of industrial robots. It applies the Wave Space Sonification (WSS) framework developed by Hermann to motion datasets of protein molecules modeled as hyper-redundant mechanisms with numerous rigid nano-linkages, developing a sonification methodology for the dihedral-angle folding trajectories of protein molecules. Furthermore, it introduces a novel approach for the systematic sonification of robotic motion across varying configurations. By employing localized wave fields oriented within the robots' configuration space, this methodology generates auditory outputs with specific timbral qualities as robots move through predefined configurations or along certain trajectories (an illustrative sketch of this wave-field idea appears after this list). Additionally, the dissertation examines a team of wheeled industrial/service robots whose motion patterns are sonified using sinusoidal vibratory sounds, demonstrating the practical applications and benefits of this approach.
- Auditory description display is verbalized text typically used to describe live, recorded, or graphical displays to support access for people who are blind or visually impaired. Significant prior research has resulted in guidelines for auditory description for non-interactive or minimally interactive contexts. A lack of auditory description for complex interactive environments remains a tremendous barrier to access for people with visual impairments. In this work, we present a systematic design framework for designing auditory description within complex interactive environments. We illustrate how modular descriptions aligned with this framework can result in an interactive storytelling experience constructed through user interactions. This framework has been used in a set of published and widely used interactive science simulations, and in its generalized form could be applied to a variety of contexts.
- The "Accessible Oceans" pilot project aims to inclusively design auditory displays that support perception and understanding of ocean data in informal learning environments (ILEs). The project's multi-disciplinary team includes expertise from all related fields: ocean scientists, dataset experts, a sound designer specializing in data sonification, and a learning sciences researcher. In addition, the PI is blind and provides a crucial perspective in our research. We describe the sound design of informative sonifications and respective auditory displays based on iterative design with user input at each stage, including from blind and low-vision (BLV) students, their teachers, and subject-matter experts. We discuss the importance of framing data sonifications through an auditory presentation of contextual information. We also report on our latest auditory display evaluation using Auditory Interface UX Scale (BUZZ) surveys at three ILE test sites. These responses further affirm our auditory display design developments. We include access to the auditory display media and lessons learned over the course of this multi-year NSF-funded Advancing Informal STEM Learning (AISL) grant: https://accessibleoceans.whoi.edu/
- Sonification of time series data in natural science has gained increasing attention as an observational and educational tool. Sound is a direct representation for oscillatory data, but for most phenomena, less direct representational methods are necessary. Coupled with animated visual representations of the same data, the visual and auditory systems can work together to identify complex patterns quickly. We developed a multivariate data sonification and visualization approach to explore and convey patterns in a complex dynamic system, Lone Star Geyser in Yellowstone National Park. This geyser has erupted regularly for at least 100 years, with remarkable consistency in the interval between eruptions (three hours) but with significant variations in smaller scale patterns between each eruptive cycle. From a scientific standpoint, the ability to hear structures evolving over time in multiparameter data permits the rapid identification of relationships that might otherwise be overlooked or require significant processing to find. The human auditory system is adept at physical interpretation of call-and-response or causality in polyphonic sounds. Methods developed here for oscillatory and nonstationary data have great potential as scientific observational and educational tools, for data-driven composition with scientific and artistic intent, and towards the development of machine learning tools for pattern identification in complex data. (A minimal parameter-mapping sketch in this spirit appears after this list.)
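None of the records above include code, but the wave-field idea in the robotics dissertation can be illustrated compactly. The sketch below is our own hedged illustration of the general concept, not the dissertation's or Hermann's implementation: the function names, the Gaussian-windowed field, and the example trajectory are all assumptions chosen for clarity. The idea it demonstrates is that a localized oscillatory field is defined over configuration space, and audio is produced by sampling that field along a trajectory at audio rate.

```python
import numpy as np

def wave_field(q, center, sigma, wave_vector):
    # Localized wave field (illustrative): a Gaussian window around `center`
    # multiplied by a spatial oscillation along `wave_vector`.
    envelope = np.exp(-np.sum((q - center) ** 2) / (2.0 * sigma ** 2))
    carrier = np.sin(2.0 * np.pi * np.dot(wave_vector, q))
    return envelope * carrier

def sonify_trajectory(trajectory, center, sigma, wave_vector,
                      duration_s=2.0, sample_rate=44100):
    # Sample the wave field along a configuration-space trajectory,
    # time-stretched to `duration_s` seconds of audio.
    n_samples = int(duration_s * sample_rate)
    t_traj = np.linspace(0.0, 1.0, trajectory.shape[0])
    t_audio = np.linspace(0.0, 1.0, n_samples)
    # Interpolate each configuration dimension up to audio rate.
    q_audio = np.stack(
        [np.interp(t_audio, t_traj, trajectory[:, d])
         for d in range(trajectory.shape[1])], axis=1)
    signal = np.array([wave_field(q, center, sigma, wave_vector)
                       for q in q_audio])
    peak = np.max(np.abs(signal))
    return signal / peak if peak > 0 else signal  # normalized to [-1, 1]

# Example: a hypothetical two-joint mechanism sweeping through its configuration space.
theta = np.linspace(0.0, np.pi, 500)
trajectory = np.stack([theta, np.sin(theta)], axis=1)
audio = sonify_trajectory(trajectory, center=np.array([1.5, 0.8]),
                          sigma=0.4, wave_vector=np.array([80.0, 40.0]))
```

Moving the field's center, width, or wave vector changes where in configuration space the sound "lights up" and what timbre it takes on, which is the intuition behind localizing wave fields within a robot's configuration space.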
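The geyser record describes multivariate sonification of time series data. A common baseline for such work is parameter mapping, in which data values drive pitch. The sketch below is our own illustrative example with synthetic data; the function name and the "eruption cycle" shape are assumptions, not the authors' method or dataset.

```python
import numpy as np

def pitch_map_sonification(values, duration_s=5.0, sample_rate=44100,
                           f_low=220.0, f_high=880.0):
    # Map a 1-D series onto pitch: low data values -> low tones, high -> high.
    values = np.asarray(values, dtype=float)
    span = values.max() - values.min()
    norm = (values - values.min()) / span if span > 0 else np.zeros_like(values)
    n_samples = int(duration_s * sample_rate)
    # Resample the per-point frequencies up to audio rate.
    freq = np.interp(np.linspace(0.0, 1.0, n_samples),
                     np.linspace(0.0, 1.0, len(values)),
                     f_low + norm * (f_high - f_low))
    # Integrate instantaneous frequency to obtain a smooth phase.
    phase = 2.0 * np.pi * np.cumsum(freq) / sample_rate
    return 0.8 * np.sin(phase)

# Synthetic "eruption cycle": a slow build-up followed by a sharp release,
# repeated three times so the regular interval is audible as a pitch pattern.
cycle = np.concatenate([np.linspace(0.0, 1.0, 180) ** 2,
                        np.linspace(1.0, 0.0, 20)])
audio = pitch_map_sonification(np.tile(cycle, 3))
```

The returned array can be written to a WAV file or played back directly; richer designs, like those described in the records above, layer multiple mapped variables and contextual cues rather than a single pitch stream.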