Title: Sonification and Animation of Multivariate Data to Illuminate Dynamics of Geyser Eruptions
Abstract: Sonification of time series data in natural science has gained increasing attention as an observational and educational tool. Sound is a direct representation for oscillatory data, but for most phenomena, less direct representational methods are necessary. Coupled with animated visual representations of the same data, the visual and auditory systems can work together to identify complex patterns quickly. We developed a multivariate data sonification and visualization approach to explore and convey patterns in a complex dynamic system, Lone Star Geyser in Yellowstone National Park. This geyser has erupted regularly for at least 100 years, with remarkable consistency in the interval between eruptions (three hours) but with significant variations in smaller-scale patterns between each eruptive cycle. From a scientific standpoint, the ability to hear structures evolving over time in multiparameter data permits the rapid identification of relationships that might otherwise be overlooked or require significant processing to find. The human auditory system is adept at physical interpretation of call-and-response or causality in polyphonic sounds. Methods developed here for oscillatory and nonstationary data have great potential as scientific observational and educational tools, for data-driven composition with scientific and artistic intent, and towards the development of machine learning tools for pattern identification in complex data.
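As a concrete illustration of the parameter-mapping idea behind this kind of sonification, here is a minimal sketch, not the authors' pipeline: a slow geophysical time series is rendered as a sequence of pitched tones. The variable names, pitch range, and note duration are all illustrative assumptions.

    import numpy as np
    from scipy.io import wavfile

    rate = 44100                     # audio sample rate (Hz)
    temps = np.random.rand(180)      # stand-in for ~3 hours of 1-minute temperature samples
    lo, hi = 220.0, 880.0            # assumed pitch range: A3 to A5

    span = np.ptp(temps) or 1.0      # guard against a constant series
    freqs = lo + (temps - temps.min()) / span * (hi - lo)

    # Render each data point as a 50 ms tone with continuous phase, so the
    # three-hour record is compressed into nine seconds of gliding pitch.
    dur = int(0.05 * rate)
    phase, chunks = 0.0, []
    for f in freqs:
        t = np.arange(dur) / rate
        chunks.append(np.sin(phase + 2 * np.pi * f * t))
        phase += 2 * np.pi * f * dur / rate
    wavfile.write("sonification.wav", rate, (0.3 * np.concatenate(chunks)).astype(np.float32))

Less direct mappings follow the same pattern: one or more data streams modulate synthesis parameters such as pitch, loudness, or timbre.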
Award ID(s):
1848554
PAR ID:
10233907
Author(s) / Creator(s):
; ; ; ;
Date Published:
Journal Name:
Computer Music Journal
Volume:
44
Issue:
1
ISSN:
0148-9267
Page Range / eLocation ID:
35 to 50
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Science simulations are widely used in classrooms to support inquiry-based learning of complex science concepts. These tools typically rely on interactive visual displays to convey relationships. Auditory displays, including verbal description and sonification (non-speech audio), combined with alternative input capabilities, may provide an enhanced experience for learners, particularly learners with visual impairment. We conducted semi-structured interviews and usability testing of two audio-enhanced simulations with eight adult learners with visual impairment. We analyzed trends and edge cases in participants' interaction patterns, interpretations, and preferences. Findings include common interaction patterns across simulation use, increased efficiency with second use, and the complementary role that description and sonification play in supporting learning opportunities. We discuss how these control and display layers work to encourage exploration and engagement with science simulations. We conclude with general and specific design takeaways to support the implementation of auditory displays for accessible simulations.
  2. Effective human-robot interaction is increasingly vital across various domains, including assistive robotics, emotional communication, entertainment, and industrial automation. Visual feedback, a common feature of current interfaces, may not be suitable for all environments. Audio feedback serves as a critical supplementary communication layer in settings where visibility is low or where robotic operations generate extensive data. Sonification, which transforms a robot's trajectory, motion, and environmental signals into sound, enhances users' comprehension of robot behavior. This improvement in understanding fosters more effective, safe, and reliable Human-Robot Interaction (HRI). The benefits of auditory data sonification are evident in real-world applications such as industrial assembly, robot-assisted rehabilitation, and interactive robotic exhibitions, where it promotes cooperation, boosts performance, and heightens engagement. Beyond conventional HRI environments, auditory data sonification shows substantial potential in managing complex robotic systems and intricate structures, such as hyper-redundant robots and robotic teams. These systems often challenge operators with complex joint monitoring, mathematical kinematic modeling, and visual behavior verification. This dissertation explores the sonification of motion in hyper-redundant robots and teams of industrial robots. It delves into the Wave Space Sonification (WSS) framework developed by Hermann, applying it to the motion datasets of protein molecules modeled as hyper-redundant mechanisms with numerous rigid nano-linkages. This research leverages the WSS framework to develop a sonification methodology for protein molecules' dihedral angle folding trajectories. Furthermore, it introduces a novel approach for the systematic sonification of robotic motion across varying configurations. By employing localized wave fields oriented within the robots' configuration space, this methodology generates auditory outputs with specific timbral qualities as robots move through predefined configurations or along certain trajectories. Additionally, the dissertation examines a team of wheeled industrial/service robots whose motion patterns are sonified using sinusoidal vibratory sounds, demonstrating the practical applications and benefits of this innovative approach.
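    A minimal sketch of the wave-field idea, following Hermann's WSS framework in spirit but not the dissertation's implementation: a localized wave field is placed in configuration space, and the robot's trajectory is "played" by evaluating the field along it at audio rate. The field shape, toy trajectory, and all parameter values below are illustrative assumptions.

        import numpy as np

        def wave_field(Q, center, direction, wavelength=0.05, width=0.2):
            """Plane wave windowed by a Gaussian: audible only near `center`."""
            offsets = Q - center                                 # (n_samples, n_dof)
            env = np.exp(-np.sum(offsets**2, axis=1) / (2 * width**2))
            return env * np.sin(2 * np.pi * (offsets @ direction) / wavelength)

        rate = 44100
        t = np.arange(2 * rate) / rate                           # 2 s of audio
        # Toy 2-DOF trajectory: both joints sweep slowly and periodically.
        traj = np.stack([np.sin(np.pi * t), np.cos(0.6 * np.pi * t)], axis=1)
        signal = wave_field(traj, center=np.array([0.5, 0.5]),
                            direction=np.array([1.0, 0.0]))
        # Pitch and timbre depend on how fast the trajectory crosses the
        # wavefronts, so different motions through the same region sound different.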
  3. Chemotaxis is the ability of certain microscopic organisms to sense and swim towards beneficial or away from detrimental chemicals in their surroundings. Identifying this behavior is important for understanding the relationships between species and their environments in the natural world. Predicting the migration of an entire population from the known characteristics of individual microorganisms is a key goal, but it can be a laborious process that requires watching and waiting for visual evidence at the population scale. Sonification offers a novel solution to this problem by allowing the observer to tap into the auditory system to process information. In this project, we developed and assessed a proof-of-concept sonification tool as a high-throughput, real-time screening tool for chemotaxis in populations of swimming bacteria. The tool operates by reporting the y-axis position of each bacterium that appears in the microscope image as a pitched note of microsecond duration, giving the user a sense of the average location of the population. In this paper, we present how it has been used as a chemotaxis assay and as a tool to locate traveling waves of bacteria as they pass through the field of view, capturing data at specific time points that are then used to analyze the individual swimming patterns of microorganisms within the wave.
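    A toy version of the mapping described above, where the function names, frame height, and grain duration are assumptions for illustration: each detected bacterium's y-position becomes the pitch of a very short note, so the overall pitch of the mixture tracks the population's average position.

        import numpy as np

        def y_to_freq(y, frame_height=480, f_lo=200.0, f_hi=2000.0):
            """Map image row 0..frame_height onto a log-spaced pitch range."""
            frac = np.clip(y / frame_height, 0.0, 1.0)
            return f_lo * (f_hi / f_lo) ** frac

        rate = 44100
        grain = np.arange(int(0.002 * rate)) / rate     # ~2 ms notes (illustrative)
        ys = np.random.uniform(0, 480, size=50)         # stand-in for detected positions
        # One short sinusoidal grain per bacterium; as the population drifts up
        # or down the frame, the chord's center of pitch drifts with it.
        chord = sum(np.sin(2 * np.pi * y_to_freq(y) * grain) for y in ys) / len(ys)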
  4. The auditory system comprises multiple subcortical brain structures that process and refine incoming acoustic signals along the primary auditory pathway. Due to technical limitations of imaging small structures deep inside the brain, most of our knowledge of the subcortical auditory system is based on research in animal models using invasive methodologies. Advances in ultrahigh-field functional magnetic resonance imaging (fMRI) acquisition have enabled novel noninvasive investigations of the human auditory subcortex, including fundamental features of auditory representation such as tonotopy and periodotopy. However, functional connectivity across subcortical networks is still underexplored in humans, with ongoing development of related methods. Traditionally, functional connectivity is estimated from fMRI data with full correlation matrices. However, partial correlations reveal the relationship between two regions after removing the effects of all other regions, reflecting more direct connectivity. Partial correlation analysis is particularly promising in the ascending auditory system, where sensory information is passed in an obligatory manner, from nucleus to nucleus up the primary auditory pathway, providing redundant but also increasingly abstract representations of auditory stimuli. While most existing methods for learning conditional dependency structures based on partial correlations assume independently and identically distributed Gaussian data, fMRI data exhibit significant deviations from Gaussianity as well as high temporal autocorrelation. In this paper, we developed an autoregressive matrix-Gaussian copula graphical model (ARMGCGM) approach to estimate the partial correlations and thereby infer the functional connectivity patterns within the auditory system while appropriately accounting for autocorrelations between successive fMRI scans. Our results show strong positive partial correlations between successive structures in the primary auditory pathway on each side (left and right), including between auditory midbrain and thalamus, and between primary and associative auditory cortex. These results are highly stable when splitting the data in halves according to the acquisition schemes and computing partial correlations separately for each half of the data, as well as across cross-validation folds. In contrast, full correlation-based analysis identified a rich network of interconnectivity that was not specific to adjacent nodes along the pathway. Overall, our results demonstrate that unique functional connectivity patterns along the auditory pathway are recoverable using novel connectivity approaches and that our connectivity methods are reliable across multiple acquisitions.
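    For intuition, partial correlations can be read off the inverse covariance (precision) matrix. The sketch below shows this in the simplest i.i.d. Gaussian setting, without the copula or autoregressive machinery of the ARMGCGM; the chain network and its coefficients are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        # Toy "chain" A -> B -> C, analogous to relay along a sensory pathway.
        a = rng.normal(size=1000)
        b = 0.8 * a + rng.normal(size=1000)
        c = 0.8 * b + rng.normal(size=1000)
        X = np.stack([a, b, c], axis=1)

        P = np.linalg.inv(np.cov(X.T))       # precision matrix
        d = np.sqrt(np.diag(P))
        partial = -P / np.outer(d, d)        # rho_ij = -P_ij / sqrt(P_ii * P_jj)
        np.fill_diagonal(partial, 1.0)

        # Full correlation links A and C strongly, but their partial correlation
        # is near zero: the A-C dependence is explained entirely by B.
        print(np.corrcoef(X.T).round(2))
        print(partial.round(2))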
  5. In this paper, we introduce a creative pipeline to incorporate physiological and behavioral data from contemporary marine mammal research into data-driven animations, leveraging functionality from industry tools and custom scripts to promote scientific insights, public awareness, and conservation outcomes. Our framework can flexibly transform data describing animals' orientation, position, heart rate, and swimming stroke rate to control the position, rotation, and behavior of 3D models, to render animations, and to drive data sonification. Additionally, we explore the challenges of unifying disparate datasets gathered by an interdisciplinary team of researchers, and outline our design process for creating meaningful data visualization tools and animations. As part of our pipeline, we clean and process raw acceleration and electrophysiological signals to expedite complex multi-stream data analysis and the identification of critical foraging and escape behaviors. We provide details about four animation projects illustrating marine mammal datasets. These animations, commissioned by scientists to achieve outreach and conservation outcomes, have successfully increased the reach and engagement of the scientific projects they describe. These visualizations help scientists identify behavioral responses to disturbance, raise public awareness of human-caused disturbance, and build momentum for targeted conservation efforts backed by scientific evidence.
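    One step of such a pipeline can be sketched as follows; the sensor rate, channel names, and mappings are assumptions, not the authors' code. Tag streams are resampled to the animation frame rate and turned into per-frame transform keys that a 3D package's scripting API could consume.

        import numpy as np

        fps = 24.0
        # Stand-ins for tag streams sampled at 5 Hz: time (s), depth (m), heading (rad).
        t = np.arange(0.0, 60.0, 0.2)
        depth = 50 + 30 * np.sin(2 * np.pi * t / 60)
        heading = 0.1 * t

        frames = np.arange(0.0, t[-1], 1.0 / fps)
        keys = {
            "z": np.interp(frames, t, -depth),       # depth increases downward
            "yaw": np.interp(frames, t, heading),    # heading drives model rotation
        }
        # Each entry in `keys` would be written as a keyframe through the target
        # tool's scripting interface (e.g., Blender's bpy), omitted here.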