
Title: SNS-Toolbox: A Tool for Efficient Simulation of Synthetic Nervous Systems
We introduce SNS-Toolbox, a Python software package for the design and simulation of networks of conductance-based neurons and synapses, also called Synthetic Nervous Systems (SNS). SNS-Toolbox implements non-spiking and spiking neurons in multiple software backends, and is capable of simulating networks with thousands of neurons in real time. We benchmark the toolbox's simulation speed across multiple network sizes, characterize upper limits on network size in various scenarios, and showcase the design of a two-layer convolutional network inspired by circuits within the Drosophila melanogaster optic lobe. SNS-Toolbox, as well as the code to generate all of the figures in this work, is available at https://github.com/wnourse05/SNS-Toolbox.
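To make the class of models concrete, the sketch below integrates a small chain of non-spiking, conductance-based neurons with a forward Euler step. It is a minimal sketch assuming the usual SNS formulation (leaky-integrator neurons with piecewise-linear synaptic conductances); the network, parameter values, and variable names are illustrative and do not reproduce SNS-Toolbox's API.

import numpy as np

# Minimal sketch of a non-spiking, conductance-based (SNS-style) network.
# Each neuron: C_m * dU/dt = -G_m * U + sum_j G_syn[i,j](U_pre) * (E_syn[i,j] - U) + I_app
# Synaptic conductance rises linearly with presynaptic potential, clipped to [0, G_max].

dt = 0.1           # integration step (ms), assumed
R = 20.0           # operating voltage range (mV), assumed
C_m = 5.0          # membrane capacitance (nF), assumed
G_m = 1.0          # membrane (leak) conductance (uS), assumed

n = 3                                   # toy chain of three neurons
G_max = np.array([[0.0, 0.0, 0.0],      # G_max[i, j]: max conductance from neuron j onto neuron i
                  [0.5, 0.0, 0.0],
                  [0.0, 0.5, 0.0]])
E_syn = np.full((n, n), 40.0)           # excitatory reversal potentials (mV), assumed

U = np.zeros(n)                         # membrane potentials relative to rest
I_app = np.array([10.0, 0.0, 0.0])      # external current into neuron 0

for step in range(1000):
    act = np.clip(U / R, 0.0, 1.0)                   # piecewise-linear presynaptic activation
    G_syn = G_max * act[np.newaxis, :]               # active conductance from j onto i
    I_syn = np.sum(G_syn * (E_syn - U[:, np.newaxis]), axis=1)
    dU = (-G_m * U + I_syn + I_app) / C_m
    U = U + dt * dU

print(U)   # steady-state potentials of the three-neuron chain

Because each step reduces to a handful of element-wise and matrix operations, updates of this form map naturally onto vectorized CPU and GPU backends.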
Award ID(s):
1704436, 2015317
PAR ID:
10358686
Author(s) / Creator(s):
Date Published:
Journal Name:
Biomimetic and Biohybrid Systems. Living Machines 2022. Lecture Notes in Computer Science. Springer, Cham
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. One developing approach for robotic control is the use of networks of dynamic neurons connected with conductance-based synapses, also known as Synthetic Nervous Systems (SNS). These networks are often developed using cyclic topologies and heterogeneous mixtures of spiking and non-spiking neurons, which is a difficult proposition for existing neural simulation software. Most existing solutions target one of two extremes: detailed multi-compartment neural models in small networks, or large-scale networks of greatly simplified neural models. In this work, we present our open-source Python package SNS-Toolbox, which is capable of simulating hundreds to thousands of spiking and non-spiking neurons in real time or faster on consumer-grade computer hardware. We describe the neural and synaptic models supported by SNS-Toolbox, and report performance on multiple software and hardware backends, including GPUs and embedded computing platforms. We also showcase two examples using the software: one for controlling a simulated limb with muscles in the physics simulator MuJoCo, and another for a mobile robot using ROS. We hope that the availability of this software will reduce the barrier to entry when designing SNS networks, and will increase the prevalence of SNS networks in the field of robotic control.
  2. In this dissertation I present SNS-Toolbox, an open-source software package for the design and simulation of networks of biologically inspired neurons and synapses, also known as synthetic nervous systems (SNS). SNS-Toolbox allows SNS networks to be designed using a lightweight Python API, simulated in real time on consumer computer hardware, and executed onboard physical robotic systems. I also present a companion package to SNS-Toolbox which allows simulation and training of large SNS networks using gradient backpropagation. This software is released under an open-source license with online documentation for ease of use, and has been disseminated to other researchers for their use. As a demonstration, I use SNS-Toolbox to implement a stereo visual motion detector based on circuitry present within the Drosophila melanogaster (fruit fly) optic lobe. This network analyzes local motion at each point within a visual field, and returns an estimate of global motion when subjected to grating stimuli. Finally, I showcase the design of FlyWheel, a robotic benchmark for studying models of insect vision and applying SNS networks to physical hardware. This body of work marks the first tool capable of simulating SNS networks with hundreds to thousands of neurons and synaptic connections in real time or faster, optimizing networks with chemical reversal potentials using gradient backpropagation, and interfacing these networks for control of external systems.
  3. We present research in the modeling of neurons within Drosophila (fruit fly) olfaction. We describe the process from data collection to model creation and spike generation. Our approach utilizes computational elements such as spiking neural networks that employ leaky integrate-and-fire neurons with adaptive firing behavior that more closely mimic biological neurons (a minimal adaptive integrate-and-fire sketch appears after this list). We describe the methods of several learning implementations in both software and hardware. Finally, we present both quantitative and qualitative results on learning spiking neural network models.
  4. We consider the task of measuring time with probabilistic threshold gates implemented by bio-inspired spiking neurons. In the spiking neural network model, the network evolves in discrete rounds, where in each round neurons fire in pulses in response to a sufficiently high membrane potential. This potential is induced by spikes from neighboring neurons that fired in the previous round, which can have either an excitatory or inhibitory effect. Discovering the underlying mechanisms by which the brain perceives the duration of time is one of the largest open enigmas in computational neuroscience. To gain a better algorithmic understanding of these processes, we introduce the neural timer problem. In this problem, one is given a time parameter t, an input neuron x, and an output neuron y. It is then required to design a minimum-sized neural network (measured by the number of auxiliary neurons) in which every spike from x in a given round i makes the output y fire for the subsequent t consecutive rounds (a minimal round-based simulation of this specification appears after this list). We first consider a deterministic implementation of a neural timer and show that Θ(log t) (deterministic) threshold gates are both sufficient and necessary. This raises the question of whether randomness can be leveraged to reduce the number of neurons. We answer this question in the affirmative by considering neural timers with spiking neurons, where the neuron y is required to fire for t consecutive rounds with probability at least 1−δ, and should stop firing after at most 2t rounds with probability 1−δ, for some input parameter δ ∈ (0,1). Our key result is a construction of a neural timer with O(log log 1/δ) spiking neurons. Interestingly, this construction uses only one spiking neuron, while the remaining neurons can be deterministic threshold gates. We complement this construction with a matching lower bound of Ω(min{log log 1/δ, log t}) neurons. This provides the first separation between deterministic and randomized constructions in the setting of spiking neural networks. Finally, we demonstrate the usefulness of compressed counting networks for synchronizing neural networks. In the spirit of distributed synchronizers [Awerbuch-Peleg, FOCS'90], we provide a general transformation (or simulation) that can take any synchronized network solution and simulate it in an asynchronous setting (where edges have arbitrary response latencies) while incurring a small overhead w.r.t. the number of neurons and computation time.
  5. It is widely assumed that distributed neuronal networks are fundamental to the functioning of the brain. Consistent spike timing between neurons is thought to be one of the key principles for the formation of these networks. This can involve synchronous spiking or spiking with time delays, forming spike sequences when the order of spiking is consistent. Finding networks defined by their sequence of time-shifted spikes, denoted here as spike timing networks, is a tremendous challenge. As neurons can participate in multiple spike sequences at multiple between-spike time delays, the possible complexity of networks is prohibitively large. We present a novel approach that is capable of (1) extracting spike timing networks regardless of their sequence complexity, and (2) describing their spike sequences with high temporal precision. We achieve this by decomposing frequency-transformed neuronal spiking into separate networks, characterizing each network's spike sequence by a time delay per neuron, forming a spike sequence timeline. These networks provide a detailed template for an investigation of the experimental relevance of their spike sequences. Using simulated spike timing networks, we show that network extraction is robust to spiking noise, spike timing jitter, and partial occurrences of the involved spike sequences. Using rat multi-neuron recordings, we demonstrate that the approach is capable of revealing real spike timing networks with sub-millisecond temporal precision. By uncovering spike timing networks, the prevalence, structure, and function of complex spike sequences can be investigated in greater detail, allowing us to gain a better understanding of their role in neuronal functioning.
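For item 3 above, the following is a minimal sketch of a leaky integrate-and-fire neuron with an adaptive (spike-triggered, decaying) threshold, the kind of spiking element described there; all parameter values and the input drive are illustrative assumptions, not taken from the cited work.

import numpy as np

# Minimal adaptive leaky integrate-and-fire (LIF) neuron, illustrative parameters.
dt = 1.0              # timestep (ms)
tau_m = 20.0          # membrane time constant (ms)
tau_th = 100.0        # threshold-adaptation time constant (ms)
v_rest, v_reset = 0.0, 0.0
theta_0, d_theta = 1.0, 0.5    # baseline threshold and per-spike increment

v, theta = v_rest, theta_0
spikes = []

rng = np.random.default_rng(0)
for t in range(500):
    i_in = 1.2 + 0.2 * rng.standard_normal()      # noisy constant drive (assumed)
    v += dt * (-(v - v_rest) + i_in) / tau_m      # leaky integration
    theta += dt * (theta_0 - theta) / tau_th      # threshold relaxes toward baseline
    if v >= theta:                                # spike: reset and raise threshold
        spikes.append(t)
        v = v_reset
        theta += d_theta

print(f"{len(spikes)} spikes; the firing rate adapts as the threshold accumulates")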
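For item 4 above, the sketch below simulates the neural timer specification directly with an explicit counter, only to make the input/output contract concrete; it does not reproduce the paper's Θ(log t) threshold-gate construction or its randomized variant.

# Naive reference simulation of the neural timer specification: whenever the
# input neuron x fires in round i, the output neuron y must fire in each of the
# t subsequent rounds. An explicit counter is used here instead of the paper's
# construction, which needs only Theta(log t) auxiliary threshold gates.

def neural_timer(x_spikes, t, n_rounds):
    """x_spikes: set of rounds in which x fires; returns the rounds in which y fires."""
    remaining = 0
    y_spikes = []
    for i in range(n_rounds):
        if remaining > 0:
            y_spikes.append(i)
            remaining -= 1
        if i in x_spikes:
            remaining = t      # (re)start the timer on every input spike
    return y_spikes

print(neural_timer({2, 10}, t=5, n_rounds=20))
# -> [3, 4, 5, 6, 7, 11, 12, 13, 14, 15]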