Title: NetPyNE, a tool for data-driven multiscale modeling of brain circuits
The approximately 100 billion neurons in our brain are responsible for everything we do and experience. Experiments aimed at discovering how these cells encode and process information generate vast amounts of data. These data span multiple scales, from interactions between individual molecules to coordinated waves of electrical activity that spread across the entire brain surface.

To understand how the brain works, we must combine and make sense of these diverse types of information. Computational modeling provides one way of doing this. Using equations, we can calculate the chemical and electrical changes that take place in neurons. We can then build models of neurons and neural circuits that reproduce the patterns of activity seen in experiments. Exploring these models can provide insights into how the brain itself works. Several software tools are available to simulate neural circuits, but none provide an easy way of incorporating data that span different scales, from molecules to cells to networks. Moreover, most of these tools require familiarity with computer programming.

Dura-Bernal et al. have now developed a new software tool called NetPyNE, which allows users without programming expertise to build sophisticated models of brain circuits. It features a user-friendly interface for defining the properties of the model at molecular, cellular and circuit scales. It also provides an easy and automated method to identify the properties of the model that enable it to reproduce experimental data. Finally, NetPyNE makes it possible to run the model on supercomputers and offers a variety of ways to visualize and analyze the resulting output. Users can save the model and output in standardized formats, making them accessible to as many people as possible.

Researchers in labs across the world have used NetPyNE to study different brain regions, phenomena and diseases. The software also features in courses that introduce students to neurobiology and computational modeling. NetPyNE can help to interpret isolated experimental findings, and also makes it easier to explore interactions between brain activity at different scales. This will enable researchers to decipher how the brain encodes and processes information, and ultimately could make it easier to understand and treat brain disorders.
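To make the declarative approach concrete, the sketch below builds and simulates a small excitatory network through NetPyNE's Python interface. It is a minimal illustration based on NetPyNE's publicly documented specs/sim API; the cell geometry, channel densities, connectivity and stimulation values are placeholder numbers rather than parameters from any published model, and running it requires the NEURON simulator and the netpyne package to be installed.

# Minimal NetPyNE sketch: a small excitatory network defined declaratively.
# Requires NEURON and netpyne; all parameter values are illustrative only.
from netpyne import specs, sim

netParams = specs.NetParams()   # network parameters: cells, populations, connections
simConfig = specs.SimConfig()   # simulation options: duration, recording, analysis

# Cell model: single-compartment soma with Hodgkin-Huxley channels
netParams.cellParams['PYR'] = {
    'secs': {'soma': {
        'geom': {'diam': 18.8, 'L': 18.8, 'Ra': 123.0},
        'mechs': {'hh': {'gnabar': 0.12, 'gkbar': 0.036, 'gl': 0.0003, 'el': -70}}}}}

# Population of 20 such cells
netParams.popParams['E'] = {'cellType': 'PYR', 'numCells': 20}

# Excitatory synapse model and sparse recurrent connectivity
netParams.synMechParams['exc'] = {'mod': 'Exp2Syn', 'tau1': 0.1, 'tau2': 5.0, 'e': 0}
netParams.connParams['E->E'] = {
    'preConds': {'pop': 'E'}, 'postConds': {'pop': 'E'},
    'probability': 0.1, 'weight': 0.005, 'delay': 5, 'synMech': 'exc'}

# Background Poisson drive so the network spikes
netParams.stimSourceParams['bkg'] = {'type': 'NetStim', 'rate': 20, 'noise': 0.5}
netParams.stimTargetParams['bkg->E'] = {
    'source': 'bkg', 'conds': {'pop': 'E'}, 'weight': 0.01, 'delay': 5, 'synMech': 'exc'}

# Simulation options, recording and built-in analyses
simConfig.duration = 1000   # ms
simConfig.dt = 0.025        # ms
simConfig.recordTraces = {'V_soma': {'sec': 'soma', 'loc': 0.5, 'var': 'v'}}
simConfig.analysis['plotRaster'] = {'saveFig': True}
simConfig.analysis['plotTraces'] = {'include': [0], 'saveFig': True}

# Build the network, run NEURON, and generate the requested plots
sim.createSimulateAnalyze(netParams=netParams, simConfig=simConfig)

The graphical interface mentioned in the digest essentially edits these same high-level specification dictionaries, which is how models built with or without programming can remain interchangeable.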
Award ID(s):
1904444
PAR ID:
10300070
Author(s) / Creator(s):
Date Published:
Journal Name:
eLife
Volume:
8
ISSN:
2050-084X
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Although much is known about how single neurons in the hippocampus represent an animal's position, how circuit interactions contribute to spatial coding is less well understood. Using a novel statistical estimator and theoretical modeling, both developed in the framework of maximum entropy models, we reveal highly structured CA1 cell-cell interactions in male rats during open field exploration. The statistics of these interactions depend on whether the animal is in a familiar or novel environment. In both conditions the circuit interactions optimize the encoding of spatial information, but for regimes that differ in the informativeness of their spatial inputs. This structure facilitates linear decodability, making the information easy to read out by downstream circuits. Overall, our findings suggest that the efficient coding hypothesis is not only applicable to individual neuron properties in the sensory periphery, but also to neural interactions in the central brain. Significance statement: Local circuit interactions play a key role in neural computation and are dynamically shaped by experience. However, measuring and assessing their effects during behavior remains a challenge. Here, we combine techniques from statistical physics and machine learning to develop new tools for determining the effects of local network interactions on neural population activity. This approach reveals highly structured local interactions between hippocampal neurons, which make the neural code more precise and easier to read out by downstream circuits, across different levels of experience. More generally, the novel combination of theory and data analysis in the framework of maximum entropy models enables traditional neural coding questions to be asked in naturalistic settings.
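As a generic illustration of the maximum entropy framework used above (not the authors' novel estimator), the sketch below fits a pairwise, Ising-like maximum entropy model to surrogate binarized spike trains by gradient ascent on the likelihood, matching mean firing rates and pairwise correlations. The population is kept to five cells so the model distribution can be enumerated exactly, and all data are synthetic.

# Pairwise maximum-entropy (Ising-like) fit: P(s) ~ exp(h.s + 0.5*s'Js), with
# fields h matched to mean rates and couplings J to pairwise correlations.
# Toy sketch on synthetic data, not the estimator developed in the paper.
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
N, T = 5, 20000                                     # 5 neurons, 20k time bins
data = (rng.random((T, N)) < 0.2).astype(float)     # surrogate binary spike matrix

states = np.array(list(product([0, 1], repeat=N)), dtype=float)  # all 2^N patterns

def model_moments(h, J):
    # Exact <s_i> and <s_i s_j> under the current model (feasible for small N)
    energy = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
    p = np.exp(energy - energy.max())
    p /= p.sum()
    return p @ states, states.T @ (states * p[:, None])

emp_mean = data.mean(axis=0)       # empirical mean firing probabilities
emp_corr = data.T @ data / T       # empirical pairwise correlations

h, J = np.zeros(N), np.zeros((N, N))
for step in range(3000):           # gradient ascent on the log-likelihood
    mean, corr = model_moments(h, J)
    h += 0.2 * (emp_mean - mean)
    dJ = 0.2 * (emp_corr - corr)
    np.fill_diagonal(dJ, 0.0)      # couplings are off-diagonal only
    J += dJ                        # dJ is symmetric, so J stays symmetric

mean, corr = model_moments(h, J)
print('max moment mismatch:',
      max(np.abs(emp_mean - mean).max(), np.abs(emp_corr - corr).max()))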
  2. Memories are an important part of how we think, understand the world around us, and plan out future actions. In the brain, memories are thought to be stored in a region called the hippocampus. When memories are formed, neurons store events that occur around the same time together. This might explain why often, in the brains of animals, the activity associated with retrieving memories is not just a snapshot of what happened at a specific moment; it can also include information about what the animal might experience next. This can have a clear utility if animals use memories to predict what they might experience next and plan out future actions. Mathematically, this notion of predictiveness can be summarized by an algorithm known as the successor representation. This algorithm describes what the activity of neurons in the hippocampus looks like when retrieving memories and making predictions based on them. However, even though the successor representation can computationally reproduce the activity seen in the hippocampus when it is making predictions, it is unclear what biological mechanisms underpin this computation in the brain. Fang et al. approached this problem by trying to build a model that could generate the same activity patterns computed by the successor representation using only biological mechanisms known to exist in the hippocampus. First, they used computational methods to design a network of neurons that had the biological properties of neural networks in the hippocampus. They then used the network to simulate neural activity. The results show that the activity of the network they designed was able to exactly match the successor representation. Additionally, the data resulting from the simulated activity in the network fitted experimental observations of hippocampal activity in tufted titmice. One advantage of the network designed by Fang et al. is that it can generate predictions in flexible ways. That is, it can make both short- and long-term predictions from what an individual is experiencing at the moment. This flexibility means that the network can be used to simulate how the hippocampus learns in a variety of cognitive tasks. Additionally, the network is robust to different conditions. Given that the brain has to be able to store memories in many different situations, this is a promising indication that this network may be a reasonable model of how the brain learns. The results of Fang et al. lay the groundwork for connecting biological mechanisms in the hippocampus at the cellular level to cognitive effects, an essential step to understanding the hippocampus, as well as its role in health and disease. For instance, their network may provide a concrete approach to studying how disruptions to the ways neurons make and break connections can impair memory formation. More generally, better models of the biological mechanisms involved in making computations in the hippocampus can help scientists better understand and test out theories about how memories are formed and stored in the brain.
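Because the successor representation is a standard reinforcement-learning construct, a short generic sketch can make the algorithm itself concrete. The code below is not Fang et al.'s biological network; it simply learns the successor matrix with temporal-difference updates during a random walk on a ring of states and compares it to the closed-form solution M = (I - gamma*T)^-1.

# Successor representation (SR) on a ring of states: learn M by temporal-
# difference updates and compare with the closed form M = (I - gamma*T)^-1.
# Generic textbook algorithm, not the hippocampal network model in the paper.
import numpy as np

rng = np.random.default_rng(1)
n_states, gamma, alpha = 8, 0.9, 0.05

# Random-walk transition matrix on a ring: step left or right with equal probability
T = np.zeros((n_states, n_states))
for s in range(n_states):
    T[s, (s - 1) % n_states] = 0.5
    T[s, (s + 1) % n_states] = 0.5

M_closed = np.linalg.inv(np.eye(n_states) - gamma * T)   # analytic SR

# Online TD learning of the SR while random-walking around the ring
M = np.zeros((n_states, n_states))
s = 0
for _ in range(300000):
    s_next = rng.choice(n_states, p=T[s])
    M[s] += alpha * (np.eye(n_states)[s] + gamma * M[s_next] - M[s])
    s = s_next

# Stochastic updates leave residual noise of order alpha, so agreement is approximate
print('max |TD SR - closed-form SR|:', np.abs(M - M_closed).max())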
  3. Astrocytes are a ubiquitous and enigmatic type of non-neuronal cell and are found in the brain of all vertebrates. While traditionally viewed as being supportive of neurons, it is increasingly recognized that astrocytes play a more direct and active role in brain function and neural computation. On account of their sensitivity to a host of physiological covariates and ability to modulate neuronal activity and connectivity on slower time scales, astrocytes may be particularly well poised to modulate the dynamics of neural circuits in functionally salient ways. In the current paper, we seek to capture these features via actionable abstractions within computational models of neuron-astrocyte interaction. Specifically, we examine how nested feedback loops of neuron-astrocyte interaction, acting over separated time-scales, may endow astrocytes with the capability to enable learning in context-dependent settings, where fluctuations in task parameters may occur much more slowly than within-task requirements. We pose a general model of neuron-synapse-astrocyte interaction and use formal analysis to characterize how astrocytic modulation may constitute a form of meta-plasticity, altering the ways in which synapses and neurons adapt as a function of time. We then embed this model in a bandit-based reinforcement learning task environment, and show how the presence of time-scale separated astrocytic modulation enables learning over multiple fluctuating contexts. Indeed, these networks learn far more reliably compared to dynamically homogeneous networks and conventional non-network-based bandit algorithms. Our results fuel the notion that neuron-astrocyte interactions in the brain benefit learning over different time-scales and the conveyance of task-relevant contextual information onto circuit dynamics.
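The central computational idea, fast learning gated by a much slower modulatory signal, can be caricatured in a few lines. The sketch below is only a schematic stand-in for the neuron-synapse-astrocyte model analyzed in the paper: a two-armed bandit whose reward contingencies reverse every few hundred trials, a fast value learner, and a slow astrocyte-like variable that integrates the magnitude of recent prediction errors and scales the learning rate, a simple form of meta-plasticity. All parameter values are arbitrary.

# Toy time-scale separation in a context-switching bandit: a slow modulatory
# variable gates the plasticity of a fast value learner. Schematic sketch only,
# not the neuron-synapse-astrocyte model from the paper.
import numpy as np

rng = np.random.default_rng(2)
n_trials, block = 4000, 500            # reward contingencies reverse every 500 trials
p_context_a = np.array([0.8, 0.2])     # arm reward probabilities in context A

Q = np.zeros(2)                        # fast values (synaptic weights)
a_mod = 0.0                            # slow modulatory, astrocyte-like variable
tau_slow, base_lr, eps = 50.0, 0.05, 0.1
rewards = []

for t in range(n_trials):
    p = p_context_a[::-1] if (t // block) % 2 else p_context_a   # current context
    arm = rng.integers(2) if rng.random() < eps else int(np.argmax(Q))
    r = float(rng.random() < p[arm])
    rewards.append(r)

    delta = r - Q[arm]                        # reward prediction error
    a_mod += (abs(delta) - a_mod) / tau_slow  # slow variable tracks surprise
    lr = base_lr * (1.0 + 4.0 * a_mod)        # meta-plasticity: gate the learning rate
    Q[arm] += lr * delta                      # fast value update

print('mean reward in final block:', np.mean(rewards[-block:]))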
  4. Ermentrout, Bard (Ed.)
    During development, neurons arrive at local brain areas over an extended period of time, but how they form local neural circuits is unknown. Here we computationally model the emergence of a network for precise timing in the songbird premotor nucleus HVC. We show that new projection neurons, added to HVC post-hatch at early stages of song development, are recruited to the end of a growing feedforward network. High spontaneous activity of the new neurons makes them the prime targets for recruitment in a self-organized process via synaptic plasticity. Once recruited, the new neurons fire readily at precise times, and they become mature. Neurons that are not recruited become silent and are replaced by new immature neurons. Our model incorporates realistic HVC features such as interneurons, spatial distributions of neurons, and distributed axonal delays. The model predicts that the birth order of the projection neurons correlates with their burst timing during the song.
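The recruitment loop described above (new neurons with high spontaneous activity are preferentially wired onto the end of the growing chain, mature once recruited, and silent unrecruited neurons are replaced) can be caricatured without any spiking dynamics. The sketch below is a deliberately oversimplified stand-in for the full plasticity model and only illustrates the self-organized growth-and-replacement logic; nothing in it is taken from the published implementation.

# Caricature of self-organized chain growth: at each step the most spontaneously
# active immature neuron is recruited to the end of the chain, unrecruited
# neurons lose activity, and silent ones are replaced by newly added neurons.
import numpy as np

rng = np.random.default_rng(3)
pool_size, n_steps = 30, 50
chain = []                                    # ordered ids of recruited neurons

# Immature pool: [neuron id, spontaneous rate in Hz]; ids reflect birth order
pool = [[i, rng.uniform(2.0, 8.0)] for i in range(pool_size)]
next_id = pool_size

for step in range(n_steps):
    # Recruitment: the most spontaneously active immature neuron is wired to the
    # current end of the chain (standing in for activity-dependent plasticity)
    pool.sort(key=lambda neuron: neuron[1], reverse=True)
    recruited_id, _ = pool.pop(0)
    chain.append(recruited_id)

    # Unrecruited neurons gradually fall silent; silent ones are removed and
    # replaced by newly born immature neurons with fresh spontaneous activity
    for neuron in pool:
        neuron[1] *= 0.9
    pool = [neuron for neuron in pool if neuron[1] > 0.5]
    while len(pool) < pool_size:
        pool.append([next_id, rng.uniform(2.0, 8.0)])
        next_id += 1

print('chain length:', len(chain), '| first recruited ids:', chain[:10])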
  5. The circadian system coordinates multiple behavioral outputs to ensure proper temporal organization. Timing information underlying circadian regulation of behavior depends on a molecular circadian clock that operates within clock neurons in the brain. In Drosophila and other organisms, clock neurons can be divided into several molecularly and functionally discrete subpopulations that form an interconnected central clock network. It is unknown how circadian signals are coherently generated by the clock network and transmitted across output circuits that connect clock cells to downstream neurons that regulate behavior. Here, we have exhaustively investigated the contribution of clock neuron subsets to the control of two prominent behavioral outputs in Drosophila: locomotor activity and feeding. We have used cell-specific manipulations to eliminate molecular clock function or induce electrical silencing either broadly throughout the clock network or in specific subpopulations. We find that clock cell manipulations produce similar changes in locomotor activity and feeding, suggesting that overlapping central clock circuitry regulates these distinct behavioral outputs. Interestingly, the magnitude and nature of the effects depend on the clock subset targeted. Lateral clock neuron manipulations profoundly degrade the rhythmicity of feeding and activity. In contrast, dorsal clock neuron manipulations only subtly affect rhythmicity but produce pronounced changes in the distribution of activity and feeding across the day. These experiments expand our knowledge of clock regulation of activity rhythms and offer the first extensive characterization of central clock control of feeding rhythms. Despite similar effects of central clock cell disruptions on activity and feeding, we find that manipulations that prevent functional signaling in an identified output circuit preferentially degrade locomotor activity rhythms, leaving feeding rhythms relatively intact. This demonstrates that activity and feeding are indeed dissociable behaviors, and furthermore suggests that differential circadian control of these behaviors diverges in output circuits downstream of the clock network. 