
Title: Sparse balance: Excitatory-inhibitory networks with small bias currents and broadly distributed synaptic weights
Cortical circuits generate excitatory currents that must be cancelled by strong inhibition to ensure stability. The resulting excitatory-inhibitory (E-I) balance can generate spontaneous irregular activity but, in standard balanced E-I models, this requires that an extremely strong feedforward bias current be included along with the recurrent excitation and inhibition. The absence of experimental evidence for such large bias currents inspired us to examine an alternative regime that exhibits asynchronous activity without requiring unrealistically large feedforward input. In these networks, irregular spontaneous activity is supported by a continually changing sparse set of neurons. To support this activity, synaptic strengths must be drawn from high-variance distributions. Unlike standard balanced networks, these sparse balance networks exhibit robust nonlinear responses to uniform inputs and non-Gaussian input statistics. Interestingly, the speed, not the size, of synaptic fluctuations dictates the degree of sparsity in the model. In addition to simulations, we provide a mean-field analysis to illustrate the properties of these networks.
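The regime described above can be illustrated with a minimal rate-model sketch. This is not the paper's actual model (which is a spiking network with an accompanying mean-field analysis): the log-normal weight distribution, saturating threshold-linear gain, and every parameter value below are illustrative assumptions chosen only to show sparse activity emerging with a small bias current and broadly distributed weights.

```python
import numpy as np

# Minimal, hypothetical sketch of a sparsely balanced E-I rate network.
# All parameters and the choice of log-normal weights are assumptions.
rng = np.random.default_rng(0)
N = 400               # neurons per population (E and I)
K = 40                # mean number of inputs per neuron
J = 1.0 / np.sqrt(K)  # standard balanced-network weight scaling

def random_weights(sign, scale, n_post=N, n_pre=N):
    """Sparse connectivity with high-variance (log-normal) weight magnitudes."""
    mask = rng.random((n_post, n_pre)) < K / n_pre
    magnitudes = rng.lognormal(mean=0.0, sigma=1.0, size=(n_post, n_pre))
    return sign * scale * J * magnitudes * mask

W_EE = random_weights(+1.0, 1.0)
W_EI = random_weights(-1.0, 2.0)   # inhibition outweighs excitation
W_IE = random_weights(+1.0, 1.0)
W_II = random_weights(-1.0, 1.8)

bias = 0.1                             # small feedforward bias, not O(sqrt(K))
phi = lambda x: np.clip(x, 0.0, 1.0)   # saturating threshold-linear gain

r_E = rng.random(N) * 0.1
r_I = rng.random(N) * 0.1
dt, tau = 0.1, 10.0
for _ in range(5000):                  # Euler integration of the rate dynamics
    in_E = W_EE @ r_E + W_EI @ r_I + bias
    in_I = W_IE @ r_E + W_II @ r_I + bias
    r_E = r_E + dt / tau * (-r_E + phi(in_E))
    r_I = r_I + dt / tau * (-r_I + phi(in_I))

# With a small bias and broad weights, only a subset of E cells remains active.
frac_active = float(np.mean(r_E > 0.01))
print(f"fraction of active excitatory neurons: {frac_active:.2f}")
```

The key contrast with the standard balanced regime is the size of `bias`: classical models require a feedforward drive of the same order as the O(sqrt(K)) recurrent currents, whereas here the bias is O(1) and the broad weight distribution supplies the fluctuations that keep a sparse, changing subset of neurons above threshold.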
Author: Soltani, Alireza
Journal Name: PLOS Computational Biology
Sponsoring Org: National Science Foundation
More Like this
  1. Gjorgjieva, Julijana (Ed.)
    The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory–inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity induced changes in the network affect balance, and in turn, how correlated, balanced activity impacts learning. How do the dynamics of balanced networks change under different plasticity rules? How does correlated spiking activity in recurrent networks change the evolution of weights, their eventual magnitude, and structure across the network? To address these questions, we develop a theory of spike–timing dependent plasticity in balanced networks. We show that balance can be attained and maintained under plasticity–induced weight changes. We find that correlations in the input mildly affect the evolution of synaptic weights. Under certain plasticity rules, we find an emergence of correlations between firing rates and synaptic weights. Under these rules, synaptic weights converge to a stable manifold in weight space with their final configuration dependent on the initial state of the network. Lastly, we show that our framework can also describe the dynamics of plastic balanced networks when subsets of neurons receive targeted optogenetic input.
2. The notion that a neuron transmits the same set of neurotransmitters at all of its post-synaptic connections, typically known as Dale's law, is well supported throughout the majority of the brain and is assumed in almost all theoretical studies investigating the mechanisms for computation in neuronal networks. Dale's law has numerous functional implications in fundamental sensory processing and decision-making tasks, and it plays a key role in the current understanding of the structure-function relationship in the brain. However, since exceptions to Dale's law have been discovered for certain neurons and because other biological systems with complex network structure incorporate individual units that send both positive and negative feedback signals, we investigate the functional implications of network model dynamics that violate Dale's law by allowing each neuron to send out both excitatory and inhibitory signals to its neighbors. We show how balanced network dynamics, in which large excitatory and inhibitory inputs are dynamically adjusted such that input fluctuations produce irregular firing events, are theoretically preserved for a single population of neurons violating Dale's law. We further leverage this single-population network model in the context of two competing pools of neurons to demonstrate that effective decision-making dynamics are also produced, agreeing with experimental observations from honeybee dynamics in selecting a food source and artificial neural networks trained in optimal selection. Through direct comparison with the classical two-population balanced neuronal network, we argue that the one-population network demonstrates more robust balanced activity for systems with fewer computational units, such as honeybee colonies, whereas the two-population network exhibits a more rapid response to temporal variations in network inputs, as required by the brain.
We expect this study will shed light on the role of neurons violating Dale's law found in experiments, as well as shared design principles across biological systems that perform complex computations.
3. Biological neural networks face a formidable task: performing reliable computations in the face of intrinsic stochasticity in individual neurons, imprecisely specified synaptic connectivity, and nonnegligible delays in synaptic transmission. A common approach to combatting such biological heterogeneity involves averaging over large redundant networks of N neurons, resulting in coding errors that decrease classically as the square root of N. Recent work demonstrated a novel mechanism whereby recurrent spiking networks could efficiently encode dynamic stimuli, achieving a superclassical scaling in which coding errors decrease as 1/N. This specific mechanism involved two key ideas: predictive coding, and a tight balance, or cancellation, between strong feedforward inputs and strong recurrent feedback. However, the theoretical principles governing the efficacy of balanced predictive coding and its robustness to noise, synaptic weight heterogeneity and communication delays remain poorly understood. To discover such principles, we introduce an analytically tractable model of balanced predictive coding, in which the degree of balance and the degree of weight disorder can be dissociated unlike in previous balanced network models, and we develop a mean-field theory of coding accuracy. Overall, our work provides and solves a general theoretical framework for dissecting the differential contributions of neural noise, synaptic disorder, chaos, synaptic delays, and balance to the fidelity of predictive neural codes, reveals the fundamental role that balance plays in achieving superclassical scaling, and unifies previously disparate models in theoretical neuroscience.
4. Abstract Neural activity coordinated across different scales from neuronal circuits to large-scale brain networks gives rise to complex cognitive functions. Bridging the gap between micro- and macro-scale processes, we present a novel framework based on the maximum entropy model to infer a hybrid resting state structural connectome, representing functional interactions constrained by structural connectivity. We demonstrate that the structurally informed network outperforms the unconstrained model in simulating brain dynamics: by constraining the inference model with the network structure, we improve the estimation of pairwise BOLD signal interactions. Further, we simulate brain network dynamics using Monte Carlo simulations with the new hybrid connectome to probe connectome-level differences in excitation-inhibition balance between apolipoprotein E (APOE)-ε4 carriers and noncarriers. Our results reveal sex differences among APOE-ε4 carriers in functional dynamics at criticality; specifically, female carriers appear to exhibit a lower tolerance to network disruptions resulting from increased excitatory interactions. In sum, the new multimodal network explored here enables analysis of brain dynamics through the integration of structure and function, providing insight into the complex interactions underlying neural activity such as the balance of excitation and inhibition.
5. The vertical lobe (VL) in the octopus brain plays an essential role in its sophisticated learning and memory. Early anatomical studies suggested that the VL is organized in a “fan-out fan-in” connectivity matrix comprising only three morphologically identified neuron types; input axons from the superior frontal lobe (SFL) innervating en passant millions of small amacrine interneurons (AMs) which converge sharply onto large VL output neurons (LNs). Recent physiological studies confirmed the feedforward excitatory connectivity: a glutamatergic synapse at the first SFL-to-AM synaptic layer and a cholinergic AM-to-LNs synapse. SFL-to-AM synapses show a robust hippocampal-like activity-dependent long-term potentiation (LTP) of transmitter release. 5-HT, octopamine, dopamine, and nitric oxide modulate short- and long-term VL synaptic plasticity. Here we present a comprehensive histolabeling study to better characterize the neural elements in the VL. We generally confirmed glutamatergic SFLs and cholinergic AMs. Intense labeling for NOS activity in the AM neurites fit with the NO-dependent presynaptic LTP mechanism at the SFL-to-AM synapse. New discoveries here reveal more heterogeneity of the VL neurons than previously thought. GABAergic AMs suggest a subpopulation of inhibitory interneurons in the first input layer. Clear GABA labeling in the cell bodies of LNs supported an inhibitory VL output, yet the LNs co-expressed FMRFamide-like neuropeptides, suggesting an additional neuromodulatory role of the VL output. Furthermore, a group of LNs was glutamatergic. A new cluster of cells organized in a “deep nucleus” showed rich catecholaminergic labeling and may play a role in intrinsic neuromodulation. In situ hybridization and immunolabeling allowed characterization and localization of a rich array of neuropeptides and neuromodulators, likely involved in reward/punishment signals.
This analysis of the fast transmission system, together with the newly found cellular elements, helps integrate behavioral, physiological, pharmacological, and connectome findings into a more comprehensive understanding of an efficient learning and memory network.