-
Neural learning rules for generating flexible predictions and computing the successor representation
Memories are an important part of how we think, understand the world around us, and plan out future actions. In the brain, memories are thought to be stored in a region called the hippocampus. When memories are formed, neurons store events that occur around the same time together. This might explain why, often, in the brains of animals, the activity associated with retrieving memories is not just a snapshot of what happened at a specific moment: it can also include information about what the animal might experience next. This can have a clear utility if animals use memories to predict what they might experience next and plan out future actions. Mathematically, this notion of predictiveness can be summarized by an algorithm known as the successor representation. This algorithm describes what the activity of neurons in the hippocampus looks like when retrieving memories and making predictions based on them. However, even though the successor representation can computationally reproduce the activity seen in the hippocampus when it is making predictions, it is unclear what biological mechanisms underpin this computation in the brain. Fang et al. approached this problem by trying to build a model that could generate the same activity patterns computed …
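The successor representation itself is compact enough to sketch in code. The following minimal example (a Python/NumPy illustration, not the network model developed by Fang et al.) computes the matrix of expected discounted future state occupancies for a small Markov chain, both in closed form and with a temporal-difference update from sampled transitions; all names and parameter values are illustrative assumptions.

    # Illustrative sketch, not Fang et al.'s model: the successor representation M
    # of a tabular Markov chain, computed in closed form and by TD learning.
    import numpy as np

    def successor_representation(T, gamma=0.9):
        """Closed form: M = (I - gamma * T)^(-1) for transition matrix T."""
        n = T.shape[0]
        return np.linalg.inv(np.eye(n) - gamma * T)

    def td_successor_representation(T, gamma=0.9, lr=0.05, steps=200000, seed=0):
        """Estimate M online from sampled transitions with a TD-style update."""
        rng = np.random.default_rng(seed)
        n = T.shape[0]
        M = np.eye(n)
        s = 0
        for _ in range(steps):
            s_next = rng.choice(n, p=T[s])
            onehot = np.eye(n)[s]
            M[s] += lr * (onehot + gamma * M[s_next] - M[s])  # TD error on future occupancy
            s = s_next
        return M

    # Example: unbiased random walk on a ring of five states
    T = np.zeros((5, 5))
    for i in range(5):
        T[i, (i - 1) % 5] = T[i, (i + 1) % 5] = 0.5
    print("max |closed form - TD estimate|:",
          np.abs(successor_representation(T) - td_successor_representation(T)).max())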
-
Cortical circuits generate excitatory currents that must be cancelled by strong inhibition to assure stability. The resulting excitatory-inhibitory (E-I) balance can generate spontaneous irregular activity but, in standard balanced E-I models, this requires that an extremely strong feedforward bias current be included along with the recurrent excitation and inhibition. The absence of experimental evidence for such large bias currents inspired us to examine an alternative regime that exhibits asynchronous activity without requiring unrealistically large feedforward input. In these networks, irregular spontaneous activity is supported by a continually changing sparse set of neurons. To support this activity, synaptic strengths must be drawn from high-variance distributions. Unlike standard balanced networks, these sparse balance networks exhibit robust nonlinear responses to uniform inputs and non-Gaussian input statistics. Interestingly, the speed, not the size, of synaptic fluctuations dictates the degree of sparsity in the model. In addition to simulations, we provide a mean-field analysis to illustrate the properties of these networks.
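As a rough illustration of the ingredients highlighted above, the toy rate network below (a Python/NumPy sketch, not the paper's model) combines Dale's-law excitation and inhibition, a weak feedforward bias, and synaptic magnitudes drawn from a heavy-tailed lognormal distribution, then tracks what fraction of units is active at any moment; every parameter value here is an assumption chosen for illustration.

    # Illustrative sketch, not the paper's model: a saturating threshold-linear
    # E-I rate network with weak feedforward bias and heavy-tailed synaptic weights.
    import numpy as np

    rng = np.random.default_rng(1)
    N = 400                                            # half excitatory, half inhibitory
    signs = np.r_[np.ones(N // 2), -np.ones(N // 2)]   # Dale's law: column sign = cell type
    magnitudes = rng.lognormal(mean=-1.0, sigma=1.5, size=(N, N))  # high-variance weights
    W = (magnitudes / np.sqrt(N)) * signs
    bias = 0.1                                         # weak feedforward input, no large drive

    def phi(x):
        return np.tanh(np.maximum(x, 0.0))             # saturating threshold nonlinearity

    def simulate(W, bias, T=5000, dt=0.1, tau=1.0):
        r = 0.01 * rng.random(N)
        active_fraction = []
        for _ in range(T):
            drive = W @ r + bias + 0.05 * rng.standard_normal(N)
            r += dt / tau * (-r + phi(drive))
            active_fraction.append(np.mean(r > 0.01))  # how many units carry the activity
        return np.array(active_fraction)

    frac = simulate(W, bias)
    print("fraction of units active late in the run:", frac[-1000:].mean())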
-
We developed a neural network model that can account for major elements common to human focal seizures. These include the tonic-clonic transition, slow advance of clinical semiology and corresponding seizure territory expansion, widespread EEG synchronization, and slowing of the ictal rhythm as the seizure approaches termination. These were reproduced by incorporating usage-dependent exhaustion of inhibition in an adaptive neural network that receives global feedback inhibition in addition to local recurrent projections. Our model proposes mechanisms that may underlie common EEG seizure onset patterns and status epilepticus, and postulates a role for synaptic plasticity in the emergence of epileptic foci. Complex patterns of seizure activity and bi-stable seizure end-points arise when stochastic noise is included. With the rapid advancement of clinical and experimental tools, we believe that this model can provide a roadmap and potentially an in silico testbed for future explorations of seizure mechanisms and clinical therapies.
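The central dynamical ingredient, usage-dependent exhaustion of inhibition, can be caricatured in a few lines. The sketch below (a Python/NumPy toy far simpler than the authors' network, with made-up parameters) couples a Wilson-Cowan-style excitatory-inhibitory pair to an inhibitory resource that recovers slowly and depletes with use.

    # Illustrative caricature, far simpler than the authors' network: a Wilson-Cowan-
    # style E-I pair whose inhibitory efficacy depletes with use and recovers slowly.
    import numpy as np

    def f(x):
        return 1.0 / (1.0 + np.exp(-x))                # sigmoidal rate function

    def simulate(T=40000, dt=0.05):
        E, I, g = 0.1, 0.1, 1.0                        # E rate, I rate, inhibitory resource
        tau_E, tau_I, tau_g = 1.0, 0.5, 200.0
        trace = []
        for _ in range(T):
            inhibition = 3.0 * g * I                   # scaled by the remaining resource
            dE = (-E + f(4.0 * E - inhibition + 0.5)) / tau_E
            dI = (-I + f(4.0 * E - 1.0)) / tau_I
            dg = ((1.0 - g) - 2.0 * g * I) / tau_g     # depletes with use, recovers slowly
            E += dt * dE
            I += dt * dI
            g += dt * dg
            trace.append(E)
        return np.array(trace)

    E_trace = simulate()
    print("excitatory rate, early vs late:", E_trace[:4000].mean(), E_trace[-4000:].mean())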
-
Appropriate generalization of learned responses to new situations is vital for adaptive behavior. We provide a circuit-level account of generalization in the electrosensory lobe (ELL) of weakly electric mormyrid fish. Much is already known in this system about a form of learning in which motor corollary discharge signals cancel responses to the uninformative input evoked by the fish’s own electric pulses. However, for this cancellation to be useful under natural circumstances, it must generalize accurately across behavioral regimes, specifically different electric pulse rates. We show that such generalization indeed occurs in ELL neurons, and develop a circuit-level model explaining how this may be achieved. The mechanism involves regularized synaptic plasticity and an approximate matching of the temporal dynamics of motor corollary discharge and electrosensory inputs. Recordings of motor corollary discharge signals in mossy fibers and granule cells provide direct evidence for such matching.
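To make the plasticity component concrete, here is a minimal illustrative sketch (a Python/NumPy toy, not the recorded circuit or the paper's fitted model) in which weights from a bank of corollary-discharge-like temporal basis functions are adjusted by an anti-Hebbian rule with weight decay (the regularization), building a negative image that cancels a predictable self-generated input; all signal shapes and parameters are assumptions.

    # Illustrative sketch, not the recorded circuit: anti-Hebbian plasticity with
    # weight decay builds a negative image of a predictable self-generated input.
    import numpy as np

    T, K = 100, 20                                     # time bins per pulse, basis functions
    t = np.linspace(0.0, 1.0, T)
    centers = np.linspace(0.0, 1.0, K)
    basis = np.exp(-((t[:, None] - centers[None, :]) ** 2) / 0.01)  # granule-like bases
    self_input = np.exp(-((t - 0.3) ** 2) / 0.02)      # predictable pulse-evoked response

    w = np.zeros(K)
    lr, lam = 0.1, 0.01                                # learning rate, regularization strength
    for trial in range(2000):
        response = self_input + basis @ w              # output-cell response on this cycle
        # anti-Hebbian update: coactivity of a basis input and the response pushes its
        # weight negative; the decay term (regularization) keeps the weights bounded
        w += -lr * (basis.T @ response) / T - lr * lam * w

    print("mean |response| before vs after learning:",
          np.abs(self_input).mean(), np.abs(self_input + basis @ w).mean())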
-
Different coding strategies are used to represent odor information at various stages of the mammalian olfactory system. A temporal latency code represents odor identity in olfactory bulb (OB), but this temporal information is discarded in piriform cortex (PCx) where odor identity is instead encoded through ensemble membership. We developed a spiking PCx network model to understand how this transformation is implemented. In the model, the impact of OB inputs activated earliest after inhalation is amplified within PCx by diffuse recurrent collateral excitation, which then recruits strong, sustained feedback inhibition that suppresses the impact of later-responding glomeruli. We model increasing odor concentrations by decreasing glomerulus onset latencies while preserving their activation sequences. This produces a multiplexed cortical odor code in which activated ensembles are robust to concentration changes while concentration information is encoded through population synchrony. Our model demonstrates how PCx circuitry can implement multiplexed ensemble-identity/temporal-concentration odor coding.
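As a caricature of the transformation described above (far simpler than the spiking model), the Python/NumPy sketch below assumes that global feedback inhibition engages once a fixed amount of cortical activity has been recruited; scaling all glomerular onset latencies down, as for a higher concentration, then leaves the recruited ensemble unchanged while shifting its activation earlier.

    # Illustrative caricature, far simpler than the spiking model: feedback inhibition
    # engages after a fixed amount of cortical recruitment, so the recruited ensemble
    # depends on input order, not on absolute onset latency.
    import numpy as np

    rng = np.random.default_rng(2)
    n_glomeruli = 50
    latencies = rng.uniform(10.0, 100.0, n_glomeruli)  # onset latencies after inhalation (ms)

    def cortical_ensemble(latencies, k_threshold=10):
        """Earliest inputs recruit cortex until global feedback inhibition engages."""
        order = np.argsort(latencies)
        recruited = np.zeros(len(latencies), dtype=bool)
        recruited[order[:k_threshold]] = True          # earliest-arriving inputs get in
        inhibition_onset = np.sort(latencies)[k_threshold - 1]  # later inputs suppressed
        return recruited, inhibition_onset

    low_set, low_t = cortical_ensemble(latencies)            # baseline concentration
    high_set, high_t = cortical_ensemble(0.5 * latencies)    # higher concentration
    print("same ensemble across concentrations:", np.array_equal(low_set, high_set))
    print("inhibition onset (ms), low vs high:", round(low_t, 1), round(high_t, 1))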