Search for: All records

Creators/Authors contains: "Sejnowski, Terrence J."


  1. Jędrzejewska-Szmek, Joanna (Ed.)
    Chemical synapses exhibit a diverse array of internal mechanisms that affect the dynamics of transmission efficacy. Many of these processes, such as release of neurotransmitter and vesicle recycling, depend strongly on activity-dependent influx and accumulation of Ca²⁺. Modelling how each of these processes may affect the processing of information in neural circuits, and how their dysfunction may lead to disease states, requires a computationally efficient framework capable of generating accurate phenomenology without incurring a heavy computational cost per synapse. Constructing a phenomenologically realistic model requires precise characterization of the timing and probability of neurotransmitter release. Two difficulties arise: functional forms of instantaneous release rate can be hard to extract from noisy data without running many thousands of trials, and in biophysical synapses, facilitation of per-vesicle release probability is confounded by depletion. To overcome these, we obtained traces of free Ca²⁺ concentration in response to various action potential stimulus trains from a molecular MCell model of a hippocampal Schaffer collateral axon. Ca²⁺ sensors were placed at varying distances from a voltage-dependent calcium channel (VDCC) cluster, and Ca²⁺ was buffered by calbindin. We then used the calcium traces to drive deterministic state-vector models of synaptotagmin 1 and 7 (Syt-1/7), which respectively mediate synchronous and asynchronous release in excitatory hippocampal synapses, and obtained high-resolution profiles of instantaneous release rate, to which we applied functional fits. Synchronous vesicle release occurred predominantly within half a micron of the source of spike-evoked Ca²⁺ influx, while asynchronous release occurred more consistently at all distances. Both fast and slow mechanisms exhibited multi-exponential release-rate curves whose magnitudes decayed exponentially with distance from the Ca²⁺ source. Profile parameters facilitate on different time scales according to a single, general facilitation function. These functional descriptions lay the groundwork for efficient mesoscale modelling of vesicular release dynamics.
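    The shape of such a fit can be illustrated with a short sketch: a sum of exponentials in time whose overall magnitude decays exponentially with distance, R(t, d) = Σᵢ Aᵢ e^(−t/τᵢ) · e^(−d/λ). The amplitudes, time constants, and length constant below are illustrative placeholders, not the fitted values from this study.

```python
import math

def release_rate(t, d, amps=(10.0, 2.0), taus=(0.005, 0.050), lam=0.5e-6):
    """Multi-exponential instantaneous release rate whose magnitude
    decays exponentially with distance d (metres) from the Ca2+ source.
    All parameter values here are illustrative, not fitted."""
    temporal = sum(a * math.exp(-t / tau) for a, tau in zip(amps, taus))
    return temporal * math.exp(-d / lam)
```

Evaluated at t = 0 and d = 0 the rate is simply the sum of the amplitudes, and moving the sensor one length constant away scales the whole curve by 1/e.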
  2. One of the simplest mathematical models in the study of nonlinear systems is the Kuramoto model, which describes synchronization in systems ranging from insect swarms to superconductors. We recently found a connection between the original, real-valued nonlinear Kuramoto model and a corresponding complex-valued system that permits describing the system in terms of a linear operator and an iterative update rule. We now use this description to investigate three major synchronization phenomena in Kuramoto networks (phase synchronization, chimera states, and traveling waves), not only in terms of steady-state solutions but also in terms of transient dynamics and individual simulations. These results provide new mathematical insight into how sophisticated behaviors arise from connection patterns in nonlinear networked systems.
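    A minimal numerical sketch of the real-valued Kuramoto model (in its standard mean-field form, not the authors' complex-valued operator formulation; all parameter values are illustrative):

```python
import numpy as np

def kuramoto(N=50, K=4.0, T=2000, dt=0.01, seed=0):
    """Euler-integrate the all-to-all Kuramoto model
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i),
    returning the order parameter r(t) = |mean(exp(i*theta))|."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 0.5, N)       # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, N)  # random initial phases
    r = []
    for _ in range(T):
        z = np.exp(1j * theta).mean()     # complex order parameter
        r.append(abs(z))
        # mean-field form of the coupling: K * r * sin(psi - theta_i)
        theta += dt * (omega + K * abs(z) * np.sin(np.angle(z) - theta))
    return r
```

With coupling K well above the critical value, the order parameter climbs from near zero (incoherent random phases) toward one (phase synchronization).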
  3. Progress in computational neuroscience toward understanding brain function is challenged both by the complexity of molecular-scale electrochemical interactions at the level of individual neurons and synapses, and by the high dimensionality of network dynamics across the brain, which spans a vast range of spatial and temporal scales. Our work abstracts an existing highly detailed, biophysically realistic 3D reaction-diffusion model of a chemical synapse into a compact internal state-space representation that maps onto parallel neuromorphic hardware for efficient emulation at very large scale. The abstraction offers near-equivalence in input-output dynamics while preserving biologically interpretable, tunable parameters.
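    The kind of abstraction described, replacing a detailed reaction-diffusion model with a compact internal state vector and a cheap per-step update, can be sketched generically as a discrete-time linear state-space model. The matrices here are illustrative placeholders, not the paper's fitted parameters:

```python
import numpy as np

def make_step(A, B, C):
    """Discrete-time state-space update x' = A x + B u, y = C x'.
    One such compact update per synapse is cheap enough to emulate at scale."""
    def step(x, u):
        x_next = A @ x + B * u
        y = float(C @ x_next)
        return x_next, y
    return step

# Illustrative 2-state synapse: a fast and a slow decaying component
A = np.diag([0.90, 0.99])   # per-step decay of each internal state
B = np.array([1.0, 0.2])    # how an input spike loads each state
C = np.array([1.0, 1.0])    # readout: summed contribution
step = make_step(A, B, C)

x = np.zeros(2)
x, y = step(x, 1.0)         # deliver one input spike
```

The tunable entries of A, B, and C play the role of the biologically interpretable parameters the abstract mentions: each maps to a decay rate or coupling strength rather than an opaque weight.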
  4. Studies of sensory-evoked neuronal responses often focus on mean spike rates, with fluctuations treated as internally generated noise. However, fluctuations of spontaneous activity, often organized as traveling waves, shape stimulus-evoked responses and perceptual sensitivity. The mechanisms underlying these waves are unknown. Further, it is unclear whether waves are consistent with the low-rate and weakly correlated “asynchronous-irregular” dynamics observed in cortical recordings. Here, we describe a large-scale computational model with topographically organized connectivity and conduction delays relevant to biological scales. We find that spontaneous traveling waves are a general property of these networks. The traveling waves that occur in the model are sparse, with only a small fraction of neurons participating in any individual wave. Consequently, they do not induce measurable spike correlations and remain consistent with locally asynchronous irregular states. Further, by modulating local network state, they can shape responses to incoming inputs as observed in vivo.
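    A toy illustration of how conduction delays alone can produce a sparse traveling wave front (purely schematic; the actual model is a large spiking network):

```python
def propagate(N=20, delay=3, steps=30):
    """Toy traveling wave: a ring of N cells in which cell i fires `delay`
    time steps after its left neighbour, mimicking axonal conduction delay.
    Returns the list of (time, cell) firing events. Only one cell fires
    per time point, so participation in the wave is sparse."""
    events = [(0, 0)]                  # seed: cell 0 fires at t = 0
    fire_time = {0: 0}
    for i in range(1, N):
        t = fire_time[i - 1] + delay   # conduction delay from neighbour
        if t > steps:
            break
        fire_time[i] = t
        events.append((t, i))
    return events
```

The wave front advances at one cell per `delay` steps, so slower conduction yields a slower wave, consistent with delays setting the wave speed.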
  5. Replay is the reactivation of one or more neural patterns that are similar to the activation patterns experienced during past waking experiences. Replay was first observed in biological neural networks during sleep, and it is now thought to play a critical role in memory formation, retrieval, and consolidation. Replay-like mechanisms have been incorporated in deep artificial neural networks that learn over time to avoid catastrophic forgetting of previous knowledge. Replay algorithms have been successfully used in a wide range of deep learning methods within supervised, unsupervised, and reinforcement learning paradigms. In this letter, we provide the first comprehensive comparison between replay in the mammalian brain and replay in artificial neural networks. We identify multiple aspects of biological replay that are missing in deep learning systems and hypothesize how they could be used to improve artificial neural networks.
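    A minimal experience-replay buffer of the kind used in deep reinforcement learning to interleave old experiences with new ones (a generic sketch, not any specific algorithm from the works compared here):

```python
import random

class ReplayBuffer:
    """Fixed-capacity buffer: new experiences evict the oldest, and
    training batches are sampled uniformly from what remains, so past
    experiences keep being 'replayed' alongside recent ones."""
    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.storage = []
        self.rng = random.Random(seed)

    def add(self, experience):
        if len(self.storage) >= self.capacity:
            self.storage.pop(0)           # drop the oldest experience
        self.storage.append(experience)

    def sample(self, batch_size):
        return self.rng.sample(self.storage, batch_size)

buf = ReplayBuffer(capacity=100)
for step in range(250):
    buf.add(("state", step))              # placeholder transitions
batch = buf.sample(8)
```

Uniform sampling is the simplest policy; biological replay, by contrast, is selective and temporally structured, which is one of the gaps the comparison highlights.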
  6. Nervous systems sense, communicate, compute, and actuate movement using distributed components with severe trade-offs in speed, accuracy, sparsity, noise, and saturation. Nevertheless, brains achieve remarkably fast, accurate, and robust control performance due to a highly effective layered control architecture. Here, we introduce a driving task to study how a mountain biker mitigates the immediate disturbance of trail bumps and responds to changes in trail direction. We manipulated the time delays and accuracy of the control input from the wheel as a surrogate for manipulating the characteristics of neurons in the control loop. The observed speed–accuracy trade-offs motivated a theoretical framework consisting of two layers of control loops—a fast, but inaccurate, reflexive layer that corrects for bumps and a slow, but accurate, planning layer that computes the trajectory to follow—each with components having diverse speeds and accuracies within each physical level, such as nerve bundles containing axons with a wide range of sizes. Our model explains why the errors from two control loops are additive and shows how the errors in each control loop can be decomposed into the errors caused by the limited speeds and accuracies of the components. These results demonstrate that an appropriate diversity in the properties of neurons across layers helps to create “diversity-enabled sweet spots,” so that both fast and accurate control is achieved using slow or inaccurate components.
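    The additive error decomposition can be illustrated with a toy simulation: a slow-but-accurate planning layer whose error comes from delay, and a fast-but-coarse reflex layer whose error comes from quantization. All parameter values are illustrative:

```python
import math, random

def layered_errors(T=1000, plan_delay=20, reflex_quant=0.2, seed=1):
    """Toy two-layer controller: the planning layer tracks a slow trail
    curve with a delay, while the reflex layer cancels fast bumps up to a
    quantization error. Returns mean (plan, reflex, total) absolute errors."""
    rng = random.Random(seed)
    plan_err = reflex_err = total = 0.0
    for t in range(T):
        trail = math.sin(2 * math.pi * t / 200)              # slow trail curve
        delayed = math.sin(2 * math.pi * (t - plan_delay) / 200)
        e_plan = trail - delayed                             # error from delay
        bump = rng.uniform(-1, 1)                            # fast disturbance
        correction = round(bump / reflex_quant) * reflex_quant
        e_reflex = bump - correction                         # quantization error
        plan_err += abs(e_plan)
        reflex_err += abs(e_reflex)
        total += abs(e_plan + e_reflex)                      # the errors add
    return plan_err / T, reflex_err / T, total / T
```

Shrinking either the delay or the quantization step shrinks only that layer's contribution, mirroring the separate speed and accuracy limits of the two loops.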
  7.
    Long-term depression (LTD) of synaptic strength can take multiple forms and contributes to circuit remodeling and to memory encoding or erasure. The generic term LTD encompasses various induction pathways, including activation of NMDA, mGlu, or P2X receptors. However, the specific molecular mechanisms involved and their effects on synaptic physiology are still unclear. Here we compare how NMDAR- and P2XR-dependent LTD affect synaptic nanoscale organization and function in rodents. While both forms of LTD are associated with a loss and reorganization of synaptic AMPARs, only NMDAR-dependent LTD induction triggers a profound reorganization of PSD-95. This modification, which requires the autophagy machinery to remove the T19-phosphorylated form of PSD-95 from synapses, leads to an increase in AMPAR surface mobility. We demonstrate that these postsynaptic changes, which occur specifically during NMDAR-dependent LTD, result in increased short-term plasticity that improves the responsiveness of depressed synapses. Our results establish that P2XR- and NMDAR-mediated LTD are functionally distinct forms of synaptic depression.
  8. The prefrontal cortex encodes and stores numerous, often disparate, schemas and flexibly switches between them. Recent research on artificial neural networks trained by reinforcement learning has made it possible to model fundamental processes underlying schema encoding and storage. Yet how the brain is able to create new schemas while preserving and utilizing old schemas remains unclear. Here we propose a simple neural network framework that incorporates hierarchical gating to model the prefrontal cortex’s ability to flexibly encode and use multiple disparate schemas. We show how gating naturally leads to transfer learning and robust memory savings. We then show how neuropsychological impairments observed in patients with prefrontal damage are mimicked by lesions of our network. Our architecture, which we call DynaMoE, provides a fundamental framework for how the prefrontal cortex may handle the abundance of schemas necessary to navigate the real world.
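    Hard gating between experts, the core routing idea behind mixture-of-experts architectures such as DynaMoE, can be sketched as follows. The gate weights and toy experts are illustrative placeholders, not the trained network:

```python
import numpy as np

def moe_forward(x, gate_W, experts):
    """Hard-gated mixture-of-experts step: a gating network scores the
    input and routes it to a single expert, the way a hierarchical gate
    can switch between stored schemas without overwriting the others."""
    scores = gate_W @ x          # one score per expert
    k = int(np.argmax(scores))   # winner-take-all routing
    return k, experts[k](x)

# Two toy "schemas": one doubles the input, one negates it
experts = [lambda x: 2 * x, lambda x: -x]
gate_W = np.array([[1.0, 0.0],   # expert 0 favoured when x[0] is large
                   [0.0, 1.0]])  # expert 1 favoured when x[1] is large

k, y = moe_forward(np.array([3.0, 1.0]), gate_W, experts)
```

Because only the selected expert processes the input, learning a new schema can add or retrain one expert while leaving the others untouched, which is what yields the transfer and memory-savings behavior described above.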