Many controlled in vitro studies have demonstrated that postsynaptic responses to presynaptic spikes are not constant but depend on short-term synaptic plasticity (STP) and the detailed timing of presynaptic spikes. However, the effects of short-term plasticity (depression and facilitation) are not limited to short, subsecond timescales. The effects of STP also appear on long timescales, as changes in presynaptic firing rates lead to changes in steady-state synaptic transmission. Here, we examine the relationship between natural variations in presynaptic firing rates and spike transmission in vivo. Using large-scale spike recordings in awake male and female mice from the Allen Institute Neuropixels dataset, we first detect putative excitatory synaptic connections based on cross-correlations between the spike trains of millions of pairs of neurons. For the subset of pairs where a transient, excitatory effect was detected, we use a model-based approach to track fluctuations in synaptic efficacy and find that efficacy varies substantially on slow (∼1 min) timescales over the course of these recordings. For many connections, the efficacy fluctuations are correlated with fluctuations in the presynaptic firing rate. To understand the potential mechanisms underlying this relationship, we then model the detailed probability of postsynaptic spiking on a millisecond timescale, including both slow changes in postsynaptic excitability and monosynaptic inputs with short-term plasticity. The detailed model reproduces the slow efficacy fluctuations observed for many putative excitatory connections, suggesting that these fluctuations can be both directly predicted from the time-varying presynaptic firing rate and, at least partly, explained by the cumulative effects of STP.

SIGNIFICANCE STATEMENT: The firing rates of individual neurons naturally vary because of stimuli, movement, and brain state.
Models of synaptic transmission predict that these variations in firing rates should be accompanied by slow fluctuations in synaptic strength because of short-term depression and facilitation. Here, we characterize the magnitude and predictability of fluctuations in synaptic strength in vivo using large-scale spike recordings. For putative excitatory connections from a wide range of brain areas, we find that typical synaptic efficacy varies by as much as ∼70%, and in many cases the fluctuations are well described by models of short-term synaptic plasticity. These results highlight the dynamic nature of in vivo synaptic transmission and the interplay between synaptic strength and firing rates in awake animals.
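The depression and facilitation dynamics the abstract refers to are commonly captured by the Tsodyks-Markram formulation, in which per-spike efficacy is the product of a resource (depression) variable R and a utilization (facilitation) variable u. A minimal illustrative sketch follows; this is not the authors' fitted model, and the parameter values (U, tau_d, tau_f) are arbitrary placeholders:

```python
import math

def stp_efficacy(spike_times, U=0.2, tau_d=0.2, tau_f=0.5):
    """Tsodyks-Markram short-term plasticity (illustrative parameters).

    Returns the relative efficacy u*R at each presynaptic spike time (s).
    """
    R, u = 1.0, U          # resources available, utilization fraction
    last_t = None
    eff = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            R = 1.0 - (1.0 - R) * math.exp(-dt / tau_d)  # resources recover
            u = U + (u - U) * math.exp(-dt / tau_f)      # facilitation decays
        eff.append(u * R)      # efficacy of this spike
        R -= u * R             # deplete resources (depression)
        u += U * (1.0 - u)     # increment utilization (facilitation)
        last_t = t
    return eff
```

With these placeholder parameters, sustained increases in presynaptic rate shift the steady-state value of u*R, which is the mechanism proposed for the slow, rate-correlated efficacy fluctuations.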
LODeNNS: A Linearly-approximated and Optimized Dendrocentric Nearest Neighbor STDP
Realizing Hebbian plasticity in large-scale neuromorphic systems is essential for reconfiguring them for recognition tasks. Spike-timing-dependent plasticity (STDP), as a tool to this effect, has received considerable attention in recent times. This phenomenon encodes weight-update information as correlations between presynaptic and postsynaptic event times; as such, each synapse in a silicon neural network must somehow keep its own time. We present a biologically plausible and optimized Register Transfer Level (RTL) and algorithmic approach to Nearest-Neighbor STDP with time management handled by the postsynaptic dendrite. We adopt a time-constant-based ramp approximation for ease of RTL implementation and incorporation into large-scale digital neuromorphic systems.
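The ramp approximation mentioned above can be sketched in software before committing to RTL. A minimal model, assuming the ramp simply replaces the exponential STDP window (the function names and the time constant value are illustrative, not taken from the paper):

```python
import math

def exp_trace(dt, tau):
    """Exact exponential STDP trace, the usual reference curve."""
    return math.exp(-dt / tau)

def ramp_trace(dt, tau):
    """Linear ramp approximation: decays to zero at dt = tau.
    In RTL this needs only a counter and a comparator, no exponential."""
    return max(0.0, 1.0 - dt / tau)

def nn_weight_update(dt, tau=0.020, a_plus=0.01, a_minus=0.012):
    """Nearest-neighbor pair update using the ramp trace.
    dt > 0: pre precedes post (potentiate); dt < 0: post precedes pre."""
    if dt >= 0:
        return a_plus * ramp_trace(dt, tau)
    return -a_minus * ramp_trace(-dt, tau)
```

Because the ramp reaches zero at exactly one time constant, spike pairs separated by more than tau contribute nothing, which also bounds the timer width each dendrite must keep.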
- Award ID(s):
- 1824198
- PAR ID:
- 10376848
- Date Published:
- Journal Name:
- ICONS '22: Proceedings of the International Conference on Neuromorphic Systems 2022
- Page Range / eLocation ID:
- 1 to 8
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
The Artificial Intelligence (AI) disruption continues unabated, albeit at extreme compute requirements. Neuromorphic circuits and systems offer a panacea for this extravagance. To this effect, event-based learning such as spike-timing-dependent plasticity (STDP) in spiking neural networks (SNNs) is an active area of research. Hebbian learning in SNNs fundamentally involves synaptic weight updates based on temporal correlations between pre- and postsynaptic neural activities. While there are broadly two approaches to realizing STDP, i.e., All-to-All versus Nearest Neighbor (NN), there exist strong arguments favoring the NN approach on the biological-plausibility front. In this paper, we present a novel current-mode implementation of a postsynaptic event-based NN STDP synapse. We leverage transistor subthreshold dynamics to generate exponential STDP traces using repurposed log-domain low-pass filter circuits. Synaptic weight operations involving addition and multiplication are achieved by Kirchhoff's current law and the translinear principle, respectively. Simulation results from the NCSU TSMC 180 nm technology are presented. Finally, the ideas presented here hold implications for engineering efficient hardware to meet growing AI training and inference demands.
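The behavior of the exponential traces produced by the log-domain filters can be modeled at the event level. A hedged behavioral sketch (class and parameter names are illustrative; the actual circuit values are not in the abstract), using the nearest-neighbor convention that a spike resets its trace rather than accumulating:

```python
import math

class STDPTraces:
    """Event-driven pre/post exponential traces, a software stand-in for the
    log-domain low-pass filter circuits described in the paper."""

    def __init__(self, tau_pre=0.02, tau_post=0.02):
        self.tau_pre, self.tau_post = tau_pre, tau_post
        self.x_pre = self.x_post = 0.0
        self.t = 0.0

    def _decay_to(self, t):
        dt = t - self.t
        self.x_pre *= math.exp(-dt / self.tau_pre)
        self.x_post *= math.exp(-dt / self.tau_post)
        self.t = t

    def pre_spike(self, t, w, a_minus=0.012):
        self._decay_to(t)
        w -= a_minus * self.x_post   # depress by recent post activity
        self.x_pre = 1.0             # nearest-neighbor: reset, don't accumulate
        return w

    def post_spike(self, t, w, a_plus=0.01):
        self._decay_to(t)
        w += a_plus * self.x_pre     # potentiate by recent pre activity
        self.x_post = 1.0
        return w
```

In the analog design, the multiply in each update would come from the translinear principle and the sum from Kirchhoff's current law; here both are ordinary arithmetic.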
-
Abstract: Spiking neural networks (SNNs) in future neuromorphic architectures require hardware devices that are not only capable of emulating fundamental functionalities of the biological synapse, such as spike-timing-dependent plasticity (STDP) and spike-rate-dependent plasticity (SRDP), but also biodegradable, to address the current ecological challenge of electronic waste. Among different device technologies and materials, memristive synaptic devices based on natural organic materials have emerged as a favourable candidate to meet these demands. The metal-insulator-metal structure is analogous to a biological synapse, with low power consumption, fast switching speed, and emulation of synaptic plasticity, while natural organic materials are water-soluble, renewable, and environmentally friendly. In this study, the potential of a memristor based on a natural organic material, honey, for SNNs was demonstrated. The device exhibited forming-free bipolar resistive switching, a high switching speed with a 100 ns set time and 500 ns reset time, STDP and SRDP learning behaviours, and dissolution in water. Intuitive conduction models for STDP and SRDP were proposed. These results testify that honey-based memristive synaptic devices are promising for SNN implementation in green electronics and biodegradable neuromorphic systems.
-
Abstract: Spike-timing-dependent plasticity (STDP) is an unsupervised learning mechanism for spiking neural networks that has received significant attention from the neuromorphic hardware community. However, scaling such local learning techniques to deeper networks and large-scale tasks has remained elusive. In this work, we investigate a Deep-STDP framework where a rate-based convolutional network, deployable in a neuromorphic setting, is trained in tandem with pseudo-labels generated by the STDP clustering process on the network outputs. We achieve 24.56% higher accuracy and 3.5× faster convergence speed at iso-accuracy on a 10-class subset of the Tiny ImageNet dataset in contrast to a k-means clustering approach.
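The pseudo-labelling mechanics in the abstract above can be illustrated with a generic clustering pass over network outputs. Note the substitution: this sketch uses plain Lloyd k-means in place of the paper's STDP clustering (which is the very baseline the paper improves on), purely to show how cluster ids become training targets; all names are hypothetical:

```python
import numpy as np

def pseudo_label_epoch(features, n_clusters, rng):
    """Assign pseudo-labels to feature vectors by clustering them.

    Stand-in for one round of the Deep-STDP loop: cluster the network's
    output features, then reuse the cluster ids as classification targets.
    Uses plain Lloyd k-means, NOT the paper's STDP clustering.
    """
    # Initialize centers from random distinct samples.
    centers = features[rng.choice(len(features), n_clusters, replace=False)]
    labels = np.zeros(len(features), dtype=int)
    for _ in range(10):                      # a few Lloyd iterations suffice here
        dists = np.linalg.norm(features[:, None] - centers[None], axis=-1)
        labels = dists.argmin(axis=1)        # nearest-center assignment
        for k in range(n_clusters):
            members = features[labels == k]
            if len(members):                 # skip empty clusters
                centers[k] = members.mean(axis=0)
    return labels
```

In the full framework these labels would supervise the rate-based convolutional network for an epoch, after which the features are re-clustered and the labels refreshed.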