We report the first experimental demonstration of ferroelectric field-effect transistor (FEFET) based spiking neurons. A unique feature of the ferroelectric (FE) neuron demonstrated herein is the availability of both excitatory and inhibitory input connections in the compact 1T-1FEFET structure, which has not previously been reported for any neuron implementation. Such dual neuron functionality is a key requirement for bio-mimetic neural networks and represents a breakthrough for implementing third-generation spiking neural networks (SNNs); we also report, for the first time, such an SNN performing unsupervised learning and clustering on real-world data. The key to our demonstration is the careful design of two important device-level features: (1) abrupt hysteretic transitions of the FEFET with no stable states in between, and (2) dynamic tunability of the FEFET hysteresis by bias conditions, which enables the inhibition functionality. Experimentally calibrated, multi-domain Preisach-based FEFET models were used to accurately simulate the FE neurons and project their performance at scaled nodes. We also implement an SNN for unsupervised clustering, benchmark the network performance against analog CMOS and emerging technologies, and observe (1) unification of excitatory and inhibitory neural connections, (2) STDP-based learning, (3) the lowest reported power (3.6 nW) during classification, and (4) a classification accuracy of 93%.
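As a rough illustration of the dual excitatory/inhibitory behavior described above, the sketch below models a simple accumulate-and-fire neuron driven by both excitatory and inhibitory spike trains, with an abrupt fire-and-reset transition standing in for the FEFET's hysteretic switching. This is a behavioral toy model only; all parameter names and values are illustrative assumptions, not the calibrated device model used in the work.

```python
import numpy as np

def fefet_like_neuron(exc_spikes, inh_spikes, v_th=1.0, v_reset=0.2,
                      w_exc=0.10, w_inh=0.15, leak=0.99):
    """Accumulate-and-fire behavior with excitatory and inhibitory inputs.
    exc_spikes, inh_spikes: binary arrays of shape (T,). Illustrative only."""
    v, out = 0.0, []
    for e, i in zip(exc_spikes, inh_spikes):
        v = leak * v + w_exc * e - w_inh * i   # integrate both input polarities
        if v >= v_th:                          # abrupt transition: fire
            out.append(1)
            v = v_reset                        # reset well below threshold
        else:
            out.append(0)
    return np.array(out)

# toy usage: dense excitatory drive, sparse inhibition
rng = np.random.default_rng(0)
spikes = fefet_like_neuron(rng.random(200) < 0.30, rng.random(200) < 0.05)
print(int(spikes.sum()), "output spikes in 200 steps")
```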
Spiking Neural Networks with Laterally-Inhibited Self-Recurrent Units
In biological brains, recurrent connections play a crucial role in cortical computation, modulation of network dynamics, and communication. However, in recurrent spiking neural networks (SNNs), recurrence is mostly introduced through random connections. How excitatory and inhibitory recurrent connections affect network responses, and which kinds of connectivity benefit learning performance, remains unclear. In this work, we propose a novel recurrent structure called the Laterally-Inhibited Self-Recurrent Unit (LISR), which consists of one excitatory neuron with a self-recurrent connection wired together with an inhibitory neuron through excitatory and inhibitory synapses. The self-recurrent connection of the excitatory neuron mitigates the information loss caused by the firing-and-resetting mechanism and maintains long-term neuronal memory. The lateral inhibition from the inhibitory neuron to the corresponding excitatory neuron, on the one hand, adjusts the firing activity of the latter; on the other hand, it acts as a forget gate to clear the memory of the excitatory neuron. On speech and image datasets commonly used in neuromorphic computing, RSNNs built from the proposed LISR improve performance by up to 9.26% over feedforward SNNs trained with a state-of-the-art backpropagation method at similar computational cost.
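The wiring described above can be written down as a minimal discrete-time sketch, assuming leaky integrate-and-fire dynamics. The weights, leak factor, and threshold below are illustrative assumptions, not the values or training setup from the paper.

```python
import numpy as np

def lisr_step(v_e, v_i, x, w_self=0.3, w_ei=0.4, w_ie=0.6, leak=0.9, v_th=1.0):
    """One discrete time step of a Laterally-Inhibited Self-Recurrent unit (sketch).
    v_e, v_i: membrane potentials of the excitatory / inhibitory neurons.
    x: external input current at this step (already weighted)."""
    s_e = float(v_e >= v_th)   # excitatory spike emitted from the previous state
    s_i = float(v_i >= v_th)   # inhibitory spike
    # Excitatory neuron: firing-and-resetting (the (1 - s_e) term), plus a
    # self-recurrent term that carries memory across the reset; lateral
    # inhibition from the inhibitory partner acts like a forget gate.
    v_e = leak * v_e * (1 - s_e) + x + w_self * s_e - w_ie * s_i
    # Inhibitory neuron is driven by the excitatory neuron's spikes.
    v_i = leak * v_i * (1 - s_i) + w_ei * s_e
    return v_e, v_i, s_e

# toy usage: drive one unit with random input current for 50 steps
rng = np.random.default_rng(1)
v_e = v_i = 0.0
for t in range(50):
    v_e, v_i, spike = lisr_step(v_e, v_i, x=0.4 * rng.random())
```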
- Award ID(s): 1948201
- PAR ID: 10290847
- Date Published:
- Journal Name: 2021 International Joint Conference on Neural Networks
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
As an important class of spiking neural networks (SNNs), recurrent spiking neural networks (RSNNs) possess great computational power and have been widely used for processing sequential data such as audio and text. However, most RSNNs suffer from two problems. First, due to the lack of architectural guidance, random recurrent connectivity is often adopted, which does not guarantee good performance. Second, training RSNNs is in general challenging, bottlenecking achievable model accuracy. To address these problems, we propose a new type of RSNN, skip-connected self-recurrent SNNs (ScSr-SNNs). Recurrence in ScSr-SNNs is introduced by adding self-recurrent connections to spiking neurons. SNNs with self-recurrent connections can realize recurrent behaviors similar to those of more complex RSNNs, while the error gradients can be more straightforwardly calculated thanks to the mostly feedforward nature of the network. The network dynamics are further enriched by skip connections between nonadjacent layers. Moreover, we propose a new backpropagation (BP) method, backpropagated intrinsic plasticity (BIP), to further boost the performance of ScSr-SNNs by training intrinsic model parameters. Unlike standard intrinsic plasticity rules that adjust a neuron's intrinsic parameters according to neuronal activity, the proposed BIP method optimizes intrinsic parameters based on the backpropagated error gradient of a well-defined global loss function, in addition to synaptic weight training. On challenging speech, neuromorphic speech, and neuromorphic image datasets, the proposed ScSr-SNNs boost performance by up to 2.85% compared with other types of RSNNs trained by state-of-the-art BP methods.
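The self-recurrent mechanism lends itself to a compact sketch. The PyTorch-style layer below gives each neuron a single per-neuron recurrent weight, in the spirit of the ScSr idea; it is an illustrative sketch only (the surrogate gradient needed for BP training and the BIP rule are omitted), and the class and parameter names are assumptions rather than the authors' code.

```python
import torch
import torch.nn as nn

class SelfRecurrentLIF(nn.Module):
    """Spiking layer in which every neuron carries only a self-recurrent
    connection (a per-neuron recurrent weight), sketching the ScSr idea."""
    def __init__(self, in_dim, out_dim, leak=0.9, v_th=1.0):
        super().__init__()
        self.fc = nn.Linear(in_dim, out_dim)
        self.w_self = nn.Parameter(0.1 * torch.ones(out_dim))  # diagonal recurrence
        self.leak, self.v_th = leak, v_th

    def forward(self, x_seq):                        # x_seq: (T, batch, in_dim)
        T, B, _ = x_seq.shape
        v = x_seq.new_zeros(B, self.fc.out_features)
        s = torch.zeros_like(v)
        outputs = []
        for t in range(T):
            # reset fired neurons, integrate new input, add self-recurrent drive
            v = self.leak * v * (1 - s) + self.fc(x_seq[t]) + self.w_self * s
            s = (v >= self.v_th).float()             # surrogate gradient omitted here
            outputs.append(s)
        return torch.stack(outputs)

# Skip connections (the "Sc" part) would then be formed by summing this layer's
# spike output into the input current of a non-adjacent downstream layer.
layer = SelfRecurrentLIF(100, 64)
spikes = layer((torch.rand(20, 4, 100) < 0.2).float())
```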
The notion that a neuron transmits the same set of neurotransmitters at all of its post-synaptic connections, typically known as Dale's law, is well supported throughout the majority of the brain and is assumed in almost all theoretical studies investigating the mechanisms of computation in neuronal networks. Dale's law has numerous functional implications in fundamental sensory processing and decision-making tasks, and it plays a key role in the current understanding of the structure-function relationship in the brain. However, since exceptions to Dale's law have been discovered for certain neurons, and because other biological systems with complex network structure incorporate individual units that send both positive and negative feedback signals, we investigate the functional implications of network model dynamics that violate Dale's law by allowing each neuron to send out both excitatory and inhibitory signals to its neighbors. We show how balanced network dynamics, in which large excitatory and inhibitory inputs are dynamically adjusted such that input fluctuations produce irregular firing events, are theoretically preserved for a single population of neurons violating Dale's law. We further leverage this single-population network model in the context of two competing pools of neurons to demonstrate that effective decision-making dynamics are also produced, agreeing with experimental observations of honeybees selecting a food source and with artificial neural networks trained for optimal selection. Through direct comparison with the classical two-population balanced neuronal network, we argue that the one-population network exhibits more robust balanced activity for systems with fewer computational units, such as honeybee colonies, whereas the two-population network responds more rapidly to temporal variations in network inputs, as required by the brain. We expect this study will shed light on the role of neurons violating Dale's law found in experiments, as well as on shared design principles across biological systems that perform complex computations.
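A minimal simulation of the one-population idea is sketched below: each neuron's outgoing weights are drawn with mixed signs, violating Dale's law, and the network is driven in a balanced regime with 1/sqrt(K) coupling. All sizes, time constants, and drive values are illustrative assumptions, not the parameters analyzed in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p, T, dt, tau = 400, 0.1, 2000, 1e-3, 0.02   # neurons, conn. prob., steps, step, membrane tau
K = p * N                                        # expected in-degree

# One population violating Dale's law: each neuron projects both positive and
# negative weights, scaled by 1/sqrt(K) as in balanced-network theory.
mask = rng.random((N, N)) < p
J = mask * rng.choice([1.0, -1.0], size=(N, N)) / np.sqrt(K)
np.fill_diagonal(J, 0.0)

v = np.zeros(N)
spikes = np.zeros((T, N))
ext = 1.2                                        # supra-threshold constant external drive
for t in range(1, T):
    v += dt / tau * (ext - v) + J @ spikes[t - 1]  # leak, drive, and recurrent kicks
    fired = v >= 1.0
    spikes[t] = fired
    v[fired] = 0.0                               # fire-and-reset

print("mean firing rate (Hz):", spikes.mean() / dt)
```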
A novel high-fan-in differential superconductor neuron structure designed for ultra-high-performance spiking neural network (SNN) accelerators is presented. Utilizing a high-fan-in neuron structure allows us to design SNN accelerators with more synaptic connections, enhancing the overall network capabilities. The proposed neuron design is based on a superconductor electronics fabric incorporating multiple superconducting loops, each with two Josephson junctions. This arrangement enables each input data branch to have positive and negative inductive coupling, supporting excitatory and inhibitory synaptic data. Compatibility with synaptic devices and the thresholding operation is achieved using a single-flux-quantum pulse-based logic style. The neuron design, along with ternary synaptic connections, forms the foundation for superconductor-based SNN inference. To demonstrate the capabilities of our design, we train the SNN using snnTorch, which augments the PyTorch framework. After pruning, the demonstrated SNN inference achieves an impressive 96.1% accuracy on MNIST images. Notably, the network exhibits a remarkable throughput of 8.92 GHz while consuming only 1.5 nJ per inference, including the energy consumption associated with cooling to 4 K. These results underscore the potential of superconductor electronics for developing high-performance and ultra-energy-efficient neural network accelerator architectures.
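Two ingredients of this design, ternary synaptic weights and positive/negative (excitatory/inhibitory) coupling into a thresholding neuron, can be sketched in software. The snippet below uses plain PyTorch rather than snnTorch, and every function name and constant in it is an illustrative assumption rather than the paper's flow.

```python
import torch

def ternarize(w, sparsity=0.5):
    """Map trained weights to {-1, 0, +1}: small magnitudes are pruned to 0,
    with the threshold chosen from an assumed sparsity target."""
    thresh = w.abs().flatten().kthvalue(int(sparsity * w.numel())).values
    return torch.where(w.abs() <= thresh, torch.zeros_like(w), torch.sign(w))

def lif_inference(x_seq, w_ternary, leak=0.9, v_th=1.0):
    """Run a single ternary-weight spiking layer over an input spike train
    x_seq of shape (T, in_dim)."""
    v = torch.zeros(w_ternary.shape[0])
    out = []
    for x in x_seq:
        v = leak * v + w_ternary @ x   # +1 couplings excite, -1 couplings inhibit
        s = (v >= v_th).float()        # thresholding operation
        v = v * (1 - s)                # reset after firing
        out.append(s)
    return torch.stack(out)

# toy usage on random weights and a random input spike train
w = ternarize(torch.randn(64, 128))
spikes = lif_inference((torch.rand(25, 128) < 0.2).float(), w)
```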
Spiking neural networks (SNNs) are well suited for spatio-temporal learning and for implementation on energy-efficient, event-driven neuromorphic processors. However, existing SNN error backpropagation (BP) methods lack proper handling of spiking discontinuities and suffer from low performance compared with BP methods for traditional artificial neural networks. In addition, a large number of time steps are typically required to achieve decent performance, leading to high latency and rendering spike-based computation unscalable to deep architectures. We present a novel Temporal Spike Sequence Learning Backpropagation (TSSL-BP) method for training deep SNNs, which breaks down error backpropagation across two types of inter-neuron and intra-neuron dependencies and leads to improved temporal learning precision. It captures inter-neuron dependencies through presynaptic firing times by considering the all-or-none characteristics of firing activities, and captures intra-neuron dependencies by handling the internal evolution of each neuronal state over time. TSSL-BP efficiently trains deep SNNs within a much shortened temporal window of a few steps while improving accuracy on various image classification datasets, including CIFAR10.
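The core difficulty TSSL-BP addresses, backpropagating through the all-or-none spiking nonlinearity, is commonly worked around with a surrogate derivative. The sketch below shows that generic workaround in PyTorch; it is not the paper's exact inter-/intra-neuron decomposition, and the threshold and window constants are illustrative assumptions.

```python
import torch

V_TH, WIDTH = 1.0, 0.5   # threshold and surrogate window (illustrative values)

class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate derivative
    in the backward pass; a generic workaround for the spiking discontinuity,
    not the specific decomposition used by TSSL-BP."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= V_TH).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # pass gradient only for membrane potentials near the threshold
        surrogate = ((v - V_TH).abs() < WIDTH).float() / (2 * WIDTH)
        return grad_out * surrogate

# toy usage: gradients flow through the thresholding despite the discontinuity
v = torch.randn(8, requires_grad=True)
SpikeFn.apply(v).sum().backward()
print(v.grad)
```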