Resistive switching is a promising technology for artificial synapses, the most critical components and building blocks of neural networks for brain-inspired neuromorphic computing. An artificial synapse is capable of emulating the signal-processing and memory functions of biological synapses. Artificial synapses fabricated from natural bioorganic materials are essential for developing soft, flexible, and biocompatible electronics and sustainable, biodegradable, and environmentally friendly neuromorphic systems. In this work, a resistive switching device based on a natural biomaterial, honey, was demonstrated to emulate several important functionalities of biological synapses, including synaptic potentiation and depression, short-term and long-term memory, spatial summation, and shunting inhibition. The results indicate the potential of honey-based resistive switching for artificial synaptic devices in renewable neuromorphic systems and bioelectronics.
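To make the listed synaptic behaviors concrete, below is a minimal sketch of the phenomenological pulse-driven potentiation/depression model commonly used to describe memristive artificial synapses. The conductance bounds, update magnitude, nonlinearity factor, and pulse counts are illustrative assumptions, not values reported for the honey-based device.

```python
# Hedged sketch: generic pulse-driven potentiation/depression of a memristive
# synapse. All constants below are illustrative assumptions.
import numpy as np

G_MIN, G_MAX = 1e-6, 1e-4      # assumed conductance bounds (S)
ALPHA, BETA = 5e-6, 3.0        # assumed update magnitude and nonlinearity factor

def potentiate(g):
    """One potentiation pulse: the increment shrinks as g approaches G_MAX."""
    return min(G_MAX, g + ALPHA * np.exp(-BETA * (g - G_MIN) / (G_MAX - G_MIN)))

def depress(g):
    """One depression pulse: the decrement shrinks as g approaches G_MIN."""
    return max(G_MIN, g - ALPHA * np.exp(-BETA * (G_MAX - g) / (G_MAX - G_MIN)))

g = G_MIN
trace = [g]
for _ in range(50):            # 50 potentiating pulses
    g = potentiate(g)
    trace.append(g)
for _ in range(50):            # 50 depressing pulses
    g = depress(g)
    trace.append(g)

print(f"conductance after 50 potentiating pulses: {trace[50]:.2e} S")
print(f"conductance after 50 depressing pulses:  {trace[-1]:.2e} S")
```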
On-device synaptic memory consolidation using Fowler-Nordheim quantum-tunneling
Introduction: For artificial synapses whose strengths are assumed to be bounded and can only be updated with finite precision, achieving optimal memory consolidation using primitives from classical physics leads to synaptic models that are too complex to be scaled in silico. Here we report that a relatively simple differential device that operates using the physics of Fowler-Nordheim (FN) quantum-mechanical tunneling can achieve tunable memory consolidation characteristics with different plasticity-stability trade-offs.
Methods: A prototype FN-synapse array was fabricated in a standard silicon process and used both to verify the optimal memory consolidation characteristics and to estimate the parameters of an FN-synapse analytical model. The analytical model was then used for large-scale memory consolidation and continual learning experiments.
Results: We show that, compared to other physical implementations of synapses for memory consolidation, the operation of the FN-synapse is near-optimal in terms of synaptic lifetime and consolidation properties. We also demonstrate that a network comprising FN-synapses outperforms a comparable elastic weight consolidation (EWC) network on some benchmark continual learning tasks.
Discussion: With an energy footprint of femtojoules per synaptic update, we believe the proposed FN-synapse provides an ultra-energy-efficient approach for implementing both synaptic memory consolidation and continual learning on a physical device.
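As a rough illustration of the device physics named in the abstract, the sketch below numerically integrates two floating-gate nodes whose voltages decay through Fowler-Nordheim tunneling (with the tunneling current taken as proportional to V^2 exp(-k/V)) and reads the synaptic weight as their difference. All constants, the Euler time step, and the single-update perturbation are illustrative assumptions, not parameters of the fabricated FN-synapse array or its analytical model.

```python
# Hedged sketch: a differential pair of floating-gate nodes discharging through
# Fowler-Nordheim tunneling (I ~ V^2 * exp(-k/V)); the synaptic weight is the
# difference of the two node voltages. Constants are illustrative assumptions.
import numpy as np

K1, K2 = 1e-3, 30.0            # assumed tunneling prefactor and barrier constant
DT = 1.0                       # Euler time step (s)

def fn_decay(v, dt=DT):
    """One Euler step of FN-tunneling discharge of a floating-gate voltage."""
    return v - dt * K1 * v**2 * np.exp(-K2 / v)

v_plus, v_minus = 8.0, 8.0     # assumed initial node voltages (V)
v_plus += 0.05                 # a single potentiation event perturbs one node

weights = []
for _ in range(100_000):       # let both nodes decay for ~1e5 s
    v_plus, v_minus = fn_decay(v_plus), fn_decay(v_minus)
    weights.append(v_plus - v_minus)

# The differential weight shrinks slowly while the common-mode voltage decays,
# qualitatively illustrating the plasticity-stability trade-off discussed above.
print(f"weight right after update: {weights[0]:.4e} V")
print(f"weight after ~1e5 s:       {weights[-1]:.4e} V")
```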
- PAR ID: 10438311
- Date Published:
- Journal Name: Frontiers in Neuroscience
- Volume: 16
- ISSN: 1662-453X
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Anomaly detection in real time using autoencoders implemented on edge devices is exceedingly challenging due to limited hardware, energy, and computational resources. We show that these limitations can be addressed by designing an autoencoder with low-resolution non-volatile memory-based synapses and employing an effective quantized neural network learning algorithm. We further propose nanoscale ferromagnetic racetracks with engineered notches hosting magnetic domain walls (DW) as exemplary non-volatile memory-based autoencoder synapses, where limited-state (5-state) synaptic weights are manipulated by spin-orbit torque (SOT) current pulses to write different magnetoresistance states. The anomaly detection performance of the proposed autoencoder model is evaluated on the NSL-KDD dataset. Training of the autoencoder that is aware of the limited resolution and of DW device stochasticity yields anomaly detection performance comparable to that of an autoencoder with floating-point-precision weights. While the limited number of quantized states and the inherent stochastic nature of DW synaptic weights in nanoscale devices are typically known to degrade performance, our hardware-aware training algorithm is shown to leverage these imperfect device characteristics to improve anomaly detection accuracy (90.98%) relative to the accuracy obtained with extremely memory-intensive floating-point synaptic weights. Furthermore, our DW-based approach demonstrates a reduction of at least three orders of magnitude in weight updates during training compared to the floating-point approach, implying a significant reduction in operating energy. This work could stimulate the development of extremely energy-efficient non-volatile multi-state synapse-based processors that can perform real-time training and inference on the edge with unsupervised data. (A hedged sketch of 5-level stochastic weight quantization appears after this list.)
- Variation in the strength of synapses can be quantified by measuring their anatomical properties. Quantifying the precision of synaptic plasticity is fundamental to understanding information storage and retrieval in neural circuits. Synapses from the same axon onto the same dendrite share a common history of coactivation, making them ideal candidates for determining the precision of synaptic plasticity based on the similarity of their physical dimensions. Here, the precision and amount of information stored in synapse dimensions were quantified with Shannon information theory, expanding a prior analysis that used signal detection theory (Bartol et al., 2015). The two methods were compared using dendritic spine head volumes in the middle of the stratum radiatum of hippocampal area CA1 as well-defined measures of synaptic strength. Information theory delineated the number of distinguishable synaptic strengths based on nonoverlapping bins of dendritic spine head volumes. Shannon entropy was applied to measure synaptic information storage capacity (SISC), yielding a lower bound of 4.1 bits and an upper bound of 4.59 bits of information based on 24 distinguishable sizes. We further compared the distribution of distinguishable sizes with a uniform distribution using Kullback-Leibler divergence and discovered a nearly uniform distribution of spine head volumes across the sizes, suggesting optimal use of the distinguishable values. Thus, SISC provides a new analytical measure that can be generalized to probe synaptic strengths and capacity for plasticity in different brain regions, in different species, and among animals raised in different conditions or during learning. How brain diseases and disorders affect the precision of synaptic plasticity can also be probed. (A hedged entropy/KL-divergence sketch appears after this list.)
- Optical data sensing, processing, and visual memory are fundamental requirements for artificial intelligence and robotics with autonomous navigation. Traditionally, imaging has been kept separate from the pattern-recognition circuitry. Optoelectronic synapses hold the special potential of integrating these two fields into a single layer, where a single device can record optical data, convert it into a conductance state, and store it for learning and pattern recognition, similar to the optic nerve in the human eye. In this work, the trapping and de-trapping of photogenerated carriers at the MoS2/SiO2 interface of an n-channel MoS2 transistor were employed to emulate optoelectronic synapse characteristics. The monolayer MoS2 field-effect transistor (FET) exhibits photo-induced short-term and long-term potentiation, electrically driven long-term depression, paired-pulse facilitation (PPF), and spike-timing-dependent plasticity, which are necessary synaptic characteristics. Moreover, the device's ability to retain its conductance state can be modulated by the gate voltage, making the device behave as a photodetector at positive gate voltages and as an optoelectronic synapse at negative gate voltages. (A hedged PPF-decay sketch appears after this list.)
- In neuromorphic computing, artificial synapses provide a multi-weight (MW) conductance state that is set based on inputs from neurons, analogous to the brain. Herein, artificial synapses based on magnetic materials that use a magnetic tunnel junction (MTJ) and a magnetic domain wall (DW) are explored. By fabricating lithographic notches in a DW track underneath a single MTJ, 3–5 stable resistance states that can be repeatably controlled electrically using spin-orbit torque are achieved. The effect of geometry on synapse behavior is explored, showing that a trapezoidal device has asymmetric weight updates with high controllability, while a rectangular device has higher stochasticity but stable resistance levels. The device data are input into neuromorphic computing simulators to show the usefulness of application-specific synaptic functions. In an artificial neural network (NN) applied to streamed Fashion-MNIST data, the trapezoidal magnetic synapse can be used as a metaplastic function for efficient online learning. In a convolutional NN for CIFAR-100 image recognition, the rectangular magnetic synapse achieves near-ideal inference accuracy, owing to the stability of its resistance levels. This work shows that MW magnetic synapses are a feasible technology for neuromorphic computing and provides design guidelines for emerging artificial synapse technologies. (A hedged multi-state synapse sketch appears after this list.)
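The first related record above trains an autoencoder with 5-state domain-wall synapses. A minimal sketch of one ingredient such hardware-aware training typically relies on, stochastic quantization of real-valued weights onto a small set of device levels, is given below; the level range, rounding scheme, and shadow-weight setup are illustrative assumptions, not the record's actual algorithm.

```python
# Hedged sketch: stochastic quantization of weights onto 5 device levels,
# mimicking limited-state, stochastic domain-wall synapses. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
LEVELS = np.linspace(-1.0, 1.0, 5)        # five stable magnetoresistance states (assumed range)

def quantize_stochastic(w):
    """Round each weight to a neighbouring level, up or down with probability
    proportional to its distance from the two levels (stochastic rounding)."""
    w = np.clip(w, LEVELS[0], LEVELS[-1])
    step = LEVELS[1] - LEVELS[0]
    pos = (w - LEVELS[0]) / step          # continuous index into the level grid
    lower = np.floor(pos).astype(int)
    frac = pos - lower
    up = rng.random(w.shape) < frac       # round up with probability = fractional part
    idx = np.clip(lower + up, 0, len(LEVELS) - 1)
    return LEVELS[idx]

# Full-precision "shadow" weights are updated by gradient descent, but only the
# quantized values would ever be written to the device and used for inference.
shadow_w = rng.normal(0.0, 0.3, size=(4, 8))
device_w = quantize_stochastic(shadow_w)
print("device levels in use:", np.unique(device_w))
```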
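The second related record quantifies synaptic information storage capacity with Shannon entropy over nonoverlapping bins of spine head volumes. The sketch below computes entropy and KL divergence from synthetic data in that spirit; the log-normal volumes and the fixed 24-bin rule are assumptions, and the paper's actual procedure for finding distinguishable sizes is not reproduced.

```python
# Hedged sketch: Shannon entropy and KL divergence over binned spine head
# volumes, in the spirit of the SISC measure. Synthetic, illustrative data.
import numpy as np

rng = np.random.default_rng(1)
volumes = rng.lognormal(mean=-2.0, sigma=0.8, size=1000)   # synthetic volumes (um^3)

N_BINS = 24                                # number of distinguishable sizes from the abstract
counts, _ = np.histogram(np.log(volumes), bins=N_BINS)
p = counts / counts.sum()

# Shannon entropy in bits = information per synapse if its size bin is read exactly.
nonzero = p[p > 0]
entropy_bits = -np.sum(nonzero * np.log2(nonzero))

# KL divergence from a uniform distribution over the occupied bins:
# 0 would mean the distinguishable sizes are used equally often.
q = np.full_like(nonzero, 1.0 / len(nonzero))
kl_bits = np.sum(nonzero * np.log2(nonzero / q))

print(f"entropy: {entropy_bits:.2f} bits (max {np.log2(N_BINS):.2f})")
print(f"KL divergence from uniform: {kl_bits:.3f} bits")
```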
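The third related record reports paired-pulse facilitation (PPF) in a MoS2 optoelectronic synapse. PPF data of this kind are commonly summarized by a double-exponential decay of the facilitation index with inter-pulse interval, sketched below with illustrative, not measured, constants.

```python
# Hedged sketch: PPF index vs. inter-pulse interval as a double-exponential
# decay. The amplitudes and time constants are illustrative assumptions.
import numpy as np

C1, C2 = 60.0, 30.0        # assumed fast and slow component amplitudes (%)
TAU1, TAU2 = 50.0, 500.0   # assumed fast and slow decay time constants (ms)

def ppf_index(dt_ms):
    """PPF index (%) = 100*(A2 - A1)/A1, modelled as C1*exp(-dt/tau1) + C2*exp(-dt/tau2)."""
    return C1 * np.exp(-dt_ms / TAU1) + C2 * np.exp(-dt_ms / TAU2)

for dt in (20, 100, 500, 2000):
    print(f"interval {dt:4d} ms -> PPF index {ppf_index(dt):5.1f} %")
```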
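The fourth related record contrasts trapezoidal and rectangular domain-wall tracks that yield a few stable, stochastically updated resistance states. The sketch below models such a multi-weight synapse as a small state machine with direction-dependent update probabilities; the state count matches the abstract's 3–5 range, but the probabilities are illustrative assumptions, not measured device behavior.

```python
# Hedged sketch: a discrete multi-weight synapse stepped between a few stable
# resistance states by current pulses, with stochastic, possibly asymmetric
# updates. All numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)

class MultiWeightSynapse:
    def __init__(self, n_states=5, p_up=0.9, p_down=0.6):
        self.n_states = n_states
        self.p_up, self.p_down = p_up, p_down   # per-pulse success probabilities
        self.state = n_states // 2              # start in the middle state

    def pulse(self, direction):
        """Apply one SOT-like pulse; the state moves with a direction-dependent
        probability, capturing stochastic and asymmetric weight updates."""
        if direction > 0 and rng.random() < self.p_up:
            self.state = min(self.state + 1, self.n_states - 1)
        elif direction < 0 and rng.random() < self.p_down:
            self.state = max(self.state - 1, 0)
        return self.state

syn = MultiWeightSynapse()
up_trace = [syn.pulse(+1) for _ in range(4)]
down_trace = [syn.pulse(-1) for _ in range(4)]
print("potentiation trace:", up_trace)
print("depression trace:  ", down_trace)
```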