Abstract Memristors for neuromorphic computing have gained prominence over the years for implementing synapses and neurons due to their nano-scale footprint and reduced complexity. Several demonstrations show two-dimensional (2D) materials as a promising platform for the realization of transparent, flexible, ultra-thin memristive synapses. However, unsupervised learning in a spiking neural network (SNN) facilitated by linearity and symmetry in synaptic weight update has not been explored thoroughly using the 2D materials platform. Here, we demonstrate that graphene/MoS2/SiOx/Ni synapses exhibit ideal linearity and symmetry when subjected to identical input pulses, which is essential for their role in online training of neural networks. The linearity in weight update holds across a range of pulse widths, amplitudes, and numbers of applied pulses. Our work illustrates that switching in MoS2-based synapses proceeds through conductive filaments governed by Poole-Frenkel emission. We demonstrate that the graphene/MoS2/SiOx/Ni synapses, when integrated with a MoS2-based leaky integrate-and-fire neuron, can control the spiking of the neuron efficiently. This work establishes 2D MoS2 as a viable platform for all-memristive SNNs.
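The linear, symmetric weight update and the leaky integrate-and-fire (LIF) neuron described above can be illustrated with a minimal software sketch. The snippet below is only an idealized model, assuming a fixed conductance step per identical pulse and standard LIF dynamics; all parameter values (conductance bounds, step size, read voltage, capacitance, threshold) are illustrative placeholders, not fits to the reported devices.

```python
import numpy as np

# Idealized memristive synapse: conductance changes by a fixed step per
# identical pulse (the linear, symmetric update regime), bounded by G_MIN/G_MAX.
G_MIN, G_MAX = 1e-6, 1e-4   # conductance bounds in siemens (illustrative)
DELTA_G = 1e-6              # conductance step per pulse (illustrative)

def update_conductance(g, n_pulses, potentiate=True):
    """Linear, symmetric weight update under identical pulses."""
    step = DELTA_G if potentiate else -DELTA_G
    return float(np.clip(g + step * n_pulses, G_MIN, G_MAX))

def lif_spike_count(g, v_read=0.1, tau=10e-3, c=1e-8, v_th=0.3, t_end=0.1, dt=1e-5):
    """Leaky integrate-and-fire neuron driven through the synapse:
    dV/dt = -V/tau + g*V_read/C, with reset to 0 when V reaches V_th."""
    v, spikes = 0.0, 0
    for _ in range(int(t_end / dt)):
        v += dt * (-v / tau + g * v_read / c)
        if v >= v_th:
            spikes += 1
            v = 0.0
    return spikes

g = update_conductance(G_MIN, n_pulses=40)         # potentiate the synapse
print("spikes in 100 ms:", lif_spike_count(g))     # stronger synapse -> more spikes
```

A more strongly potentiated synapse delivers more current per read and therefore drives the neuron to spike more often, which is the control relationship the abstract describes.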
                    
                            
                            Two-dimensional materials-based probabilistic synapses and reconfigurable neurons for measuring inference uncertainty using Bayesian neural networks
                        
                    
    
Abstract Artificial neural networks have demonstrated superiority over traditional computing architectures in tasks such as pattern classification and learning. However, they do not measure uncertainty in predictions, and hence they can make wrong predictions with high confidence, which can be detrimental for many mission-critical applications. In contrast, Bayesian neural networks (BNNs) naturally include such uncertainty in their model, as the weights are represented by probability distributions (e.g. Gaussian distribution). Here we introduce three-terminal memtransistors based on two-dimensional (2D) materials, which can emulate both probabilistic synapses as well as reconfigurable neurons. The cycle-to-cycle variation in the programming of the 2D memtransistor is exploited to achieve Gaussian random number generator-based synapses, whereas 2D memtransistor-based integrated circuits are used to obtain neurons with hyperbolic tangent and sigmoid activation functions. Finally, memtransistor-based synapses and neurons are combined in a crossbar array architecture to realize a BNN accelerator for a data classification task.
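A minimal software analogue of the probabilistic-synapse idea can make this concrete. In the sketch below, each weight is drawn from a Gaussian distribution on every forward pass, standing in for the cycle-to-cycle programming variation of the memtransistor synapses; repeated stochastic passes yield both a prediction and an uncertainty estimate. The layer size, distribution parameters, and activation choice are illustrative assumptions, not the paper's trained network.

```python
import numpy as np

# Software analogue of a Bayesian layer: each weight is a Gaussian (mu, sigma);
# every forward pass draws a fresh sample, mimicking cycle-to-cycle programming
# variation. Repeated passes give a mean prediction and its spread (uncertainty).
rng = np.random.default_rng(0)

def bayesian_layer(x, mu, sigma, activation=np.tanh):
    w = rng.normal(mu, sigma)          # one stochastic weight sample
    return activation(x @ w)

def predict_with_uncertainty(x, mu, sigma, n_samples=200):
    outs = np.stack([bayesian_layer(x, mu, sigma) for _ in range(n_samples)])
    return outs.mean(axis=0), outs.std(axis=0)   # prediction, uncertainty

# Illustrative 4-input, 2-output layer
mu = rng.normal(0.0, 0.5, size=(4, 2))
sigma = 0.1 * np.ones((4, 2))
mean, std = predict_with_uncertainty(rng.normal(size=(1, 4)), mu, sigma)
print("prediction:", mean, "uncertainty:", std)
```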
        
    
    
- PAR ID: 10388514
- Date Published:
- Journal Name: Nature Communications
- Volume: 13
- Issue: 1
- ISSN: 2041-1723
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Abstract Bayesian networks (BNs) find widespread application in many real-world probabilistic problems including diagnostics, forecasting, computer vision, etc. The basic computing primitive for BNs is a stochastic bit (s-bit) generator that can control the probability of obtaining ‘1’ in a binary bit-stream. While silicon-based complementary metal-oxide-semiconductor (CMOS) technology can be used for hardware implementation of BNs, the lack of inherent stochasticity makes it area- and energy-inefficient. On the other hand, memristors and spintronic devices offer inherent stochasticity but lack computing ability beyond simple vector-matrix multiplication due to their two-terminal nature, and they rely on extensive CMOS peripherals for BN implementation, which limits area and energy efficiency. Here, we circumvent these challenges by introducing a hardware platform based on 2D memtransistors. First, we experimentally demonstrate a low-power and compact s-bit generator circuit that exploits cycle-to-cycle fluctuation in the post-programmed conductance state of 2D memtransistors. Next, the s-bit generators are monolithically integrated with 2D memtransistor-based logic gates to implement BNs. Our findings highlight the potential of 2D memtransistor-based integrated circuits for non-von Neumann computing applications. (A minimal software analogue of the s-bit generator is sketched after this list.)
- Ruslan Salakhutdinov (Ed.) This paper develops simple feed-forward neural networks that achieve the universal approximation property for all continuous functions with a fixed finite number of neurons. These neural networks are simple because they are designed with a simple and computable continuous activation function $\sigma$ leveraging a triangular-wave function and a softsign function. We prove that $\sigma$-activated networks with width $36d(2d+1)$ and depth $11$ can approximate any continuous function on a $d$-dimensional hypercube within an arbitrarily small error. Hence, for supervised learning and its related regression problems, the hypothesis space generated by these networks with a size not smaller than $36d(2d+1)\times 11$ is dense in the space of continuous functions. Furthermore, classification functions arising from image and signal classification are in the hypothesis space generated by $\sigma$-activated networks with width $36d(2d+1)$ and depth $12$, when there exist pairwise disjoint closed bounded subsets of $\mathbb{R}^d$ such that the samples of the same class are located in the same subset. (An illustrative composite of the two named ingredients of $\sigma$ is sketched after this list.)
- Implementing scalable and effective synaptic networks will enable neuromorphic computing to deliver on its promise of revolutionizing computing. RRAM represents the most promising technology for realizing the fully connected synapse network: by using programmable resistive elements as weights, RRAM can modulate the strength of synapses in a neural network architecture. Oscillatory Neural Networks (ONNs) that are based on phase-locked loop (PLL) neurons are compatible with the resistive synapses but otherwise rather impractical. In this paper, a PLL-free ONN is implemented in 28 nm CMOS and compared to its PLL-based counterpart. Our silicon results show that the PLL-free architecture is compatible with resistive synapses, addresses practical implementation issues for improved robustness, and demonstrates favorable energy consumption compared to state-of-the-art NNs. (An abstract phase-oscillator sketch of an ONN with programmable coupling weights appears after this list.)
- Kurtz, Jurgen (Ed.) In neuroscience, delayed synaptic activity plays a pivotal and pervasive role in influencing synchronization, oscillation, and information-processing properties of neural networks. In small rhythm-generating networks, such as central pattern generators (CPGs), time-delays may regulate and determine the stability and variability of rhythmic activity, enabling organisms to adapt to environmental changes and coordinate diverse locomotion patterns in both function and dysfunction. Here, we examine the dynamics of a three-cell CPG model in which time-delays are introduced into reciprocally inhibitory synapses between constituent neurons. We employ computational analysis to investigate the multiplicity and robustness of various rhythms observed in such multi-modal neural networks. Our approach involves deriving exhaustive two-dimensional Poincaré return maps for phase-lags between constituent neurons, where stable fixed points and invariant curves correspond to various phase-locked and phase-slipping/jitter rhythms. These rhythms emerge and disappear through various local (saddle-node, torus) and non-local (homoclinic) bifurcations, highlighting the multi-functionality (modality) observed in such small neural networks with fast inhibitory synapses. (A minimal sketch of the phase-lag measurement underlying these return maps appears after this list.)
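The s-bit generator above can be mimicked in software with a programmable Bernoulli stream. The sketch below is only an analogue for illustration: the hardware derives its randomness from post-programming conductance fluctuations of the 2D memtransistor, which are replaced here by a pseudo-random number generator.

```python
import numpy as np

rng = np.random.default_rng(1)

def sbit_stream(p, n_bits):
    """Software analogue of an s-bit generator: a binary stream whose
    probability of '1' is programmable, standing in for the tunable
    conductance fluctuation of the post-programmed memtransistor."""
    return (rng.random(n_bits) < p).astype(np.uint8)

stream = sbit_stream(p=0.3, n_bits=10_000)
print("empirical P(1) =", stream.mean())   # close to 0.3
```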
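The exact activation σ in the universal-approximation result is a specific construction given in that paper; the sketch below merely composes the two named ingredients, a triangular wave and a softsign, as an assumed illustration of what such an activation can look like, not the authors' definition.

```python
import numpy as np

def triangular_wave(x, period=2.0):
    """Periodic triangular wave with values in [0, 1] (illustrative choice)."""
    t = np.mod(x, period) / period
    return 2.0 * np.minimum(t, 1.0 - t)

def softsign(x):
    return x / (1.0 + np.abs(x))

def sigma(x):
    # Assumed composition for illustration only; the paper's sigma is a
    # specific computable construction built from these two ingredients.
    return softsign(triangular_wave(x))

print(sigma(np.linspace(-3, 3, 7)))
```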
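The PLL-free ONN itself is a 28 nm CMOS circuit whose details are not reproduced here; a common software abstraction, used below purely for illustration, is a network of coupled phase oscillators in which a programmable weight matrix plays the role of the resistive (RRAM) synapse array. All parameters are assumptions.

```python
import numpy as np

def run_onn(W, phases, omega=2 * np.pi * 1e3, k=500.0, dt=1e-5, steps=5000):
    """Euler integration of d(phi_i)/dt = omega + k * sum_j W[i, j] * sin(phi_j - phi_i).
    W stands in for the programmable resistive synapse weights."""
    for _ in range(steps):
        coupling = (W * np.sin(phases[None, :] - phases[:, None])).sum(axis=1)
        phases = phases + dt * (omega + k * coupling)
    return np.mod(phases - phases[0], 2 * np.pi)   # phases relative to oscillator 0

rng = np.random.default_rng(2)
W = np.array([[0.0,  1.0, -1.0],
              [1.0,  0.0, -1.0],
              [-1.0, -1.0, 0.0]])                  # illustrative coupling pattern
print(run_onn(W, rng.uniform(0.0, 2 * np.pi, 3)))  # relative phases after settling
```

In this abstraction, the computation's result is read out from the relative phases the oscillators settle into, which is how ONNs encode information without a PLL per neuron.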
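One building block of the Poincaré return-map approach above is the cycle-by-cycle phase lag between constituent cells, measured from burst-onset times. The sketch below assumes the burst-onset times have already been extracted from the simulated voltage traces; the example data are synthetic.

```python
import numpy as np

def phase_lags(t_ref, t_other):
    """Phase lag of one cell relative to a reference cell, cycle by cycle:
    lag_n = (t_other_n - t_ref_n) / (t_ref_{n+1} - t_ref_n), modulo 1.
    Sequences of such lags (one per pair of cells) are the coordinates
    from which the two-dimensional return maps are built."""
    periods = np.diff(t_ref)
    lags = (np.asarray(t_other[: len(periods)]) - np.asarray(t_ref[:-1])) / periods
    return np.mod(lags, 1.0)

# Synthetic burst-onset times (seconds) for two cells of a three-cell CPG
t_A = np.arange(0.0, 10.0, 1.0)   # reference cell, 1 s period
t_B = t_A + 0.33                  # bursts roughly one-third of a cycle later
print(phase_lags(t_A, t_B)[:5])   # ~0.33 each cycle (a phase-locked rhythm)
```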