Title: Optimal learning with excitatory and inhibitory synapses
Characterizing the relation between weight structure and input/output statistics is fundamental for understanding the computational capabilities of neural circuits. In this work, I study the problem of storing associations between analog signals in the presence of correlations, using methods from statistical mechanics. I characterize the typical learning performance in terms of the power spectrum of random input and output processes. I show that optimal synaptic weight configurations reach a capacity of 0.5 for any fraction of excitatory to inhibitory weights and have a peculiar synaptic distribution with a finite fraction of silent synapses. I further provide a link between typical learning performance and principal component analysis in single cases. These results may shed light on the synaptic profile of brain circuits, such as cerebellar structures, that are thought to engage in processing time-dependent signals and performing on-line prediction.
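The sign-constrained storage problem the abstract describes can be illustrated with a small numerical sketch: an analog readout whose weights are forced to be excitatory (non-negative) or inhibitory (non-positive), trained by projected gradient descent. The sizes, the excitatory fraction, and the training rule below are illustrative assumptions, not the paper's statistical-mechanics calculation; the clipping step is what produces exactly-zero (silent) synapses.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 80                      # N synapses, P analog input/output pairs (hypothetical sizes)
frac_exc = 0.8                      # fraction of excitatory synapses (hypothetical choice)
sign = np.where(np.arange(N) < int(frac_exc * N), 1.0, -1.0)

X = rng.standard_normal((P, N))     # analog input patterns
y = rng.standard_normal(P)          # analog target outputs

# Sign-constrained least squares via projected gradient descent:
# after each step, clip each weight to its allowed sign (E >= 0, I <= 0),
# so weights that want to cross zero get pinned exactly at zero (silent).
w = rng.standard_normal(N) * 0.01
lr = 1e-3
for _ in range(5000):
    grad = X.T @ (X @ w - y) / P
    w -= lr * grad
    w = np.where(sign > 0, np.maximum(w, 0.0), np.minimum(w, 0.0))

silent = np.mean(w == 0.0)          # fraction of silent (exactly zero) synapses
print(f"fraction of silent synapses: {silent:.2f}")
```

The projection is the key difference from unconstrained regression: it is what generates the characteristic weight distribution with a mass of synapses at zero.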
Award ID(s):
1707398
PAR ID:
10248410
Author(s) / Creator(s):
Editor(s):
Morrison, Abigail
Date Published:
Journal Name:
PLOS Computational Biology
Volume:
16
Issue:
12
ISSN:
1553-7358
Page Range / eLocation ID:
e1008536
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The Artificial Intelligence (AI) disruption continues unabated, albeit at extreme compute requirements. Neuromorphic circuits and systems offer a panacea for this extravagance. To this effect, event-based learning such as spike-timing-dependent plasticity (STDP) in spiking neural networks (SNNs) is an active area of research. Hebbian learning in SNNs fundamentally involves synaptic weight updates based on temporal correlations between pre- and post-synaptic neural activities. While there are broadly two approaches to realizing STDP, i.e., All-to-All versus Nearest-Neighbor (NN), there exist strong arguments favoring the NN approach on the biological plausibility front. In this paper, we present a novel current-mode implementation of a postsynaptic event-based NN STDP synapse. We leverage transistor subthreshold dynamics to generate exponential STDP traces using repurposed log-domain low-pass filter circuits. Synaptic weight operations involving addition and multiplication are achieved by Kirchhoff's current law and the translinear principle, respectively. Simulation results from the NCSU TSMC 180 nm technology are presented. Finally, the ideas presented here hold implications for engineering efficient hardware to meet the growing AI training and inference demands.
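The nearest-neighbor trace mechanism described above can be sketched as a small event-driven model. The time constants, amplitudes, and spike times below are arbitrary choices for illustration; the paper realizes these exponential traces as subthreshold analog currents rather than in software.

```python
import numpy as np

# Nearest-neighbor STDP with exponential traces. In the NN scheme each spike
# RESETS its trace to 1 (only the most recent spike matters), instead of
# accumulating as in the All-to-All scheme.
tau_pre, tau_post = 20.0, 20.0   # trace time constants in ms (assumed)
A_plus, A_minus = 0.01, 0.012    # potentiation / depression amplitudes (assumed)
w = 0.5                          # initial synaptic weight

x_pre = x_post = 0.0             # exponential traces
t_last = 0.0

def decay(x, dt, tau):
    return x * np.exp(-dt / tau)

# Event stream: (time_ms, which) with which in {"pre", "post"}
events = [(0.0, "pre"), (5.0, "post"), (40.0, "post"), (42.0, "pre")]

for t, which in events:
    dt = t - t_last
    x_pre, x_post = decay(x_pre, dt, tau_pre), decay(x_post, dt, tau_post)
    if which == "pre":
        w -= A_minus * x_post    # pre after post: depression, scaled by post trace
        x_pre = 1.0              # nearest-neighbor reset, not accumulation
    else:
        w += A_plus * x_pre      # post after pre: potentiation, scaled by pre trace
        x_post = 1.0
    t_last = t

print(f"final weight: {w:.4f}")
```

The two close pre-post pairings (0→5 ms, 40→42 ms) produce one potentiating and one depressing update each, so the net change reflects the exponential timing dependence.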
  3. Memristive devices based on two-dimensional (2D) materials have emerged as potential synaptic candidates for next-generation neuromorphic computing hardware. Here, we introduce a numerical modeling framework that facilitates efficient exploration of the large parameter space for 2D memristive synaptic devices. High-throughput charge-transport simulations are performed to investigate the voltage pulse characteristics for lateral 2D memristors and synaptic device metrics are studied for different weight-update schemes. We show that the same switching mechanism can lead to fundamentally different pulse characteristics influencing not only the device metrics but also the weight-update direction. A thorough analysis of the parameter space allows simultaneous optimization of the linearity, symmetry, and drift in the synaptic behavior that are related through tradeoffs. The presented modeling framework can serve as a tool for designing 2D memristive devices in practical neuromorphic circuits by providing guidelines for materials properties, device functionality, and system performance for target applications. 
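The linearity/symmetry tradeoff in pulsed weight updates can be illustrated with a common phenomenological model of memristive conductance change; this is a generic saturating-update sketch, not the paper's charge-transport simulation, and the window and nonlinearity parameters are assumed.

```python
import numpy as np

G_min, G_max = 0.0, 1.0          # normalized conductance window (assumed)
alpha, beta = 0.05, 3.0          # pulse step size and nonlinearity (assumed)

def potentiate(G):
    # Saturating update: steps shrink as G approaches G_max (nonlinear weight update).
    return min(G + alpha * np.exp(-beta * (G - G_min) / (G_max - G_min)), G_max)

def depress(G):
    # Mirror-image update toward G_min; asymmetry arises if its shape differs.
    return max(G - alpha * np.exp(-beta * (G_max - G) / (G_max - G_min)), G_min)

# Trace out 100 potentiating then 100 depressing pulses.
G, trace = G_min, []
for _ in range(100):
    G = potentiate(G); trace.append(G)
for _ in range(100):
    G = depress(G); trace.append(G)
trace = np.array(trace)

# A simple linearity metric: peak deviation of the potentiation branch
# from a straight line, normalized by the conductance swing.
pot = trace[:100]
line = np.linspace(pot[0], pot[-1], 100)
nonlinearity = np.max(np.abs(pot - line)) / (pot[-1] - pot[0])
print(f"potentiation nonlinearity: {nonlinearity:.3f}")
```

Sweeping `alpha` and `beta` in a loop is the kind of parameter-space exploration the abstract describes, with linearity, symmetry, and drift scored per configuration.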
  4. Abstract The tenets of intelligent biological systems are (i) scalable decision-making, (ii) inheritable memory, and (iii) communication. This study aims to increase the complexity of decision-making operations beyond standard Boolean logic, while minimizing the metabolic burden imposed on the chassis cell. To this end, we present a new platform technology for constructing genetic circuits with multiple OUTPUT gene control using fewer INPUTs relative to conventional genetic circuits. Inspired by principles from quantum computing, we engineered synthetic bidirectional promoters, regulated by synthetic transcription factors, to construct 1-INPUT, 2-OUTPUT logical operations—i.e. biological QUBIT and PAULI-X logic gates—designed as compressed genetic circuits. We then layered said gates to engineer additional quantum-inspired logical operations of increasing complexity—e.g. FEYNMAN and TOFFOLI gates. In addition, we engineered a 2-INPUT, 4-OUTPUT quantum operation to showcase the capacity to utilize the entire permutation INPUT space. Finally, we developed a recombinase-based memory operation to remap the truth table between two disparate logic gates—i.e. converting a QUBIT operation to an antithetical PAULI-X operation in situ. This study introduces a novel and versatile synthetic biology toolkit, which expands the biocomputing capacity of Transcriptional Programming via the development of compressed and scalable multi-INPUT/OUTPUT logical operations. 
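The multi-OUTPUT gates above can be written down as plain truth tables. This is an abstraction of the logic only, not the genetic-circuit implementation, and the specific output assignments for the QUBIT gate are assumptions for illustration; what matters is that PAULI-X is the column-swapped (antithetical) table, which is the remapping the recombinase memory operation performs in situ.

```python
def qubit_gate(x: int) -> tuple:
    # 1-INPUT, 2-OUTPUT: here assumed to map x -> (NOT x, x).
    return (1 - x, x)

def pauli_x_gate(x: int) -> tuple:
    # Antithetical gate: same table with the two output columns swapped.
    o1, o2 = qubit_gate(x)
    return (o2, o1)

def feynman_gate(a: int, b: int) -> tuple:
    # 2-INPUT, 2-OUTPUT controlled-NOT-style table: (a, b) -> (a, a XOR b).
    return (a, a ^ b)

def toffoli_gate(a: int, b: int, c: int) -> tuple:
    # 3-INPUT, 3-OUTPUT: flip c only when both controls are 1.
    return (a, b, c ^ (a & b))

for x in (0, 1):
    print(x, qubit_gate(x), pauli_x_gate(x))
```

Layering the gates, as the study does, corresponds to composing these tables, with each biological OUTPUT driven by its own synthetic promoter.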
  5. Machado, Pedro (Ed.)
    This study emulates associative learning in rodents by using a neuromorphic robot navigating an open-field arena. The goal is to investigate how biologically inspired neural models can reproduce animal-like learning behaviors in real-world robotic systems. We constructed a neuromorphic robot by deploying computational models of spatial and sensory neurons onto a mobile platform. Different coding schemes—rate coding for vibration signals and population coding for visual signals—were implemented. The associative learning model employs 19 spiking neurons and follows Hebbian plasticity principles to associate visual cues with favorable or unfavorable locations. Our robot successfully replicated classical rodent associative learning behavior by memorizing causal relationships between environmental cues and spatial outcomes. The robot’s self-learning capability emerged from repeated exposure and synaptic weight adaptation, without the need for labeled training data. Experiments confirmed functional learning behavior across multiple trials. This work provides a novel embodied platform for memory and learning research beyond traditional animal models. By embedding biologically inspired learning mechanisms into a real robot, we demonstrate how spatial memory can be formed and expressed through sensorimotor interactions. The model’s compact structure (19 neurons) illustrates a minimal yet functional learning network, and the study outlines principles for synaptic weight and threshold design, guiding future development of more complex neuromorphic systems. 
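The cue-to-place association the robot learns can be sketched with a minimal rate-based Hebbian rule. This is a stand-in for the paper's 19-neuron spiking model: the population sizes, one-hot coding, and learning rate are assumptions, but the principle is the same, i.e. repeated co-activation strengthens the synapse between a visual-cue neuron and a place neuron without labeled training data.

```python
import numpy as np

# Minimal Hebbian association: cue population -> place population.
n_cue, n_place = 4, 4
W = np.zeros((n_place, n_cue))          # cue -> place synaptic weights
eta = 0.1                               # learning rate (assumed)

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

# Repeated exposure: cue i reliably co-occurs with location i.
for _ in range(50):
    for i in range(n_cue):
        pre = one_hot(i, n_cue)         # visual cue activity
        post = one_hot(i, n_place)      # spatial (place) activity
        W += eta * np.outer(post, pre)  # Hebbian: strengthen co-active pairs

# Recall: presenting a cue alone now activates its associated place.
recalled = int(np.argmax(W @ one_hot(2, n_cue)))
print("cue 2 recalls place", recalled)
```

In the robot, the pre-synaptic side would be the rate- or population-coded sensory neurons and the post-synaptic side the spatial neurons, with a firing threshold deciding when a recalled place drives behavior.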