Understanding how neural circuits generate sequential activity is a longstanding challenge. While foundational theoretical models have shown how sequences can be stored as memories in neural networks with Hebbian plasticity rules, these models considered only a narrow range of Hebbian rules. Here, we introduce a model for arbitrary Hebbian plasticity rules, capturing the diversity of spike-timing-dependent synaptic plasticity seen in experiments, and show how the choice of these rules and of neural activity patterns influences sequence memory formation and retrieval. In particular, we derive a general theory that predicts the tempo of sequence replay. This theory lays a foundation for explaining how cortical tutor signals might give rise to motor actions that eventually become “automatic.” Our theory also captures the impact of changing the tempo of the tutor signal. Beyond shedding light on biological circuits, this theory has relevance in artificial intelligence by laying a foundation for frameworks whereby slow and computationally expensive deliberation can be stored as memories and eventually replaced by inexpensive recall.
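An "arbitrary Hebbian plasticity rule" of this kind can be summarized as a temporal kernel that weights how pre- and postsynaptic activity at different time lags is paired. The sketch below is a minimal illustration of that idea, not the paper's actual model; the kernel values, network size, and sequence length are hypothetical placeholders.

```python
# Minimal sketch (not the paper's model): storing a pattern sequence
# with an arbitrary temporal Hebbian kernel. Kernel values below are
# illustrative placeholders, not fitted STDP data.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, seq_len = 200, 10
patterns = rng.choice([-1.0, 1.0], size=(seq_len, n_neurons))

# An "arbitrary Hebbian rule" summarized as a kernel K(tau): how
# strongly postsynaptic activity at time t + tau pairs with
# presynaptic activity at time t. Asymmetric kernels favor forward
# sequence recall.
kernel = {-1: 0.2, 0: 0.5, 1: 1.0}   # hypothetical example values

W = np.zeros((n_neurons, n_neurons))
for tau, k in kernel.items():
    for t in range(seq_len):
        if 0 <= t + tau < seq_len:
            # pair the pattern at t + tau (post) with the pattern at t (pre)
            W += k * np.outer(patterns[t + tau], patterns[t])
W /= n_neurons

# Synchronous replay: the asymmetric part of W pushes the state from
# each stored pattern toward its successor in the sequence.
x = patterns[0].copy()
for _ in range(seq_len - 1):
    x = np.sign(W @ x)

# fraction of bits matching the final pattern (should be near 1.0)
print(np.mean(x == patterns[seq_len - 1]))
```

Because the forward lag (tau = +1) dominates this kernel, each synchronous update advances the state one pattern along the sequence; this is the sense in which a temporally asymmetric rule stores a replayable sequence rather than a static attractor.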
Long sequence Hopfield memory
Abstract: Sequence memory is an essential attribute of natural and artificial intelligence that enables agents to encode, store, and retrieve complex sequences of stimuli and actions. Computational models of sequence memory have been proposed in which recurrent Hopfield-like neural networks are trained with temporally asymmetric Hebbian rules. However, these networks suffer from limited sequence capacity (the maximal length of the stored sequence) due to interference between the memories. Inspired by recent work on Dense Associative Memories, we expand the sequence capacity of these models by introducing a nonlinear interaction term that enhances the separation between the patterns. We derive novel scaling laws for sequence capacity with respect to network size, significantly outperforming existing scaling laws for models based on traditional Hopfield networks, and verify these theoretical results with numerical simulations. Moreover, we introduce a generalized pseudoinverse rule to recall sequences of highly correlated patterns. Finally, we extend this model to store sequences with variable timing between state transitions and describe a biologically plausible implementation, with connections to motor neuroscience.
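As a rough, hedged illustration of the mechanism the abstract describes, the sketch below stores a sequence of random patterns with a temporally asymmetric rule and applies a polynomial separation function to the pattern overlaps before each transition; the degree d, network size, and sequence length are illustrative choices, not the paper's settings.

```python
# Hedged sketch of a dense-associative-memory-style sequence network:
# a separation function f sharpens pattern overlaps before the
# asymmetric readout maps each pattern to its successor.
import numpy as np

rng = np.random.default_rng(1)
n, seq_len, d = 100, 50, 3                         # size, length, degree (illustrative)
xi = rng.choice([-1.0, 1.0], size=(seq_len, n))    # patterns xi^1 .. xi^P

def step(x):
    overlaps = xi @ x / n          # m_mu = <xi^mu, x> / n
    sep = overlaps ** d            # nonlinear separation, f(m) = m^d
    # each pattern votes for its successor, weighted by f(overlap)
    return np.sign(sep[:-1] @ xi[1:])

x = xi[0].copy()
recalled = [x]
for _ in range(seq_len - 1):
    x = step(x)
    recalled.append(x)

# average fraction of correctly recalled bits along the sequence
acc = [np.mean(r == p) for r, p in zip(recalled, xi)]
print(np.mean(acc))
```

Raising the overlaps to an odd power d suppresses the crosstalk from non-matching patterns relative to the currently recalled one, which is what allows longer sequences to be stored before interference corrupts recall.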
- PAR ID: 10561214
- Publisher / Repository: IOP Publishing Ltd
- Date Published:
- Journal Name: Journal of Statistical Mechanics: Theory and Experiment
- Volume: 2024
- Issue: 10
- ISSN: 1742-5468
- Page Range / eLocation ID: 104024
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- This work explores adapting the segmented attractor network to a lifelong learning setting. Taking inspiration from Hopfield networks and content-addressable memory, the segmented attractor network is a powerful tool for associative memory applications. The network's performance as an associative memory is analyzed using multiple metrics: in addition to the overall hit rate, the network's ability to recall unique memories, and how often it does so, is evaluated over time. Finally, additional learning techniques are implemented to enhance the network's recall capacity for lifelong learning. These techniques are based on human cognitive functions such as memory consolidation, prediction, and forgetting.
- Information coding by precise timing of spikes can be faster and more energy efficient than traditional rate coding. However, spike-timing codes are often brittle, which has limited their use in theoretical neuroscience and computing applications. Here, we propose a type of attractor neural network in complex state space and show how it can be leveraged to construct spiking neural networks with robust computational properties through a phase-to-timing mapping. Building on Hebbian neural associative memories, like Hopfield networks, we first propose threshold phasor associative memory (TPAM) networks. Complex phasor patterns, whose components can assume continuous-valued phase angles and binary magnitudes, can be stored and retrieved as stable fixed points in the network dynamics. TPAM achieves high memory capacity when storing sparse phasor patterns, and we derive the energy function that governs its fixed-point attractor dynamics. Second, we construct two spiking neural networks to approximate the complex algebraic computations in TPAM: a reductionist model with resonate-and-fire neurons, and a biologically plausible network of integrate-and-fire neurons with synaptic delays and recurrently connected inhibitory interneurons. The fixed points of TPAM correspond to stable periodic states of precisely timed spiking activity that are robust to perturbation. The link established between rhythmic firing patterns and complex attractor dynamics has implications for the interpretation of spike patterns seen in neuroscience and can serve as a framework for computation in emerging neuromorphic devices. (A toy sketch of a TPAM-style update appears after this list.)
- We show that coherent laser networks (CLNs) exhibit emergent neural computing capabilities. The proposed scheme is built on harnessing the collective behavior of laser networks to store a number of phase patterns as stable fixed points of the governing dynamical equations and to retrieve such patterns through proper excitation conditions, thus exhibiting an associative memory property. We discuss how, despite the network's large storage capacity, the large overlap between fixed-point patterns effectively limits pattern retrieval to only two images. Next, we show that this restriction can be lifted by using nonreciprocal coupling between lasers, which allows the large storage capacity to be utilized. This work opens new possibilities for neural computation with coherent laser networks as novel analog processors. In addition, the underlying dynamical model discussed here suggests a novel energy-based recurrent neural network that handles continuous data, as opposed to Hopfield networks and Boltzmann machines, which are intrinsically binary systems.
- Optical photons are powerful carriers of quantum information, which can be delivered in free space by satellites or in fibers on the ground over long distances. Entanglement of quantum states over long distances can empower quantum computing, quantum communications, and quantum sensing. Quantum optical memories are devices designed to store quantum information in the form of stationary excitations, such as atomic coherence, and are capable of coherently mapping these excitations to flying qubits. Quantum memories can effectively store and manipulate quantum states, making them indispensable elements in future long-distance quantum networks. Over the past two decades, quantum optical memories with high fidelities, high efficiencies, long storage times, and promising multiplexing capabilities have been developed, especially at the single-photon level. In this review, we introduce the working principles of commonly used quantum memory protocols and summarize the recent advances in quantum memory demonstrations. We also offer a vision for future quantum optical memory devices that may enable entanglement distribution over long distances.
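As referenced in the TPAM entry above, the following is a toy, hedged version of a threshold-phasor-style associative memory update, assuming Hebbian outer-product storage of sparse complex phasor patterns; the network size, sparsity, and threshold are illustrative assumptions, not parameters from that paper.

```python
# Toy sketch of a threshold phasor associative memory (TPAM)-style
# network: sparse complex phasor patterns stored by Hebbian outer
# products, retrieved by thresholding magnitudes and keeping phases.
import numpy as np

rng = np.random.default_rng(2)
n, n_patterns, sparsity = 300, 5, 0.2
k = int(sparsity * n)                  # active units per pattern

# each pattern: k active units with unit magnitude and random phase
patterns = np.zeros((n_patterns, n), dtype=complex)
for p in patterns:
    idx = rng.choice(n, size=k, replace=False)
    p[idx] = np.exp(1j * rng.uniform(0, 2 * np.pi, size=k))

# Hebbian storage: complex outer products, no self-connections
W = sum(np.outer(p, p.conj()) for p in patterns) / k
np.fill_diagonal(W, 0)

def tpam_update(z, threshold=0.5):
    u = W @ z
    active = np.abs(u) > threshold                 # binary magnitude via threshold
    return np.where(active, np.exp(1j * np.angle(u)), 0)

# retrieval from a phase-perturbed cue
z = patterns[0] * np.exp(1j * rng.normal(0, 0.3, size=n))
for _ in range(10):
    z = tpam_update(z)

# normalized overlap with the stored pattern (should be near 1.0)
print(np.abs(np.vdot(patterns[0], z)) / k)
```

Thresholding the magnitude preserves the binary on/off structure of the stored patterns, while taking the angle of the recurrent input lets the continuous-valued phases settle onto a stored phasor pattern; in the spiking interpretation described in the abstract, those phases map to precise spike times within an oscillation cycle.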