This content will become publicly available on December 1, 2025

Title: An exact mathematical description of computation with transient spatiotemporal dynamics in a complex-valued neural network
Abstract: Networks throughout physics and biology leverage spatiotemporal dynamics for computation. However, the connection between structure and computation remains unclear. Here, we study a complex-valued neural network (cv-NN) with linear interactions and phase-delays. We report that the cv-NN displays sophisticated spatiotemporal dynamics, which we then use, in combination with a nonlinear readout, for computation. The cv-NN can instantiate dynamics-based logic gates, encode short-term memories, and mediate secure message passing through a combination of interactions and phase-delays. The computations in this system can be fully described by an exact, closed-form mathematical expression. Finally, using direct intracellular recordings of neurons in slices from neocortex, we demonstrate that computations in the cv-NN are decodable by living biological neurons acting as the nonlinear readout. These results demonstrate that complex-valued linear systems can perform sophisticated computations while remaining exactly solvable. Taken together, they open avenues for the design of highly adaptable, bio-hybrid computing systems that can interface seamlessly with other neural networks.
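For a concrete picture of the setup, the following is a minimal sketch, not the paper's implementation: a complex-valued network with purely linear interactions, a uniform phase delay folded into the weights, and a simple nonlinear readout. The network size, coupling pattern, delay value, and readout form are all illustrative assumptions.

```python
# Hedged sketch of a cv-NN: linear complex-valued dynamics plus a nonlinear
# readout. All parameters below are illustrative, not the paper's values.
import numpy as np

N = 64                     # number of complex-valued nodes (assumed)
phase_delay = 0.3          # uniform phase delay on interactions (assumed)
rng = np.random.default_rng(0)

# Linear interaction matrix with the phase delay folded into each coupling.
A = (rng.random((N, N)) < 0.2).astype(float)   # sparse adjacency (assumed)
W = A * np.exp(1j * phase_delay)               # complex weights = coupling x delay

z = np.exp(1j * rng.uniform(0, 2 * np.pi, N))  # random initial phases

for _ in range(200):                           # purely linear update: z -> W z
    z = W @ z
    z /= np.abs(z).max()                       # rescale to avoid overflow

# Nonlinear readout: threshold each node's phase (assumed readout form).
readout = (np.angle(z) > 0).astype(int)
print(readout[:10])
```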
Award ID(s): 2015276
PAR ID: 10616364
Author(s) / Creator(s): ; ; ; ; ; ; ; ; ;
Publisher / Repository: na
Date Published:
Journal Name: Communications Physics
Volume: 7
Issue: 1
ISSN: 2399-3650
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like This
  1. Information coding by precise timing of spikes can be faster and more energy-efficient than traditional rate coding. However, spike-timing codes are often brittle, which has limited their use in theoretical neuroscience and computing applications. Here, we propose a type of attractor neural network in complex state space and show how it can be leveraged to construct spiking neural networks with robust computational properties through a phase-to-timing mapping. Building on Hebbian neural associative memories, like Hopfield networks, we first propose threshold phasor associative memory (TPAM) networks. Complex phasor patterns, whose components can assume continuous-valued phase angles and binary magnitudes, can be stored and retrieved as stable fixed points in the network dynamics. TPAM achieves high memory capacity when storing sparse phasor patterns, and we derive the energy function that governs its fixed-point attractor dynamics. Second, we construct two spiking neural networks to approximate the complex algebraic computations in TPAM: a reductionist model with resonate-and-fire neurons, and a biologically plausible network of integrate-and-fire neurons with synaptic delays and recurrently connected inhibitory interneurons. The fixed points of TPAM correspond to stable periodic states of precisely timed spiking activity that are robust to perturbation. The link established between rhythmic firing patterns and complex attractor dynamics has implications for the interpretation of spike patterns seen in neuroscience and can serve as a framework for computation in emerging neuromorphic devices.
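As a rough illustration of the storage and retrieval scheme described above, here is a minimal sketch assuming Hebbian outer-product storage and a magnitude-threshold, phase-preserving update; the network size, pattern count, sparsity, threshold, and normalization are illustrative choices, not the paper's.

```python
# Minimal threshold phasor associative memory (TPAM) sketch, under the
# assumptions stated above. Patterns have binary magnitudes and continuous
# random phases; retrieval binarizes magnitude while keeping phase.
import numpy as np

rng = np.random.default_rng(1)
N, P, sparsity, theta = 200, 5, 0.2, 0.5   # illustrative constants

# Sparse phasor patterns: binary magnitude, continuous random phase.
mags = rng.random((P, N)) < sparsity
patterns = mags * np.exp(1j * rng.uniform(0, 2 * np.pi, (P, N)))

# Hebbian storage: sum of complex outer products, normalized by the
# expected number of active components; no self-coupling.
W = sum(np.outer(p, p.conj()) for p in patterns) / (sparsity * N)
np.fill_diagonal(W, 0)

def update(z):
    u = W @ z
    keep = np.abs(u) > theta                  # threshold on magnitude
    return keep * np.exp(1j * np.angle(u))    # keep phase, binarize magnitude

# Retrieve pattern 0 from a corrupted cue (20% of active components dropped).
z = patterns[0] * (rng.random(N) > 0.2)
for _ in range(20):
    z = update(z)

overlap = np.abs(np.vdot(patterns[0], z)) / np.abs(np.vdot(patterns[0], patterns[0]))
print(f"overlap with stored pattern: {overlap:.2f}")
```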
  2. One of the simplest mathematical models in the study of nonlinear systems is the Kuramoto model, which describes synchronization in systems ranging from swarms of insects to superconductors. We have recently found a connection between the original, real-valued nonlinear Kuramoto model and a corresponding complex-valued system that permits describing the system in terms of a linear operator and an iterative update rule. We now use this description to investigate three major synchronization phenomena in Kuramoto networks (phase synchronization, chimera states, and traveling waves), not only in terms of steady-state solutions but also in terms of transient dynamics and individual simulations. These results provide new mathematical insight into how sophisticated behaviors arise from connection patterns in nonlinear networked systems.
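For context, the classic Kuramoto model already has a natural complex-exponential form via its order parameter, which is the starting point for the complex-valued linear description mentioned above. A minimal mean-field simulation, with illustrative N, coupling K, and frequency spread:

```python
# Classic mean-field Kuramoto model written with complex exponentials.
# All constants are illustrative; the complex-valued *linear* reformulation
# discussed in the abstract is in the paper, not reproduced here.
import numpy as np

rng = np.random.default_rng(2)
N, K, dt, steps = 100, 2.0, 0.01, 5000
omega = rng.normal(0, 0.5, N)               # natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)        # initial phases

for _ in range(steps):
    z = np.exp(1j * theta)
    order = z.mean()                        # complex order parameter r*e^{i psi}
    # dtheta_i/dt = omega_i + K*r*sin(psi - theta_i) = omega_i + K*Im(conj(z_i)*order)
    theta += dt * (omega + K * np.imag(np.conj(z) * order))

print(f"final synchrony r = {abs(np.exp(1j * theta).mean()):.2f}")
```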
  3. Rainey, Larry B.; Holland, O. Thomas (Eds.)
    Biological neural networks offer some of the most striking and complex examples of emergence ever observed in natural or man-made systems. Individually, the behavior of a single neuron is rather simple, yet these basic building blocks are connected through synapses to form neural networks capable of sophisticated functions such as pattern recognition and navigation. Lower-level functionality provided by a given network is combined with that of other networks to produce more sophisticated capabilities. These capabilities manifest emergently at two vastly different, yet interconnected, time scales. At the time scale of neural dynamics, neural networks are responsible for turning noisy external stimuli and internal signals into signals capable of supporting complex computations. A key component in this process is the structure of the network, which itself forms emergently, over much longer time scales, based on the outputs of its constituent neurons, a process called learning. The analysis and interpretation of the behaviors of these interconnected dynamical systems of neurons should account for the network structure and the collective behavior of the network. The field of graph signal processing (GSP) combines signal processing with network science to study signals defined on irregular network structures. Here, we show that GSP can be a valuable tool in the analysis of emergence in biological neural networks. Beyond purely scientific pursuits, understanding emergence in biological neural networks directly impacts the design of more effective artificial neural networks for general machine learning and artificial intelligence tasks across domains, and motivates additional design motifs for novel emergent systems of systems.
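To make the GSP idea concrete, here is a minimal sketch of a graph Fourier transform built from the Laplacian eigenbasis and used as a low-pass filter on a noisy network signal; the random graph and the frequency cutoff are illustrative assumptions, not taken from the chapter.

```python
# Graph signal processing sketch: graph Fourier transform via the Laplacian
# eigendecomposition, applied as a low-pass denoising filter.
import numpy as np

rng = np.random.default_rng(3)
N = 30
A = (rng.random((N, N)) < 0.15).astype(float)
A = np.triu(A, 1); A = A + A.T                  # symmetric adjacency (assumed graph)
L = np.diag(A.sum(1)) - A                       # combinatorial graph Laplacian

evals, U = np.linalg.eigh(L)                    # graph Fourier basis (columns of U)

signal = U[:, 1] + 0.3 * rng.normal(size=N)     # smooth component + noise
coeffs = U.T @ signal                           # graph Fourier transform
coeffs[evals > np.median(evals)] = 0            # low-pass: drop high-frequency modes
denoised = U @ coeffs                           # inverse transform

print(f"residual noise energy: {np.sum((denoised - U[:, 1])**2):.3f}")
```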
  4. Abstract: Implementations of neuron, delay, and synapse circuits are presented with simulations. These neural elements are used to create two small spiking neural networks, the Rate-Window and Order-Biased clusters, which are capable of detecting simple two-spike spatiotemporal patterns. A simple pattern detecting network (SPDN) is created by combining the Rate-Window and Order-Biased clusters, and its simple pattern detection ability is demonstrated in simulation. The SPDN is then used to implement a complex pattern detecting network (CPDN), whose complex pattern detection ability is likewise demonstrated in simulation. Methods for generating arbitrary spatiotemporal patterns are presented. The CPDN and the spatiotemporal pattern generation methods are then used to implement a novel spatiotemporal computing paradigm based on detecting and responding to spatiotemporal symbols. A simulation of a spatiotemporal half adder is presented to demonstrate the computing paradigm.
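As a toy illustration of two-spike spatiotemporal pattern detection in the spirit of the clusters described above (not the paper's circuits), a single leaky integrate-and-fire unit with a delayed input can act as a coincidence detector; all constants below are assumed.

```python
# Toy two-spike pattern detector: a leaky integrate-and-fire (LIF) unit fires
# only when input B arrives about delay_a after input A. Illustrative only.
import numpy as np

dt, T = 0.1, 50.0                  # time step and horizon (ms)
tau, v_thresh = 5.0, 1.2           # membrane time constant, spike threshold
delay_a = 3.0                      # axonal delay on input A (ms), assumed

def detect(t_a, t_b):
    """Return True if the detector spikes for input spike times t_a, t_b (ms)."""
    v = 0.0
    for step in range(int(T / dt)):
        t = step * dt
        i = 0.0
        if abs(t - (t_a + delay_a)) < dt / 2:   # delayed spike from A arrives
            i += 1.0
        if abs(t - t_b) < dt / 2:               # direct spike from B arrives
            i += 1.0
        v += dt * (-v / tau) + i                # leaky integration + input kicks
        if v >= v_thresh:
            return True
    return False

print(detect(10.0, 13.0))   # B follows A by ~delay_a -> coincidence -> True
print(detect(10.0, 30.0))   # spikes too far apart -> False
```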
  5. Blohm, Gunnar (Ed.)
    Neural circuits consist of many noisy, slow components, with individual neurons subject to ion channel noise, axonal propagation delays, and unreliable and slow synaptic transmission. This raises a fundamental question: how can reliable computation emerge from such unreliable components? A classic strategy is to simply average over a population of N weakly-coupled neurons to achieve errors that scale as 1/√N. More interestingly, recent work has introduced networks of leaky integrate-and-fire (LIF) neurons that achieve coding errors that scale superclassically as 1/N by combining the principles of predictive coding with fast and tight inhibitory-excitatory balance. However, spike transmission delays preclude such fast inhibition, and computational studies have observed that such delays can cause pathological synchronization that in turn destroys superclassical coding performance. Intriguingly, it has also been observed in simulations that noise can actually improve coding performance, and that there exists some optimal level of noise that minimizes coding error. However, we lack a quantitative theory that describes this fascinating interplay between delays, noise, and neural coding performance in spiking networks. In this work, we elucidate the mechanisms underpinning this beneficial role of noise by deriving analytical expressions for coding error as a function of spike propagation delay and noise level in predictive-coding tight-balance networks of LIF neurons. Furthermore, we compute the minimal coding error and the associated optimal noise level, finding that they grow as power laws with the delay. Our analysis reveals quantitatively how optimal levels of noise can rescue neural coding performance in spiking neural networks with delays by preventing the build-up of pathological synchrony without overwhelming the overall spiking dynamics. This analysis can serve as a foundation for the further study of precise computation in the presence of noise and delays in efficient spiking neural circuits.
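The classical averaging argument in the abstract is easy to verify numerically: averaging N independent noisy estimates yields an error that scales as 1/√N. The superclassical 1/N scaling of tightly balanced spiking networks requires the full network model and is not reproduced in this sketch.

```python
# Numerical check of the classical 1/sqrt(N) averaging scaling referenced
# in the abstract. Purely illustrative; not the paper's network model.
import numpy as np

rng = np.random.default_rng(4)
signal, trials = 1.0, 1000

for N in (10, 100, 1000, 10000):
    # Each trial averages N independent unit-variance noisy estimates.
    estimates = signal + rng.normal(0, 1, (trials, N))
    err = np.abs(estimates.mean(axis=1) - signal).mean()
    print(f"N={N:6d}  mean error={err:.4f}  N**-0.5={N**-0.5:.4f}")
```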