Abstract: Similar activity patterns may arise from model neural networks with distinct coupling properties and individual unit dynamics. These similar patterns may, however, respond differently to parameter variations and specifically to tuning of inputs that represent control signals. In this work, we analyze the responses resulting from modulation of a localized input in each of three classes of model neural networks that have been recognized in the literature for their capacity to produce robust three-phase rhythms: coupled fast-slow oscillators, near-heteroclinic oscillators, and threshold-linear networks. Triphasic rhythms, in which each phase consists of a prolonged activation of a corresponding subgroup of neurons followed by a fast transition to another phase, represent a fundamental activity pattern observed across a range of central pattern generators underlying behaviors critical to survival, including respiration, locomotion, and feeding. To perform our analysis, we extend the recently developed local timing response curve (lTRC), which allows us to characterize the timing effects due to perturbations, and we complement our lTRC approach with model-specific dynamical systems analysis. Interestingly, we observe disparate effects of similar perturbations across distinct model classes. Thus, this work provides an analytical framework for studying control of oscillations in nonlinear dynamical systems and may help guide model selection in future efforts to study systems exhibiting triphasic rhythmic activity.
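As a sketch of the third model class mentioned above, a threshold-linear network with a cyclic inhibitory motif can produce a triphasic rhythm in which each unit takes a turn as the most active one. The weights (eps, delta), threshold, and integration settings below are illustrative choices in the style of combinatorial threshold-linear networks, not parameters from the paper:

```python
import numpy as np

def simulate_tln(W, theta, x0, dt=0.01, steps=20000):
    """Euler-integrate dx/dt = -x + [W x + theta]_+ (threshold-linear network)."""
    x = np.array(x0, dtype=float)
    traj = np.empty((steps, len(x)))
    for t in range(steps):
        x += dt * (-x + np.maximum(W @ x + theta, 0.0))
        traj[t] = x
    return traj

# 3-cycle connectivity: each unit weakly inhibits its successor (-1 + eps)
# and strongly inhibits its predecessor (-1 - delta).
eps, delta = 0.25, 0.5
W = np.array([[0.0,        -1 - delta, -1 + eps  ],
              [-1 + eps,    0.0,       -1 - delta],
              [-1 - delta, -1 + eps,    0.0      ]])
traj = simulate_tln(W, theta=1.0, x0=[0.2, 0.0, 0.0])

# Over the late part of the run, each of the three units should lead
# during one phase of the rhythm.
winners = traj[-5000:].argmax(axis=1)
print(sorted(set(winners)))
```

The choice eps < delta/(1 + delta) keeps the interaction in the regime where the 3-cycle supports a limit cycle rather than a stable fixed point.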
This content will become publicly available on April 1, 2026
Synaptic delays shape dynamics and function in multimodal neural motifs
In neuroscience, delayed synaptic activity plays a pivotal and pervasive role in influencing the synchronization, oscillation, and information-processing properties of neural networks. In small rhythm-generating networks, such as central pattern generators (CPGs), time delays may regulate and determine the stability and variability of rhythmic activity, enabling organisms to adapt to environmental changes and coordinate diverse locomotion patterns, in both function and dysfunction. Here, we examine the dynamics of a three-cell CPG model in which time delays are introduced into reciprocally inhibitory synapses between constituent neurons. We employ computational analysis to investigate the multiplicity and robustness of various rhythms observed in such multimodal neural networks. Our approach involves deriving exhaustive two-dimensional Poincaré return maps for phase lags between constituent neurons, where stable fixed points and invariant curves correspond to various phase-locked and phase-slipping/jitter rhythms. These rhythms emerge and disappear through various local (saddle-node, torus) and non-local (homoclinic) bifurcations, highlighting the multi-functionality (modality) observed in such small neural networks with fast inhibitory synapses.
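The phase-lag coordinates underlying such return maps can be computed directly from burst-onset times. Below is a minimal sketch, assuming onset times have already been extracted and paired by cycle; the synthetic times and lag values are illustrative, not model output:

```python
import numpy as np

def phase_lags(t1, t2, t3):
    """Per-cycle phase lags of cells 2 and 3 relative to reference cell 1.

    t1, t2, t3: arrays of burst-onset times, one entry per cycle, with the
    n-th entry of each array belonging to the same cycle.
    """
    period = np.diff(t1)                          # cycle-by-cycle period of cell 1
    lag2 = ((t2[:-1] - t1[:-1]) / period) % 1.0   # lag as a fraction of the period
    lag3 = ((t3[:-1] - t1[:-1]) / period) % 1.0
    return lag2, lag3

# Synthetic example: a steady "traveling wave" rhythm with period 1.0
# and lags of 1/3 and 2/3 (illustrative numbers).
n = 10
t1 = np.arange(n, dtype=float)
t2 = t1 + 1 / 3
t3 = t1 + 2 / 3
lag2, lag3 = phase_lags(t1, t2, t3)
print(lag2[0], lag3[0])
```

Plotting successive (lag2, lag3) pairs against each other yields the two-dimensional return map; a phase-locked rhythm appears as a fixed point of that map.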
- Award ID(s): 2407999
- PAR ID: 10597750
- Editor(s): Kurtz, Jurgen
- Publisher / Repository: AIP
- Date Published:
- Journal Name: Chaos: An Interdisciplinary Journal of Nonlinear Science
- Volume: 35
- Issue: 4
- ISSN: 1054-1500
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Blohm, Gunnar (Ed.) Neural circuits consist of many noisy, slow components, with individual neurons subject to ion channel noise, axonal propagation delays, and unreliable and slow synaptic transmission. This raises a fundamental question: how can reliable computation emerge from such unreliable components? A classic strategy is to simply average over a population of N weakly-coupled neurons to achieve errors that scale as 1/√N. But more interestingly, recent work has introduced networks of leaky integrate-and-fire (LIF) neurons that achieve coding errors that scale superclassically as 1/N by combining the principles of predictive coding and fast and tight inhibitory-excitatory balance. However, spike transmission delays preclude such fast inhibition, and computational studies have observed that such delays can cause pathological synchronization that in turn destroys superclassical coding performance. Intriguingly, it has also been observed in simulations that noise can actually improve coding performance, and that there exists some optimal level of noise that minimizes coding error. However, we lack a quantitative theory that describes this fascinating interplay between delays, noise, and neural coding performance in spiking networks. In this work, we elucidate the mechanisms underpinning this beneficial role of noise by deriving analytical expressions for coding error as a function of spike propagation delay and noise levels in predictive coding tight-balance networks of LIF neurons. Furthermore, we compute the minimal coding error and the associated optimal noise level, finding that they grow as power laws with the delay. Our analysis reveals quantitatively how optimal levels of noise can rescue neural coding performance in spiking neural networks with delays by preventing the buildup of pathological synchrony without overwhelming the overall spiking dynamics.
This analysis can serve as a foundation for the further study of precise computation in the presence of noise and delays in efficient spiking neural circuits.
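The classical averaging baseline contrasted with the 1/N superclassical scaling above can be checked with a short Monte Carlo sketch; the signal value, noise level, and trial count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def coding_error(N, trials=2000, sigma=1.0):
    """RMS error of the population-average estimate of a signal x
    recovered from N independent noisy readouts x + noise."""
    x = 1.0
    est = (x + sigma * rng.standard_normal((trials, N))).mean(axis=1)
    return np.sqrt(np.mean((est - x) ** 2))

errs = {N: coding_error(N) for N in (10, 100, 1000)}
# Error shrinks roughly like 1/sqrt(N): 10x more neurons buys
# only about a 3.16x reduction in error.
print(errs)
```

Superclassical 1/N networks beat this baseline by coordinating spikes through tight inhibitory-excitatory balance rather than averaging independent errors.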
-
Information coding by precise timing of spikes can be faster and more energy efficient than traditional rate coding. However, spike-timing codes are often brittle, which has limited their use in theoretical neuroscience and computing applications. Here, we propose a type of attractor neural network in complex state space and show how it can be leveraged to construct spiking neural networks with robust computational properties through a phase-to-timing mapping. Building on Hebbian neural associative memories, such as Hopfield networks, we first propose threshold phasor associative memory (TPAM) networks. Complex phasor patterns, whose components can assume continuous-valued phase angles and binary magnitudes, can be stored and retrieved as stable fixed points in the network dynamics. TPAM achieves high memory capacity when storing sparse phasor patterns, and we derive the energy function that governs its fixed-point attractor dynamics. Second, we construct two spiking neural networks to approximate the complex algebraic computations in TPAM: a reductionist model with resonate-and-fire neurons, and a biologically plausible network of integrate-and-fire neurons with synaptic delays and recurrently connected inhibitory interneurons. The fixed points of TPAM correspond to stable periodic states of precisely timed spiking activity that are robust to perturbation. The link established between rhythmic firing patterns and complex attractor dynamics has implications for the interpretation of spike patterns seen in neuroscience and can serve as a framework for computation in emerging neuromorphic devices.
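A minimal sketch of the Hebbian phasor-memory idea described above, assuming outer-product learning and a threshold-phasor update rule; the network size, sparsity, and threshold are illustrative choices, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

def store(patterns):
    """Hebbian outer-product learning for complex phasor patterns."""
    W = sum(np.outer(z, z.conj()) for z in patterns) / len(patterns)
    np.fill_diagonal(W, 0)          # no self-connections
    return W

def update(W, z, thresh):
    """One synchronous update: keep the phase of the input current where
    its magnitude clears the threshold, silence the unit otherwise."""
    u = W @ z
    return np.where(np.abs(u) > thresh, np.exp(1j * np.angle(u)), 0)

# One sparse phasor pattern: 5 of 20 units active with random phases.
N, active = 20, 5
z = np.zeros(N, dtype=complex)
idx = rng.choice(N, active, replace=False)
z[idx] = np.exp(1j * rng.uniform(0, 2 * np.pi, active))

W = store([z])
z_next = update(W, z, thresh=0.5 * (active - 1))
print(np.allclose(z_next, z))   # -> True: the stored pattern is a fixed point
```

With a single stored pattern, each active unit receives current (active - 1) times its own phasor, so any threshold below that magnitude makes the pattern a fixed point of the update.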
-
Rainey, Larry B.; Holland, O. Thomas (Ed.) Biological neural networks offer some of the most striking and complex examples of emergence ever observed in natural or man-made systems. Individually, the behavior of a single neuron is rather simple, yet these basic building blocks are connected through synapses to form neural networks, which are capable of sophisticated capabilities such as pattern recognition and navigation. Lower-level functionality provided by a given network is combined with other networks to produce more sophisticated capabilities. These capabilities manifest emergently at two vastly different, yet interconnected, time scales. At the time scale of neural dynamics, neural networks are responsible for turning noisy external stimuli and internal signals into signals capable of supporting complex computations. A key component in this process is the structure of the network, which itself forms emergently over much longer time scales based on the outputs of its constituent neurons, a process called learning. The analysis and interpretation of the behaviors of these interconnected dynamical systems of neurons should account for the network structure and the collective behavior of the network. The field of graph signal processing (GSP) combines signal processing with network science to study signals defined on irregular network structures. Here, we show that GSP can be a valuable tool in the analysis of emergence in biological neural networks. Beyond any purely scientific pursuits, understanding the emergence in biological neural networks directly impacts the design of more effective artificial neural networks for general machine learning and artificial intelligence tasks across domains, and motivates additional design motifs for novel emergent systems of systems.
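The core GSP operation, expanding a signal defined on a network in the eigenbasis of the graph Laplacian, can be illustrated in a few lines. The path graph and the two signals below are toy assumptions, chosen to show how smooth and rough signals separate in the graph frequency domain:

```python
import numpy as np

# Tiny path graph on 5 nodes; L = D - A is the (combinatorial) graph Laplacian.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1
L = np.diag(A.sum(axis=1)) - A

# Graph Fourier basis = eigenvectors of L, ordered by eigenvalue
# ("graph frequency"): smooth signals load on low-eigenvalue modes.
evals, U = np.linalg.eigh(L)

x_smooth = np.array([1.0, 1.1, 1.2, 1.3, 1.4])    # slowly varying over the graph
x_rough  = np.array([1.0, -1.0, 1.0, -1.0, 1.0])  # sign-alternating over the graph
xs_hat = U.T @ x_smooth    # graph Fourier transform
xr_hat = U.T @ x_rough
print(np.abs(xs_hat).argmax(), np.abs(xr_hat).argmax())   # -> 0 4
```

The smooth signal concentrates on the lowest-frequency mode (the constant eigenvector with eigenvalue 0), while the alternating signal concentrates on the highest-frequency mode, which is the sense in which GSP exposes structure-aware frequency content.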
-
Rubin, Jonathan (Ed.) Theta and gamma rhythms and their cross-frequency coupling play critical roles in perception, attention, learning, and memory. Available data suggest that forebrain acetylcholine (ACh) signaling promotes theta-gamma coupling, although the mechanism has not been identified. Recent evidence suggests that cholinergic signaling is both temporally and spatially constrained, in contrast to the traditional notion of slow, spatially homogeneous, and diffuse neuromodulation. Here, we find that spatially constrained cholinergic stimulation can generate theta-modulated gamma rhythms. Using biophysically-based excitatory-inhibitory (E-I) neural network models, we simulate the effects of ACh on neural excitability by varying the conductance of a muscarinic receptor-regulated K+ current. In E-I networks with local excitatory connectivity and global inhibitory connectivity, we demonstrate that theta-gamma-coupled firing patterns emerge in ACh-modulated network regions. Stable gamma-modulated firing arises within regions with high ACh signaling, while theta or mixed theta-gamma activity occurs at the peripheries of these regions. High gamma activity also alternates between different high-ACh regions, at theta frequency. Our results are the first to indicate a causal role for spatially heterogeneous ACh signaling in the emergence of localized theta-gamma rhythmicity. Our findings also provide novel insights into mechanisms by which ACh signaling supports the brain region-specific attentional processing of sensory information.
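Theta-gamma coupling of the kind described can be quantified from a signal by binning gamma amplitude by theta phase. A minimal sketch on a synthetic signal, using an FFT-based band-limited analytic signal; the frequencies, band edges, and bin count are illustrative assumptions, not values from the study:

```python
import numpy as np

fs, T = 1000, 2.0
t = np.arange(int(fs * T)) / fs

# Synthetic signal: 6 Hz theta plus 40 Hz gamma whose amplitude is
# largest at the theta peak (illustrative, not model output).
theta = np.cos(2 * np.pi * 6 * t)
x = theta + 0.3 * (1 + theta) * np.cos(2 * np.pi * 40 * t)

def band_analytic(x, lo, hi, fs):
    """Analytic signal of the lo-hi Hz band: keep only positive
    frequencies in the band (FFT masking) and double them (Hilbert)."""
    n = len(x)
    freqs = np.fft.fftfreq(n, 1 / fs)
    X = np.fft.fft(x)
    X[(freqs < lo) | (freqs > hi)] = 0
    return 2 * np.fft.ifft(X)

phase = np.angle(band_analytic(x, 4, 8, fs))    # theta phase
amp = np.abs(band_analytic(x, 30, 50, fs))      # gamma envelope

# Mean gamma amplitude in 8 theta-phase bins: coupling shows up as a
# strongly non-uniform profile, peaking near theta phase 0 here.
bins = np.digitize(phase, np.linspace(-np.pi, np.pi, 9)) - 1
profile = np.array([amp[bins == b].mean() for b in range(8)])
print(profile.argmax())
```

A flat profile would indicate no phase-amplitude coupling; standard modulation-index measures are essentially summaries of how far this profile deviates from uniform.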
