- Award ID(s): 1734030
- PAR ID: 10169879
- Date Published:
- Journal Name: Neural Computation
- Volume: 32
- Issue: 6
- ISSN: 0899-7667
- Page Range / eLocation ID: 1033 to 1068
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Brain dynamics can exhibit narrow-band nonlinear oscillations and multistability. For a subset of disorders of consciousness and motor control, we hypothesized that some symptoms originate from the inability to spontaneously transition from one attractor to another. Using external perturbations, such as electrical pulses delivered by deep brain stimulation devices, it may be possible to induce such a transition out of the pathological attractors. However, inducing the transition may be nontrivial, rendering current open-loop stimulation strategies insufficient. To develop next-generation neural stimulators that can intelligently learn to induce attractor transitions, we require a platform on which to test the efficacy of such systems. To this end, we designed an analog circuit as a model for multistable brain dynamics. The circuit spontaneously oscillates stably on two distinct periods as an instantiation of a 3-dimensional continuous-time gated recurrent neural network. To discourage simple perturbation strategies, such as constant or random stimulation patterns, from easily inducing transitions between the stable limit cycles, we designed a state-dependent nonlinear circuit interface for external perturbation. We demonstrate the existence of nontrivial solutions to the transition problem in our circuit implementation.
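The abstract above describes a 3-dimensional continuous-time gated recurrent neural network realized as an analog circuit. As a rough illustration of that class of dynamics, here is a minimal numerical sketch; the gating form, weights, time constant, and perturbation input are illustrative assumptions, not the paper's circuit parameters.

```python
import numpy as np

# Minimal sketch of a 3-unit continuous-time gated recurrent network of the
# general kind described in the abstract. The weights, gating form, and time
# constant are illustrative assumptions, not the analog circuit's parameters.

def step(x, u, W, W_gate, tau=1.0, dt=0.01):
    """One Euler step of gated recurrent dynamics with external perturbation u."""
    gate = 1.0 / (1.0 + np.exp(-W_gate @ x))   # sigmoidal gate on the update
    target = np.tanh(W @ x + u)                # gated target activation
    dxdt = (-x + gate * target) / tau
    return x + dt * dxdt

rng = np.random.default_rng(0)
W = rng.normal(scale=1.5, size=(3, 3))         # recurrent weights (assumed)
W_gate = rng.normal(scale=1.0, size=(3, 3))    # gating weights (assumed)

x = rng.normal(size=3)
for t in range(20000):
    u = np.zeros(3)                            # open loop: no stimulation
    x = step(x, u, W, W_gate)
print("state after free-running integration:", x)
# A stimulation policy would shape u(t), based on the observed state, to drive
# the trajectory from one stable limit cycle into the basin of the other.
```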
- Autism spectrum disorder is increasingly understood to be based on atypical signal transfer among multiple interconnected networks in the brain. Relative temporal patterns of neural activity have been shown to underlie both the altered neurophysiology and the altered behaviors in a variety of neurogenic disorders. We assessed brain network dynamics variability in autism spectrum disorder (ASD) using measures of synchronization (phase-locking) strength and of the timing of synchronization and desynchronization of neural activity (desynchronization ratio) across frequency bands of resting-state electroencephalography (EEG). Our analysis indicated that frontoparietal synchronization is higher in ASD but with more short periods of desynchronization. It also indicated that the relationship between the properties of neural synchronization and behavior differs between ASD and typically developing populations. Recent theoretical studies suggest that neural networks with a high desynchronization ratio have increased sensitivity to inputs. Our results point to the potential significance of this phenomenon for the autistic brain. This sensitivity may disrupt the production of appropriate neural and behavioral responses to external stimuli. Cognitive processes dependent on the integration of activity from multiple networks may be, as a result, particularly vulnerable to disruption. © 2019 International Society for Autism Research, Wiley Periodicals, Inc. Autism Res 2020, 13: 24–31.
Lay Summary: Parts of the brain can work together by synchronizing the activity of the neurons. We recorded the electrical activity of the brain in adolescents with autism spectrum disorder and then compared the recording to that of their peers without the diagnosis. We found that in participants with autism, there were a lot of very short time periods of non-synchronized activity between frontal and parietal parts of the brain. Mathematical models show that a brain system with this kind of activity is very sensitive to external events.
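The measures named in this abstract, phase-locking strength and a desynchronization ratio, are commonly computed from band-passed EEG phases. Below is a minimal sketch of that style of analysis using the Hilbert transform; the band edges, window length, and synchronization threshold are assumptions and do not reproduce the study's exact pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Sketch of windowed phase-locking between two channels and a simple
# desynchronization measure. Band edges, window length, and the threshold
# are assumptions, not the study's exact analysis pipeline.

def band_phase(x, fs, lo, hi, order=4):
    """Instantaneous phase of x band-passed to [lo, hi] Hz."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.angle(hilbert(filtfilt(b, a, x)))

def phase_locking(phi_a, phi_b, win):
    """Phase-locking value in consecutive non-overlapping windows."""
    dphi = phi_a - phi_b
    n = len(dphi) // win
    return np.array([np.abs(np.mean(np.exp(1j * dphi[i * win:(i + 1) * win])))
                     for i in range(n)])

fs = 250.0                                       # sampling rate (assumed)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
frontal = np.sin(2 * np.pi * 10 * t) + rng.normal(scale=0.5, size=t.size)
parietal = np.sin(2 * np.pi * 10 * t + 0.3) + rng.normal(scale=0.5, size=t.size)

plv = phase_locking(band_phase(frontal, fs, 8, 12),
                    band_phase(parietal, fs, 8, 12),
                    win=int(0.5 * fs))           # 0.5 s windows (assumed)
desynchronized = plv < 0.5                       # threshold is an assumption
print("mean PLV:", plv.mean())
print("fraction of windows desynchronized:", desynchronized.mean())
```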
- Serre, Thomas (Ed.) Experience shapes our expectations and helps us learn the structure of the environment. Inference models render such learning as a gradual refinement of the observer's estimate of the environmental prior. For instance, when retaining an estimate of an object's features in working memory, learned priors may bias the estimate in the direction of common feature values. Humans display such biases when retaining color estimates over short time intervals. We propose that these systematic biases emerge from modulation of synaptic connectivity in a neural circuit based on the experienced stimulus history, shaping the persistent and collective neural activity that encodes the stimulus estimate. The resulting neural activity attractors are aligned to common stimulus values. Using recently published human response data from a delayed-estimation task in which stimuli (colors) were drawn from a heterogeneous distribution that did not necessarily correspond with reported population biases, we confirm that most subjects' response distributions are better described by experience-dependent learning models than by models with fixed biases. This work suggests that systematic limitations in working memory reflect efficient representations of inferred environmental structure, providing new insights into how humans integrate environmental knowledge into their cognitive strategies.
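As a loose illustration of the experience-dependent bias idea, in which an internal prior is gradually refined from stimulus history and then pulls noisy working-memory estimates toward common values, the following sketch simulates a delayed-estimation task on a circular feature space. The learning rate, memory noise, and drift rule are assumptions standing in for, not reproducing, the paper's neural-circuit model.

```python
import numpy as np

# Toy simulation of experience-dependent bias in delayed estimation on a
# circular feature space (e.g., hue in radians). Learning rate, memory noise,
# and the drift rule are assumptions, not the paper's circuit model.

def circ_dist(a, b):
    """Signed circular distance from b to a, in (-pi, pi]."""
    return np.angle(np.exp(1j * (a - b)))

rng = np.random.default_rng(2)
bins = np.linspace(-np.pi, np.pi, 64, endpoint=False)
prior = np.ones_like(bins) / bins.size            # start from a flat prior

# Environment with two common feature values (an assumed stimulus distribution).
stimuli = np.concatenate([rng.vonmises(0.0, 4.0, 500),
                          rng.vonmises(np.pi / 2, 4.0, 500)])
rng.shuffle(stimuli)

responses = []
for s in stimuli:
    mem = s + rng.normal(scale=0.2)               # noisy working-memory trace
    # Drift the trace toward the nearby peak of the learned prior.
    weight = prior * np.exp(-0.5 * (circ_dist(bins, mem) / 0.5) ** 2)
    peak = bins[np.argmax(weight)]
    responses.append(mem + 0.3 * circ_dist(peak, mem))
    # Gradually refine the prior from the experienced stimulus.
    prior = 0.995 * prior
    prior[np.argmin(np.abs(circ_dist(bins, s)))] += 0.005
    prior /= prior.sum()

err = np.abs(circ_dist(np.array(responses), stimuli))
print("mean absolute deviation of responses from stimuli (rad):", err.mean())
```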
- Neural activity underlying working memory is not a local phenomenon but is distributed across multiple brain regions. To elucidate the circuit mechanism of such distributed activity, we developed an anatomically constrained computational model of the large-scale macaque cortex. We found that mnemonic internal states may emerge from inter-areal reverberation, even in a regime where none of the isolated areas is capable of generating self-sustained activity. The mnemonic activity pattern along the cortical hierarchy indicates a transition in space, separating areas that are engaged in working memory from those that are not. A host of spatially distinct attractor states is found, potentially subserving various internal processes. The model yields testable predictions, including the idea of counterstream inhibitory bias, the role of prefrontal areas in controlling distributed attractors, and the resilience of distributed activity to lesions or inactivation. This work provides a theoretical framework for identifying large-scale brain mechanisms and computational principles of distributed cognitive processes.
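The core mechanism described here, persistent activity arising from inter-areal loops even when each isolated area decays, can be illustrated with a toy two-area rate model. All parameters below are assumptions chosen to show the qualitative effect and are unrelated to the anatomically constrained macaque model.

```python
import numpy as np

# Toy two-area rate model: local recurrence alone is too weak to sustain
# activity, but mutual long-range excitation keeps a memory of a transient
# input. All parameters are assumptions chosen to show the qualitative effect.

def simulate(w_local, w_long, steps=2000, dt=0.001, tau=0.06, r_max=50.0):
    r = np.zeros(2)
    for step in range(steps):
        stim = np.array([20.0, 0.0]) if 200 <= step < 400 else np.zeros(2)
        inp = w_local * r + w_long * r[::-1] + stim   # local + cross-area drive
        drdt = (-r + np.clip(inp, 0.0, r_max)) / tau  # saturating rate dynamics
        r = r + dt * drdt
    return r

isolated = simulate(w_local=0.7, w_long=0.0)   # no loop: activity decays to zero
coupled = simulate(w_local=0.7, w_long=0.4)    # loop gain > 1: activity persists
print("final rates, isolated areas:", isolated)
print("final rates, coupled areas:", coupled)
```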
- We present a formal, mathematical foundation for modeling and reasoning about the behavior of synchronous, stochastic Spiking Neural Networks (SNNs), which have been widely used in studies of neural computation. Our approach follows paradigms established in the field of concurrency theory. Our SNN model is based on directed graphs of neurons, classified as input, output, and internal neurons. We focus here on basic SNNs, in which a neuron's only state is a Boolean value indicating whether or not the neuron is currently firing. We also define the external behavior of an SNN in terms of probability distributions on its external firing patterns. We define two operators on SNNs: a composition operator, which supports modeling of SNNs as combinations of smaller SNNs, and a hiding operator, which reclassifies some output behavior of an SNN as internal. We prove results showing how the external behavior of a network built using these operators is related to the external behavior of its component networks. Finally, we define the notion of a problem to be solved by an SNN and show how the composition and hiding operators affect the problems that are solved by the networks. We illustrate our definitions with three examples: a Boolean circuit constructed from gates, an Attention network constructed from a Winner-Take-All network and a Filter network, and a toy example involving combining two networks in a cyclic fashion.
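A "basic SNN" in the sense used above, with synchronous rounds, stochastic firing, and Boolean neuron state, can be sketched in a few lines. The sigmoidal firing probability, the weights, and the tiny two-input/one-output network below are illustrative assumptions, not the paper's formal definitions or examples.

```python
import numpy as np

# Sketch of a synchronous, stochastic SNN with Boolean neuron state: on each
# round every neuron fires with a probability determined by its weighted
# input. The sigmoid, weights, and this 2-input / 1-output network are
# illustrative assumptions, not the paper's formal definitions.

def snn_round(firing, W, bias, rng):
    """One synchronous update; returns the new Boolean firing vector."""
    potential = W @ firing.astype(float) + bias
    p_fire = 1.0 / (1.0 + np.exp(-potential))     # stochastic firing probability
    return rng.random(p_fire.shape) < p_fire

rng = np.random.default_rng(3)
# Neurons 0 and 1 are inputs; neuron 2 is an output that tends to fire when
# both inputs fired on the previous round.
W = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0],
              [4.0, 4.0, 0.0]])
bias = np.array([0.0, 0.0, -6.0])

firing = np.array([True, True, False])
fired = 0
for _ in range(1000):
    nxt = snn_round(firing, W, bias, rng)
    nxt[:2] = firing[:2]                          # inputs are externally clamped
    fired += int(nxt[2])
    firing = nxt
print("output firing frequency with both inputs on:", fired / 1000)
```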