As neural networks have become increasingly prevalent solutions to modern problems in science and engineering, there has been a corresponding rise in the popularity of the numerical machine learning techniques used to design them. While numerical methods are highly generalizable, they also tend to produce unintuitive networks with inscrutable behavior. One solution to the problem of network interpretability is to use analytical design techniques, but these methods are relatively underdeveloped compared to their numerical alternatives. To increase the adoption of analytical techniques and eventually facilitate the symbiotic integration of both design strategies, it is necessary to improve the efficacy of analytical methods on fundamental function approximation tasks that can be used to perform more complex operations. Toward this end, this manuscript extends the design constraints of the addition and subtraction subnetworks of the functional subnetwork approach (FSA) to arbitrarily many inputs, and then derives new constraints for an alternative neural encoding/decoding scheme. This encoding/decoding scheme stores information in the activation ratio of a subnetwork's neurons, rather than directly in their membrane voltages. We show that our new "relative" encoding/decoding scheme has both qualitative and quantitative advantages compared to the existing "absolute" encoding/decoding scheme, including helping to mitigate saturation and improving approximation accuracy. Our relative encoding scheme will be extended to other functional subnetworks in future work to assess its advantages on more complex operations.
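As a concrete illustration of the distinction between the two schemes, the sketch below contrasts absolute and relative encoding for a multi-input addition operation. The function names, the voltage range `R_ABS`, and the normalization choices are illustrative assumptions, not the paper's actual FSA design constraints:

```python
# Illustrative contrast between "absolute" encoding (information stored
# directly in membrane voltages) and "relative" encoding (information
# stored as an activation ratio). All values are assumptions for this sketch.

R_ABS = 20e-3  # assumed operating voltage range of each neuron, in volts

def absolute_encode(x, x_max):
    """Map a stimulus x in [0, x_max] directly onto a membrane voltage."""
    return (x / x_max) * R_ABS

def absolute_decode(u, x_max):
    return (u / R_ABS) * x_max

def relative_encode(x, x_max):
    """Encode x as an activation ratio in [0, 1] (voltage / voltage range)."""
    return x / x_max

def relative_decode(p, x_max):
    return p * x_max

def absolute_add(voltages):
    # Summing voltages directly can exceed the neuron's operating range,
    # at which point the output saturates and information is lost.
    return min(sum(voltages), R_ABS)

def relative_add(ratios):
    # Averaging activation ratios keeps the output in [0, 1] by
    # construction, which mitigates saturation; the decoded sum is
    # recovered by rescaling with the number of inputs.
    return sum(ratios) / len(ratios)
```

In this toy setup, two inputs of 0.75 (normalized units) saturate the absolute adder at `R_ABS`, whereas the relative adder returns their mean, which decodes to the correct sum once rescaled by the input count.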
A Functional Subnetwork Approach to Multistate Central Pattern Generator Phase Difference Control
Central pattern generators (CPGs) are ubiquitous neural circuits that contribute to an eclectic collection of rhythmic behaviors across an equally diverse assortment of animal species. Due to their prominent role in many neuromechanical phenomena, numerous bioinspired robots have been designed to both investigate and exploit the operation of these neural oscillators. To serve as effective tools for these robotics applications, however, the phase alignment of multiple CPGs must often be adjustable during operation. To achieve this goal, we present the design of our phase difference control (PDC) network using a functional subnetwork approach (FSA), wherein subnetworks that perform basic mathematical operations are assembled such that they serve to control the relative phase lead/lag of target CPGs. Our PDC network operates by first estimating the phase difference between two CPGs, then comparing this phase difference to a reference signal that encodes the desired phase difference, and finally eliminating any error by emulating a proportional controller that adjusts the CPG oscillation frequencies. The architecture of our PDC network, as well as its various parameters, are all determined via analytical design rules that allow for direct interpretability of the network behavior. Simulation results for both the complete PDC network and a selection of its functional subnetworks are provided to demonstrate the efficacy of our methodology.
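The control loop described above (estimate the phase difference, compare it to a reference, and correct the error proportionally) can be sketched with ordinary phase oscillators standing in for the CPGs. The gain, frequencies, and phase-wrapping shortcut below are assumptions for illustration, not the paper's neural implementation:

```python
# Minimal sketch of a proportional phase difference controller, assuming
# two ideal phase oscillators in place of the spiking CPG models.
import math

def pdc_step(phi_lead, phi_follow, phi_desired, omega_nominal, kp=2.0):
    """One proportional-control update of the follower oscillator's frequency."""
    # Wrap the phase error into (-pi, pi] so the controller takes the
    # short way around the circle.
    error = (phi_desired - (phi_lead - phi_follow) + math.pi) % (2 * math.pi) - math.pi
    # Slowing the follower (error > 0) grows the lead/lag; speeding it
    # up (error < 0) shrinks it.
    return omega_nominal - kp * error

def simulate(phi_desired, omega=2 * math.pi, dt=1e-3, steps=20000):
    """Integrate both oscillators until the follower locks at the desired lag."""
    phi_lead, phi_follow = 0.0, 0.0
    for _ in range(steps):
        omega_follow = pdc_step(phi_lead, phi_follow, phi_desired, omega)
        phi_lead += omega * dt
        phi_follow += omega_follow * dt
    return (phi_lead - phi_follow) % (2 * math.pi)
```

With a proportional gain of 2, the phase error decays exponentially, so after 20 s of simulated time the achieved phase lag matches the commanded one to within numerical tolerance.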
- Award ID(s): 2015317
- PAR ID: 10424820
- Date Published:
- Journal Name: Biomimetic and Biohybrid Systems
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation