As neural networks have become increasingly prevalent solutions to modern problems in science and engineering, there has been a corresponding rise in the popularity of the numerical machine learning techniques used to design them. While numerical methods are highly generalizable, they also tend to produce unintuitive networks with inscrutable behavior. One solution to the problem of network interpretability is to use analytical design techniques, but these methods are relatively underdeveloped compared to their numerical alternatives. To increase the adoption of analytical techniques and eventually facilitate the symbiotic integration of both design strategies, it is necessary to improve the efficacy of analytical methods on the fundamental function approximation tasks that underpin more complex operations. Toward this end, this manuscript extends the design constraints of the addition and subtraction subnetworks of the functional subnetwork approach (FSA) to arbitrarily many inputs, and then derives new constraints for an alternative neural encoding/decoding scheme. This scheme stores information in the activation ratio of a subnetwork’s neurons rather than directly in their membrane voltages. We show that our new “relative” encoding/decoding scheme has both qualitative and quantitative advantages over the existing “absolute” scheme, including mitigating saturation and improving approximation accuracy. In future work, we will extend the relative encoding scheme to other functional subnetworks to assess its advantages on more complex operations.
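For context, the distinction between the two schemes can be sketched as follows. The notation (U_i for a neuron's membrane voltage above rest, R_i for its operating range) and the addition formulas below are illustrative assumptions for an n-input addition subnetwork, not expressions quoted from the manuscript.

% Sketch under assumed notation: U_i is neuron i's membrane voltage
% above rest, R_i its operating range, and x_i the encoded quantity.

% Absolute scheme: the voltage itself carries the value,
% x_i = (x_{max,i} / R_i) U_i, so an n-input addition subnetwork
% sums voltages directly:
\[
  U_{\mathrm{out}} \approx \sum_{i=1}^{n} U_i ,
\]
% and the output neuron's required range grows with n,
% which promotes saturation.

% Relative scheme: the value is carried by the activation ratio
% U_i / R_i \in [0, 1], and the output ratio is a normalized
% combination of the input ratios:
\[
  \frac{U_{\mathrm{out}}}{R_{\mathrm{out}}}
  \approx \frac{1}{n} \sum_{i=1}^{n} \frac{U_i}{R_i} ,
\]
% which remains bounded in [0, 1] regardless of the number of inputs.

Under this sketch, the saturation advantage claimed for the relative scheme is immediate: the output activation never exceeds the neuron's own operating range, no matter how many inputs are combined.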

