Nonlinear computations in spiking neural networks through multiplicative synapses

09/08/2020
by Michele Nardin, et al.

The brain performs many nonlinear computations through intricate spiking neural networks (SNNs). How the dynamics of such networks relate to arbitrary computations is still an open question. As strong constraints, these networks are hypothesized to be robust to perturbations and to use minimal energy. The theory of Spike Coding Networks (SCNs) derives the required connectivity and dynamics for both information representation and linear dynamical systems from first principles, and achieves robustness and efficiency. Nonlinear dynamical systems have thus far only been implemented in SCNs by filtering neural inputs through sets of nonlinear dendritic basis functions. While this approach works well, it relies on providing a sufficiently rich basis set as well as supervised training of the connectivity weights. Another way to implement nonlinear computations is through multiplicatively interacting synapses. However, there is currently no principled way to implement such synapses in SCNs. Here, we extend the core SCN derivations to implement polynomial dynamical systems, from which the need for such multiplicatively interacting synapses arises naturally. We demonstrate our approach with a highly accurate Lorenz attractor implementation, as well as a second-order approximation of a double pendulum. We additionally demonstrate how to implement higher-order polynomials using sequential networks with only pair-wise synapses. Finally, we derive upper bounds and expected numbers of connections based on the sparsity of the underlying representation. Overall, our work provides an alternative way to directly implement nonlinear computations in spike coding networks, and expands our understanding of the potential functions of multiplicative synapses. Furthermore, due to the high accuracy and low energy usage of our approach, this work may be of interest for neuromorphic computing.
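To make concrete why pair-wise multiplicative synapses suffice for the Lorenz attractor, here is a minimal sketch of the target dynamics only (not the SCN implementation described in the paper): the Lorenz system is a quadratic polynomial dynamical system, so its only nonlinearities are pairwise products of state variables. The parameter values and function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Classic chaotic-regime parameters (illustrative choice, not from the paper).
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def lorenz_deriv(state):
    """Right-hand side of the Lorenz equations.

    The system is quadratic: the only nonlinear terms are the pairwise
    products x*z and x*y, which is the kind of interaction that pair-wise
    multiplicative synapses can represent.
    """
    x, y, z = state
    return np.array([
        SIGMA * (y - x),      # linear
        x * (RHO - z) - y,    # contains the product x*z
        x * y - BETA * z,     # contains the product x*y
    ])

def simulate(state0, dt=1e-3, steps=20000):
    """Forward-Euler integration of the target dynamics (sketch only)."""
    traj = np.empty((steps, 3))
    state = np.asarray(state0, dtype=float)
    for i in range(steps):
        state = state + dt * lorenz_deriv(state)
        traj[i] = state
    return traj

traj = simulate([1.0, 1.0, 1.0])
```

The trajectory traced out here is the reference signal an SCN implementation would aim to reproduce in its decoded read-out; higher-order polynomial systems would analogously require products of more than two variables, which the paper handles via sequential networks of pair-wise synapses.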


