
Noisy Softplus: an activation function that enables SNNs to be trained as ANNs

by   Qian Liu, et al.

We extend the previously proposed activation function, Noisy Softplus, to the training of multi-layer spiking neural networks (SNNs). Any ANN employing Noisy Softplus neurons, even a deep architecture, can thus be trained with a traditional algorithm such as back-propagation (BP), and the trained weights can be used directly in the spiking version of the same network without any conversion. The training method also generalises to other activation units, for instance Rectified Linear Units (ReLU), to train deep SNNs off-line. This research provides an effective approach to SNN training, increases the classification accuracy of SNNs with biological characteristics, and narrows the performance gap between SNNs and ANNs.
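As a rough illustration of the idea, the sketch below implements a Noisy Softplus-style activation: a Softplus curve whose sharpness is scaled by the noise level, so that it approaches ReLU for large inputs. The parameter names (`sigma` for the noise level, `k` for the curve-fitting constant) and the default values are illustrative assumptions, not the authors' fitted values.

```python
import numpy as np

def noisy_softplus(x, sigma=1.0, k=0.2):
    """Noisy Softplus-style activation (sketch, not the paper's exact fit).

    A scaled Softplus: f(x) = k*sigma * log(1 + exp(x / (k*sigma))).
    - sigma: noise level modulating the curve's smoothness (assumed name).
    - k: curve-fitting constant (value here is illustrative).
    For large positive x the output approaches x, i.e. ReLU-like behaviour,
    which is why training can generalise to ReLU units as the text notes.
    """
    s = k * sigma
    return s * np.log1p(np.exp(np.asarray(x, dtype=float) / s))

# For large inputs the activation is close to the identity (ReLU regime),
# while near zero it stays smooth and strictly positive.
print(noisy_softplus(5.0))   # close to 5.0
print(noisy_softplus(0.0))   # small positive value, k*sigma*log(2)
```

Because the function is smooth and differentiable everywhere, it can be dropped into standard BP training, unlike the discontinuous spiking dynamics it approximates.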

