Fast and Efficient Information Transmission with Burst Spikes in Deep Spiking Neural Networks

09/10/2018
by Seongsik Park, et al.

Spiking neural networks (SNNs), the third generation of neural networks, are considered one of the most promising artificial neural networks due to their energy-efficient computing capability. Despite this potential, SNNs have limited applicability owing to difficulties in training. Recently, conversion of a trained deep neural network (DNN) model into an SNN model has been studied extensively as an alternative approach, and the results appear comparable to those of DNNs on image classification tasks. However, rate coding, one of the neural coding schemes used in SNNs, suffers from long latency because it cannot transmit sufficient information to subsequent neurons, which can have a catastrophic effect on deeper SNN models. Another type of neural coding, phase coding, determines the amount of information being transmitted according to a global reference oscillator and is therefore inefficient in hidden layers, where the dynamics of neurons can change. In this paper, we propose a deep SNN model that can transmit information between neurons faster and more efficiently by adopting the notion of burst spiking. Furthermore, we introduce a novel hybrid neural coding scheme that applies different neural coding schemes to different types of layers. Our experimental results on various image classification datasets, such as MNIST, CIFAR-10, and CIFAR-100, show that the proposed methods improve inference efficiency and shorten latency while preserving high accuracy. Lastly, we validate the proposed methods through firing pattern analysis.
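To make the burst-spiking idea concrete, below is a minimal, illustrative sketch (not the authors' implementation) of an integrate-and-fire neuron extended so that, when its membrane potential exceeds the firing threshold by several multiples, it emits a burst of spikes in a single time step instead of one spike, transmitting more information per step. The class and parameter names (BurstIFNeuron, max_burst) and the soft-reset behavior are assumptions for illustration only.

```python
import numpy as np

class BurstIFNeuron:
    """Illustrative integrate-and-fire neuron with burst spiking (hypothetical sketch)."""

    def __init__(self, threshold=1.0, max_burst=5):
        self.threshold = threshold   # firing threshold
        self.max_burst = max_burst   # cap on spikes emitted per time step
        self.v = 0.0                 # membrane potential

    def step(self, input_current):
        """Integrate the input and return the number of spikes emitted this step."""
        self.v += input_current
        if self.v < self.threshold:
            return 0
        # The burst size is how many thresholds the potential covers, capped at
        # max_burst; the transmitted charge is subtracted (soft reset).
        n_spikes = min(int(self.v // self.threshold), self.max_burst)
        self.v -= n_spikes * self.threshold
        return n_spikes

# Usage: a strong input produces a burst, a weak input only accumulates.
neuron = BurstIFNeuron()
print(neuron.step(3.4))  # 3 spikes in one time step
print(neuron.step(0.4))  # 0 spikes (potential keeps accumulating)
```

A rate-coded neuron in the same situation would need several time steps to emit those three spikes, which is the latency gap that burst spiking is meant to close.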
