An Efficient Approach to Boosting Performance of Deep Spiking Network Training

11/08/2016
by Seongsik Park, et al.
Seoul National University

Deep learning currently dominates the field of machine learning, achieving state-of-the-art performance in a wide range of application areas. Recently, spiking neural networks (SNNs) have attracted a great deal of attention, notably owing to their power efficiency, which could enable low-power deep learning engines suitable for real-time and mobile applications. However, implementing SNN-based deep learning remains challenging, especially gradient-based training of SNNs by error backpropagation. Errors cannot simply be propagated through SNNs in the conventional way, because SNNs process discrete data in the form of a series of spikes. Consequently, most previous studies employ a workaround: instead of training SNN parameters directly, they first train a conventional weighted-sum deep neural network and then map the learned weights onto the SNN. To eliminate this workaround, a new class of SNN named deep spiking networks (DSNs) was recently proposed; DSNs can be trained directly (without a mapping from a conventional deep network) by error backpropagation with stochastic gradient descent. In this paper, we show, through diverse experiments performed under various conditions, that the initialization of the membrane potential on the backward path is an important step in DSN training. Furthermore, we propose a simple and efficient method that improves DSN training by controlling the initial membrane potential on the backward path. In our experiments, adopting the proposed approach boosted the performance of DSN training in terms of convergence time and accuracy.
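The abstract does not spell out the DSN mechanics, but the core intuition can be illustrated with a minimal integrate-and-fire sketch: a neuron accumulates input into a membrane potential and spikes when it crosses a threshold, so the potential's initial value determines how soon the first spike (forward activation or backward error signal) is emitted. The function and values below are hypothetical illustrations, not the paper's actual method.

```python
def if_neuron(inputs, threshold=1.0, v_init=0.0):
    """Integrate-and-fire neuron: accumulate each input into the membrane
    potential and emit a spike (resetting by subtraction) whenever the
    threshold is crossed. Returns the binary spike train."""
    v = v_init
    spikes = []
    for x in inputs:
        v += x
        if v >= threshold:
            spikes.append(1)
            v -= threshold  # reset by subtraction keeps residual charge
        else:
            spikes.append(0)
    return spikes

# Illustration: the same constant error signal, fed through a backward-path
# neuron, fires earlier when the membrane potential starts closer to the
# threshold (here 0.5 vs. 0.0) -- the kind of effect the proposed
# initialization control exploits.
err = [0.3] * 10
late = if_neuron(err, v_init=0.0)   # first spike at step 3
early = if_neuron(err, v_init=0.5)  # first spike at step 1
print(late.index(1), early.index(1))  # -> 3 1
```

Under this toy model, tuning `v_init` on the backward path shifts when error spikes begin to flow, which is consistent with the paper's claim that the initial membrane potential affects convergence time.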


