A Little Energy Goes a Long Way: Energy-Efficient, Accurate Conversion from Convolutional Neural Networks to Spiking Neural Networks

03/01/2021
by   Dengyu Wu, et al.

Spiking neural networks (SNNs) offer an inherent ability to process spatial-temporal data, or in other words, real-world sensory data, but suffer from the difficulty of training high-accuracy models. A major thread of research on SNNs converts a pre-trained convolutional neural network (CNN) to an SNN of the same structure. State-of-the-art conversion methods are approaching the accuracy limit, i.e., near-zero accuracy loss of the SNN relative to the original CNN. However, we note that this is made possible only when significantly more energy is consumed to process an input. In this paper, we argue that this trend of "energy for accuracy" is not necessary: a little energy can go a long way towards achieving near-zero accuracy loss. Specifically, we propose a novel CNN-to-SNN conversion method that is able to use a reasonably short spike train (e.g., 256 timesteps for CIFAR10 images) to achieve near-zero accuracy loss. The new conversion method, named explicit current control (ECC), contains three techniques (current normalisation, thresholding for residual elimination, and consistency maintenance for batch-normalisation) in order to explicitly control the currents flowing through the SNN when processing inputs. We implement ECC in a tool nicknamed SpKeras, which can conveniently import Keras CNN models and convert them into SNNs. We conduct an extensive set of experiments with the tool, working with VGG16 and various datasets such as CIFAR10 and CIFAR100, and compare with state-of-the-art conversion methods. Results show that ECC is a promising method that can optimise energy consumption and accuracy loss simultaneously.
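The abstract describes ECC only at a high level, so the sketch below illustrates the general rate-based CNN-to-SNN conversion idea it builds on: scale the layer's currents so that activations map onto bounded firing rates, then drive integrate-and-fire neurons with the input for a fixed number of timesteps. This is a minimal, hypothetical illustration for a single fully connected ReLU layer, not the authors' SpKeras/ECC code; the helper names (convert_weights, snn_forward), the soft-reset neuron model, and the calibration-based scaling are assumptions made for illustration.

```python
import numpy as np

# Hypothetical sketch (not the paper's SpKeras/ECC implementation): rate-based
# CNN-to-SNN conversion for one fully connected ReLU layer.  Weights are scaled
# by the layer's maximum pre-activation on calibration data, then the input is
# presented for T timesteps to integrate-and-fire neurons whose firing rates
# approximate the (rescaled) ReLU outputs.

def relu(x):
    return np.maximum(x, 0.0)

def convert_weights(W, b, calib_inputs):
    """Scale weights/biases so the maximum activation on calibration data is 1."""
    max_act = max(relu(calib_inputs @ W + b).max(), 1e-9)
    return W / max_act, b / max_act, max_act

def snn_forward(W, b, x, timesteps=256, threshold=1.0):
    """Simulate integrate-and-fire neurons; return spike rates per output neuron."""
    v = np.zeros(W.shape[1])           # membrane potentials
    spikes = np.zeros(W.shape[1])      # spike counts
    for _ in range(timesteps):
        v += x @ W + b                 # constant input current at each timestep
        fired = v >= threshold
        spikes += fired
        v[fired] -= threshold          # soft reset (reset by subtraction)
    return spikes / timesteps

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W, b = rng.normal(size=(8, 4)), rng.normal(size=4) * 0.1
    calib = rng.uniform(size=(100, 8))
    x = rng.uniform(size=8)

    Wn, bn, scale = convert_weights(W, b, calib)
    rates = snn_forward(Wn, bn, x, timesteps=256)
    print("CNN ReLU output :", relu(x @ W + b))
    print("SNN rate * scale:", rates * scale)   # approximates the ReLU output
```

With 256 timesteps the rescaled firing rates closely approximate the ReLU outputs; longer spike trains reduce the quantisation error at the cost of more spikes, which is the "energy for accuracy" trade-off the paper targets.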


Related research

- 12/03/2019: Optimizing the energy consumption of spiking neural networks for neuromorphic applications
- 08/09/2022: A Time-to-first-spike Coding and Conversion Aware Training for Energy-Efficient Deep Spiking Neural Network Processor Design
- 05/25/2021: Optimal ANN-SNN Conversion for Fast and Accurate Inference in Deep Spiking Neural Networks
- 12/13/2016: Theory and Tools for the Conversion of Analog to Spiking Convolutional Neural Networks
- 06/22/2020: On the Ability of a CNN to Realize Image-to-Image Language Conversion
- 04/07/2021: PrivateSNN: Fully Privacy-Preserving Spiking Neural Networks
