Distilling Spikes: Knowledge Distillation in Spiking Neural Networks

05/01/2020
by Ravi Kumar Kushawaha, et al.

Spiking Neural Networks (SNNs) are energy-efficient computing architectures that exchange spikes to process information, unlike classical Artificial Neural Networks (ANNs). This makes SNNs better suited for real-life deployments. However, like ANNs, SNNs benefit from deeper architectures to obtain improved performance, and their memory, compute, and power requirements likewise grow with model size, so model compression becomes a necessity. Knowledge distillation is a model compression technique that transfers the learning of a large machine learning model to a smaller model with minimal loss in performance. In this paper, we propose techniques for knowledge distillation in spiking neural networks for the task of image classification. We present ways to distill spikes from a larger SNN, called the teacher network, to a smaller one, called the student network, while minimally impacting classification accuracy. We demonstrate the effectiveness of the proposed method with detailed experiments on three standard datasets, and propose novel distillation methodologies and loss functions. We also present a multi-stage knowledge distillation technique for SNNs that uses an intermediate network to obtain higher performance from the student network. Our approach is expected to open new avenues for deploying high-performing large SNN models on resource-constrained hardware platforms.
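For intuition, the sketch below shows what a teacher-to-student distillation loss for SNNs might look like, assuming the classic Hinton-style soft-target formulation applied to time-averaged output spike counts. The function name, tensor shapes, and the temperature/alpha parameters are illustrative assumptions; the paper's actual distillation methodologies and loss functions are not reproduced here.

import torch
import torch.nn.functional as F

def snn_distillation_loss(student_spikes, teacher_spikes, labels,
                          temperature=4.0, alpha=0.5):
    """Hypothetical KD loss for SNNs (not the loss proposed in the paper).

    student_spikes, teacher_spikes: (T, batch, num_classes) output-layer
    spike tensors over T simulation steps; labels: (batch,) class indices.
    """
    # Time-averaged firing rates serve as surrogate logits for both networks.
    student_logits = student_spikes.mean(dim=0)
    teacher_logits = teacher_spikes.mean(dim=0)

    # Soft-target term: KL divergence between temperature-softened outputs,
    # scaled by T^2 as in standard knowledge distillation.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

    # Hard-target term: ordinary cross-entropy against the true labels.
    ce = F.cross_entropy(student_logits, labels)

    return alpha * kd + (1.0 - alpha) * ce

In a multi-stage setup of the kind the abstract describes, a loss of this shape would be applied twice: first to distill the teacher into an intermediate network, then to distill the intermediate network into the final student.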

Related research

06/14/2021
Energy-efficient Knowledge Distillation for Spiking Neural Networks
Spiking neural networks (SNNs) have been gaining interest as energy-effi...

04/17/2023
LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient Training in Deep Spiking Neural Networks
Spiking Neural Networks (SNNs) are biologically realistic and practicall...

10/24/2018
HAKD: Hardware Aware Knowledge Distillation
Despite recent developments, deploying deep neural networks on resource ...

04/12/2023
Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation
Spiking neural networks (SNNs) are well known as the brain-inspired mode...

08/29/2023
SpikeBERT: A Language Spikformer Trained with Two-Stage Knowledge Distillation from BERT
Spiking neural networks (SNNs) offer a promising avenue to implement dee...

10/08/2015
Distilling Model Knowledge
Top-performing machine learning systems, such as deep neural networks, l...

12/07/2020
Model Compression Using Optimal Transport
Model compression methods are important to allow for easier deployment o...
