HAKD: Hardware Aware Knowledge Distillation

10/24/2018
by Jack Turner, et al.

Despite recent developments, deploying deep neural networks on resource-constrained general-purpose hardware remains a significant challenge. There has been much work on methods for reshaping neural networks, usually with a focus on minimising total parameter count. These methods are typically developed in a hardware-agnostic manner and do not exploit hardware behaviour. In this paper we propose a new approach, Hardware Aware Knowledge Distillation (HAKD), which uses empirical observations of hardware behaviour to design efficient student networks that are then trained via knowledge distillation. This allows the trade-off between accuracy and performance to be managed explicitly. We have applied this approach across three platforms and evaluated it on two networks, MobileNet and DenseNet, on CIFAR-10. We show that HAKD outperforms Deep Compression and Fisher pruning in terms of size, accuracy and performance.
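The abstract does not spell out the distillation objective HAKD uses, but knowledge distillation in this setting typically follows the soft-target formulation of Hinton et al.: the student is trained against a mix of the teacher's temperature-softened output distribution and the true labels. The sketch below illustrates that standard loss in plain Python; the temperature and mixing weight `alpha` are illustrative defaults, not values from the paper.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher temperatures yield softer,
    # more informative distributions over classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.9):
    # Soft-target term: cross-entropy between the teacher's and the
    # student's softened distributions, scaled by T^2 so its gradient
    # magnitude is comparable to the hard-target term.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    soft = -sum(pt * math.log(ps) for pt, ps in zip(p_teacher, p_student))
    # Hard-target term: ordinary cross-entropy against the true label.
    q = softmax(student_logits)
    hard = -math.log(q[true_label])
    return alpha * (temperature ** 2) * soft + (1 - alpha) * hard
```

A student whose logits already match the teacher's (and rank the true class highest) incurs a lower loss than one that disagrees, which is what drives the compact student towards the teacher's behaviour during training.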

