Pruning Early Exit Networks

07/08/2022
by Alperen Gormez, et al.

Deep learning models that perform well often have high computational costs. In this paper, we combine two approaches that aim to reduce the computational cost while keeping model performance high: pruning and early exit networks. We evaluate two approaches to pruning early exit networks: (1) pruning the entire network at once, and (2) pruning the base network and the additional linear classifiers in an ordered fashion. Experimental results show that pruning the entire network at once is generally the better strategy. However, at high accuracy rates, the two approaches perform similarly, which implies that pruning and early exit can be treated as separate processes without loss of optimality.
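To make the two pruning strategies concrete, below is a minimal sketch assuming a PyTorch-style early exit network. The architecture, layer sizes, confidence threshold, and pruning amounts are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of an early exit network and the two pruning strategies
# described in the abstract. All sizes and thresholds here are assumptions.
import torch.nn as nn
import torch.nn.utils.prune as prune


class EarlyExitNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Base network split into two stages so an exit can sit in between.
        self.stage1 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                                    nn.AdaptiveAvgPool2d(8))
        self.stage2 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                                    nn.AdaptiveAvgPool2d(4))
        # Additional linear classifiers: one early exit plus the final exit.
        self.exit1 = nn.Linear(16 * 8 * 8, num_classes)
        self.exit2 = nn.Linear(32 * 4 * 4, num_classes)

    def forward(self, x, threshold: float = 0.9):
        h1 = self.stage1(x)
        logits1 = self.exit1(h1.flatten(1))
        # Exit early if the intermediate classifier is confident enough
        # (per-sample gating omitted for brevity; this check is batch-wide).
        if logits1.softmax(-1).max() > threshold:
            return logits1
        return self.exit2(self.stage2(h1).flatten(1))


# Strategy 1: prune the entire network (base stages and exits) at once,
# letting global magnitude ranking decide where the sparsity goes.
model = EarlyExitNet()
prunable = [(m, "weight") for m in model.modules()
            if isinstance(m, (nn.Conv2d, nn.Linear))]
prune.global_unstructured(prunable, pruning_method=prune.L1Unstructured, amount=0.5)

# Strategy 2: prune the base network first, then each linear classifier in
# order. Shown on a fresh model; in practice each step would typically be
# followed by fine-tuning before moving on to the next component.
model2 = EarlyExitNet()
for m in [*model2.stage1, *model2.stage2]:
    if isinstance(m, nn.Conv2d):
        prune.l1_unstructured(m, name="weight", amount=0.5)
for exit_head in [model2.exit1, model2.exit2]:
    prune.l1_unstructured(exit_head, name="weight", amount=0.5)
```

The key difference in the sketch is that strategy 1 pools the base layers and exit classifiers together and prunes them jointly, while strategy 2 fixes an order and prunes each component separately.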

