Finding the Optimal Network Depth in Classification Tasks

04/17/2020
by Bartosz Wójcik, et al.

We develop a fast end-to-end method for training lightweight neural networks using multiple classifier heads. By allowing the model to determine the importance of each head and rewarding the choice of a single shallow classifier, we are able to detect and remove unneeded components of the network. This operation, which can be seen as finding the optimal depth of the model, significantly reduces the number of parameters and accelerates inference across different hardware processing units, which is not the case for many standard pruning methods. We show the performance of our method on multiple network architectures and datasets, analyze its optimization properties, and conduct ablation studies.
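The core idea can be illustrated with a small numerical sketch. This is not the authors' implementation: the head-importance logits, the 5% pruning threshold, and the expected-depth penalty used here are illustrative assumptions. It only shows the mechanism the abstract describes: attach a classifier head after each block, learn a soft weighting over heads, penalize expected depth so the mass concentrates on a single shallow head, then prune everything past the last head that matters.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over head-importance logits.
    e = np.exp(x - np.max(x))
    return e / e.sum()

n_heads = 4  # one classifier head after each of 4 backbone blocks

# In training these logits would be learned jointly with the network;
# the values here are made up for illustration.
head_logits = np.array([2.0, 0.5, -1.0, -1.5])
head_weights = softmax(head_logits)

# Depth penalty: the expected depth of the selected head. Adding a
# term like this to the task loss rewards choosing a single shallow
# classifier, as the abstract describes.
depths = np.arange(1, n_heads + 1)
expected_depth = float(head_weights @ depths)

# After training, heads (and all blocks beyond them) with negligible
# weight can be removed; the cut point is the model's "optimal depth".
keep = head_weights > 0.05  # illustrative threshold
optimal_depth = int(np.max(np.nonzero(keep)[0])) + 1  # → 2 here
```

Because entire trailing blocks are removed rather than individual weights being zeroed, the resulting network is genuinely smaller and faster on any hardware, which is the contrast the abstract draws with unstructured pruning.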


Related research

08/13/2023 · Neural Networks at a Fraction with Pruned Quaternions
Contemporary state-of-the-art neural networks have increasingly large nu...

07/01/2020 · Single Shot Structured Pruning Before Training
We introduce a method to speed up training by 2x and inference by 3x in ...

11/17/2020 · Multigrid-in-Channels Neural Network Architectures
We present a multigrid-in-channels (MGIC) approach that tackles the quad...

10/30/2019 · Lightweight and Efficient End-to-End Speech Recognition Using Low-Rank Transformer
High performing deep neural networks come at the cost of computational c...

05/16/2016 · Reducing the Model Order of Deep Neural Networks Using Information Theory
Deep neural networks are typically represented by a much larger number o...

01/14/2018 · Fix your classifier: the marginal value of training the last weight layer
Neural networks are commonly used as models for classification for a wid...

11/30/2022 · Average Path Length: Sparsification of Nonlinearities Creates Surprisingly Shallow Networks
We perform an empirical study of the behaviour of deep networks when pus...
