Reducing the Model Order of Deep Neural Networks Using Information Theory

05/16/2016
by   Ming Tu, et al.

Deep neural networks typically have far more parameters than shallow models, making them prohibitively large for small-footprint devices. Recent research shows that there is considerable redundancy in the parameter space of deep neural networks. In this paper, we propose a method to compress deep neural networks using the Fisher Information metric, which we estimate with a stochastic optimization method that tracks second-order information in the network. We first remove unimportant parameters and then apply non-uniform fixed-point quantization, assigning more bits to parameters with higher Fisher Information estimates. We evaluate our method on a classification task with a convolutional neural network trained on the MNIST data set. Experimental results show that our method outperforms existing methods for both network pruning and quantization.
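The abstract ships no implementation, so the following is a minimal Python/NumPy sketch of the three-step pipeline it describes. The squared-gradient moving average in `update_fisher` (the diagonal empirical-Fisher proxy that Adam-style optimizers track), the two-level bit split in `quantize_nonuniform`, and all function and parameter names are illustrative assumptions, not the authors' exact estimator or bit-allocation scheme.

```python
import numpy as np

def update_fisher(fisher, grad, decay=0.99):
    """Running diagonal empirical-Fisher proxy: an exponential moving
    average of squared per-parameter gradients (illustrative choice)."""
    return decay * fisher + (1.0 - decay) * grad ** 2

def prune_by_fisher(weights, fisher, keep_ratio=0.5):
    """Zero out the parameters with the smallest Fisher estimates."""
    flat = fisher.ravel()
    k = max(1, int(flat.size * keep_ratio))            # parameters to keep
    threshold = np.partition(flat, flat.size - k)[flat.size - k]
    mask = fisher >= threshold
    return weights * mask, mask

def quantize_nonuniform(weights, fisher, low_bits=2, high_bits=8, split=0.9):
    """Fixed-point quantization that spends more bits on the weights
    whose Fisher estimates fall in the top (1 - split) quantile."""
    cutoff = np.quantile(fisher, split)
    out = np.zeros_like(weights)
    for sel, bits in ((fisher < cutoff, low_bits), (fisher >= cutoff, high_bits)):
        w = weights[sel]
        if w.size == 0:
            continue
        scale = max(float(np.abs(w).max()), 1e-12) / (2 ** (bits - 1) - 1)
        out[sel] = np.round(w / scale) * scale         # symmetric fixed point
    return out

# Toy usage: random draws stand in for per-minibatch gradients.
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
f = np.zeros_like(w)
for _ in range(100):
    f = update_fisher(f, rng.normal(size=w.shape))
w_pruned, mask = prune_by_fisher(w, f, keep_ratio=0.5)
w_quant = quantize_nonuniform(w_pruned, f)
```

The point of thresholding on the Fisher estimate rather than on weight magnitude is that a small weight with a large Fisher value still matters to the loss, so it survives pruning and receives higher precision.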

