Pruning of Convolutional Neural Networks Using Ising Energy Model

02/10/2021
by Hojjat Salehinejad, et al.

Pruning is one of the major methods for compressing deep neural networks. In this paper, we propose an Ising energy model within an optimization framework for pruning convolutional kernels and hidden units. The model is designed to reduce redundancy between weight kernels and to detect inactive kernels and hidden units. Our experiments with ResNets, AlexNet, and SqueezeNet on the CIFAR-10 and CIFAR-100 datasets show that the proposed method can, on average, prune more than 50% of the trainable parameters with drops of less than 10% in Top-1 and less than 5% in Top-5 classification accuracy.
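The abstract describes selecting kernels by minimizing an Ising-style energy that penalizes redundancy between kernels and favors active ones. The sketch below is a rough, hypothetical illustration of that idea, not the authors' formulation: pairwise cosine similarity between flattened kernels plays the role of the coupling term, kernel magnitude plays the role of the bias, and the energy is minimized greedily.

```python
import numpy as np

def ising_prune(kernels, keep_ratio=0.5):
    """Greedily select convolutional kernels to keep using an Ising-style
    energy E(s) = sum_{i<j} J_ij s_i s_j - sum_i h_i s_i, where s_i = 1
    means kernel i is kept. Couplings J_ij (cosine similarity) penalize
    keeping redundant kernels; biases h_i (L1 magnitude) reward keeping
    active ones. `kernels` has shape (n_kernels, ...). This is an
    illustrative sketch, not the paper's exact algorithm."""
    flat = kernels.reshape(kernels.shape[0], -1)

    # Couplings: cosine similarity between kernel pairs (redundancy).
    unit = flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12)
    J = unit @ unit.T
    np.fill_diagonal(J, 0.0)

    # Biases: normalized L1 magnitude as a proxy for kernel activity.
    h = np.abs(flat).sum(axis=1)
    h = h / (h.max() + 1e-12)

    n_keep = max(1, int(round(keep_ratio * flat.shape[0])))
    kept, remaining = [], list(range(flat.shape[0]))
    for _ in range(n_keep):
        # Adding kernel i changes the energy by J[i, kept].sum() - h[i];
        # pick the kernel with the smallest increase.
        deltas = [J[i, kept].sum() - h[i] for i in remaining]
        best = remaining[int(np.argmin(deltas))]
        kept.append(best)
        remaining.remove(best)
    return sorted(kept)
```

For example, for a layer with 64 kernels of shape (3, 3, 3), `ising_prune(weights, keep_ratio=0.5)` would return the indices of 32 kernels to retain; the rest would be pruned.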


research · 02/25/2021
A Framework For Pruning Deep Neural Networks Using Energy-Based Models
A typical deep neural network (DNN) has a large number of trainable para...

research · 07/09/2021
Structured Model Pruning of Convolutional Networks on Tensor Processing Units
The deployment of convolutional neural networks is often hindered by hig...

research · 06/29/2019
Dissecting Pruned Neural Networks
Pruning is a standard technique for removing unnecessary structure from ...

research · 01/22/2021
Baseline Pruning-Based Approach to Trojan Detection in Neural Networks
This paper addresses the problem of detecting trojans in neural networks...

research · 07/01/2020
Single Shot Structured Pruning Before Training
We introduce a method to speed up training by 2x and inference by 3x in ...

research · 05/03/2022
Compact Neural Networks via Stacking Designed Basic Units
Unstructured pruning has the limitation of dealing with the sparse and i...

research · 06/15/2018
Detecting Dead Weights and Units in Neural Networks
Deep Neural Networks are highly over-parameterized and the size of the n...
