Fast ConvNets Using Group-wise Brain Damage

06/08/2015
by Vadim Lebedev, et al.

We revisit the idea of brain damage, i.e. the pruning of the coefficients of a neural network, and suggest how brain damage can be modified and used to speed up convolutional layers. The approach exploits the fact that many efficient implementations reduce generalized convolutions to matrix multiplications. The suggested brain-damage process prunes the convolutional kernel tensor in a group-wise fashion by adding group-sparsity regularization to the standard training process. After such group-wise pruning, convolutions can be reduced to multiplications of thinned dense matrices, which leads to a speedup. In a comparison on AlexNet, the method achieves very competitive performance.
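To make the idea concrete, below is a minimal NumPy sketch of the pipeline the abstract describes: convolution lowered to a matrix multiplication via im2col, a group-lasso penalty over rows of the reshaped kernel matrix (one group per input channel and kernel position, across all filters), and group-wise pruning that yields thinned dense matrices. This is not the authors' implementation; the shapes, the `im2col` helper, and the median threshold are illustrative assumptions.

```python
import numpy as np

# Illustrative shapes (assumptions, not from the paper).
C, kh, kw, F = 3, 3, 3, 8          # input channels, kernel size, filters
H, W = 6, 6                        # spatial size of the input feature map
rng = np.random.default_rng(0)

x = rng.standard_normal((C, H, W))
K = rng.standard_normal((C * kh * kw, F))   # kernel tensor reshaped to a matrix

def im2col(x, kh, kw):
    """Unfold the input into a (num_patches, C*kh*kw) matrix so that
    convolution becomes a single matrix multiplication."""
    C, H, W = x.shape
    out_h, out_w = H - kh + 1, W - kw + 1
    cols = np.empty((out_h * out_w, C * kh * kw))
    for i in range(out_h):
        for j in range(out_w):
            cols[i * out_w + j] = x[:, i:i + kh, j:j + kw].ravel()
    return cols

X = im2col(x, kh, kw)                       # (num_patches, C*kh*kw)

# Group-lasso penalty: one group per row of K, i.e. per (channel, kernel
# position) across all filters. Added to the training loss, it drives
# entire rows toward zero (a stand-in for the group-sparsity regularizer
# the abstract mentions).
group_l2 = np.linalg.norm(K, axis=1)        # l2 norm of each group
penalty = group_l2.sum()

# Group-wise pruning: drop rows of K whose group norm is small, and the
# matching columns of X. Both factors stay dense, so the product is a
# plain (thinned) dense matmul, exact over the surviving groups.
tau = np.median(group_l2)                   # illustrative threshold only
keep = group_l2 > tau
y_thin = X[:, keep] @ K[keep]               # (num_patches, F)

print("kept groups:", keep.sum(), "of", keep.size)
print("output shape:", y_thin.shape)
```

The key property the sketch shows is that, unlike unstructured sparsity, removing whole rows/columns keeps both operands dense, so standard BLAS-style matrix multiplication delivers the speedup directly.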


Related research

11/02/2022 · SIMD-size aware weight regularization for fast neural vocoding on CPU
This paper proposes weight regularization for a faster neural vocoder. P...

08/29/2020 · Accelerating Sparse DNN Models without Hardware-Support via Tile-Wise Sparsity
Network pruning can reduce the high computation cost of deep neural netw...

08/28/2021 · Layer-wise Model Pruning based on Mutual Information
The proposed pruning strategy offers merits over weight-based pruning te...

12/29/2015 · Structured Pruning of Deep Convolutional Neural Networks
Real time application of deep learning algorithms is often hindered by h...

09/16/2021 · Dense Pruning of Pointwise Convolutions in the Frequency Domain
Depthwise separable convolutions and frequency-domain convolutions are t...

11/13/2019 · Selective Brain Damage: Measuring the Disparate Impact of Model Pruning
Neural network pruning techniques have demonstrated it is possible to re...

01/13/2020 · Predicting population neural activity in the Algonauts challenge using end-to-end trained Siamese networks and group convolutions
The Algonauts challenge is about predicting the object representations i...
