Centripetal SGD for Pruning Very Deep Convolutional Networks with Complicated Structure

04/08/2019
by Xiaohan Ding, et al.

Redundancy is widely recognized in Convolutional Neural Networks (CNNs), which makes it possible to remove unimportant filters from convolutional layers and thus slim the network with an acceptable performance drop. Inspired by the linear and combinational properties of convolution, we seek to make some filters increasingly close to each other and eventually identical for network slimming. To this end, we propose Centripetal SGD (C-SGD), a novel optimization method that trains several filters to collapse into a single point in the parameter hyperspace. When training is completed, removing the identical filters trims the network with NO performance loss, so no fine-tuning is needed. By doing so, we partly solve an open problem of constrained filter pruning on CNNs with complicated structure, where some layers must be pruned following others. Our experimental results on CIFAR-10 and ImageNet justify the effectiveness of C-SGD-based filter pruning. Moreover, we provide empirical evidence for the assumption that redundancy in deep neural networks helps the convergence of training, by showing that a redundant CNN trained using C-SGD outperforms a normally trained counterpart of equivalent width.
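The centripetal update is simple to express. Below is a minimal PyTorch-style sketch, assuming a cluster assignment `clusters` (a list of filter-index lists) and a centripetal strength `epsilon`; these names, and the function `centripetal_sgd_step`, are illustrative rather than the paper's exact formulation, and momentum and weight decay are omitted for brevity. The idea is to share the averaged loss gradient within each cluster and add a term that pulls every filter toward its cluster mean, so clustered filters grow closer at every step and eventually coincide.

```python
import torch

def centripetal_sgd_step(weight, lr, epsilon, clusters):
    """One C-SGD-style update for a conv layer's filter bank (a sketch).

    weight:   tensor of shape (out_channels, in_channels, kh, kw) whose
              .grad has already been populated by backprop.
    clusters: list of lists of filter indices; filters sharing a list are
              driven toward a common point in parameter space.
    """
    with torch.no_grad():
        for cluster in clusters:
            idx = torch.as_tensor(cluster)
            # Share the averaged objective gradient within the cluster, so
            # clustered filters receive identical loss-driven updates.
            mean_grad = weight.grad[idx].mean(dim=0, keepdim=True)
            # Centripetal term: pull each filter toward the cluster mean.
            # The within-cluster differences shrink by a factor (1 - epsilon)
            # per step, so the filters converge to a single point.
            mean_w = weight[idx].mean(dim=0, keepdim=True)
            weight[idx] += -lr * mean_grad + epsilon * (mean_w - weight[idx])
```

Once the clustered filters are identical, all but one per cluster can be deleted: by the linearity of convolution, the next layer's kernel slices that consumed the removed output channels can be summed into the slice for the surviving channel, which is why no fine-tuning is required after trimming.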

Related research

07/30/2021 · Manipulating Identical Filter Redundancy for Efficient Pruning on Deep and Complicated CNN
The existence of redundancy in Convolutional Neural Networks (CNNs) enab...

05/12/2019 · Approximated Oracle Filter Pruning for Destructive CNN Width Optimization
It is not easy to design and run Convolutional Neural Networks (CNNs) du...

01/22/2017 · Optimization on Product Submanifolds of Convolution Kernels
Recent advances in optimization methods used for training convolutional ...

07/07/2020 · Lossless CNN Channel Pruning via Gradient Resetting and Convolutional Re-parameterization
Channel pruning (a.k.a. filter pruning) aims to slim down a convolutiona...

05/26/2015 · Accelerating Very Deep Convolutional Networks for Classification and Detection
This paper aims to accelerate the test-time computation of convolutional...

01/31/2018 · Recovering from Random Pruning: On the Plasticity of Deep Convolutional Neural Networks
Recently there has been a lot of work on pruning filters from deep convo...

11/13/2020 · LEAN: graph-based pruning for convolutional neural networks by extracting longest chains
Convolutional neural networks (CNNs) have proven to be highly successful...
