Structured Deep Neural Network Pruning by Varying Regularization Parameters

04/25/2018
by Huan Wang et al.

Convolutional Neural Networks (CNNs) are constrained by their massive computation and storage requirements. Parameter pruning is a promising approach for CNN compression and acceleration; it aims to eliminate redundant model parameters with a tolerable loss in performance. Despite their effectiveness, existing regularization-based pruning methods usually assign a single fixed regularization parameter to all weights, neglecting the fact that different weights may matter differently to the network. To address this, we propose a theoretically sound regularization-based pruning method that incrementally assigns different regularization parameters to different weights according to their importance to the network. On AlexNet and VGG-16, our method achieves a 4x theoretical speedup with accuracy comparable to the baselines. On ResNet-50, it achieves 2x acceleration while suffering only a 0.1% accuracy loss.
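The core idea described in the abstract can be illustrated with a minimal sketch: give each convolutional filter its own L2 regularization parameter and keep increasing the parameters of the filters that currently look least important, so those filters are gradually driven toward zero and can later be removed. The importance criterion (filter L1 norm), the additive update schedule, and all hyperparameter values below are illustrative assumptions, not the authors' exact formulation.

    import torch
    import torch.nn as nn

    def filter_importance(conv: nn.Conv2d) -> torch.Tensor:
        # Assumed importance criterion: L1 norm of each output filter.
        return conv.weight.detach().abs().sum(dim=(1, 2, 3))

    def group_l2_penalty(conv: nn.Conv2d, lambdas: torch.Tensor) -> torch.Tensor:
        # Per-filter L2 penalty, each filter weighted by its own
        # regularization parameter lambdas[i].
        per_filter_sq_norm = conv.weight.pow(2).sum(dim=(1, 2, 3))
        return (lambdas * per_filter_sq_norm).sum()

    def raise_lambdas(conv: nn.Conv2d, lambdas: torch.Tensor,
                      step: float = 1e-4, prune_ratio: float = 0.5) -> torch.Tensor:
        # Incrementally increase the regularization parameters of the filters
        # that currently look least important; lambdas persists across calls,
        # so weak filters are penalized a little more each time.
        importance = filter_importance(conv)
        n_target = int(prune_ratio * importance.numel())
        weakest = torch.argsort(importance)[:n_target]
        lambdas[weakest] += step
        return lambdas

    # Usage inside a training loop (sketch), for a chosen Conv2d layer `conv`:
    #   lambdas = torch.full((conv.out_channels,), 1e-4)
    #   loss = task_loss + group_l2_penalty(conv, lambdas)
    #   loss.backward(); optimizer.step()
    #   raise_lambdas(conv, lambdas)   # e.g. once every few epochs
    # Filters whose weights shrink below a threshold can then be pruned.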


Related research

11/20/2018  Structured Pruning for Efficient ConvNets via Incremental Regularization
11/19/2018  Three Dimensional Convolutional Neural Network Pruning with Regularization-Based Method
09/20/2017  Structured Probabilistic Pruning for Convolutional Neural Network Acceleration
09/22/2021  High-dimensional Bayesian Optimization for CNN Auto Pruning with Clustering and Rollback
04/11/2022  Regularization-based Pruning of Irrelevant Weights in Deep Neural Architectures
05/08/2020  Pruning Algorithms to Accelerate Convolutional Neural Networks for Edge Applications: A Survey
04/09/2020  Hierarchical Group Sparse Regularization for Deep Convolutional Neural Networks
