Recursive Least Squares for Training and Pruning Convolutional Neural Networks

01/13/2022
by Tianzong Yu, et al.

Convolutional neural networks (CNNs) have succeeded in many practical applications. However, their high computation and storage requirements often make them difficult to deploy on resource-constrained devices. To tackle this issue, many pruning algorithms have been proposed for CNNs, but most of them cannot prune CNNs to a reasonable level. In this paper, we propose a novel algorithm for training and pruning CNNs based on recursive least squares (RLS) optimization. After training a CNN for some epochs, our algorithm combines the inverse input autocorrelation matrices and the weight matrices to evaluate and prune unimportant input channels or nodes layer by layer. Then, our algorithm continues to train the pruned network and does not perform the next pruning until the pruned network recovers the full performance of the unpruned network. Besides CNNs, the proposed algorithm can also be used for feedforward neural networks (FNNs). Experiments on the MNIST, CIFAR-10 and SVHN datasets show that our algorithm achieves more reasonable pruning and higher learning efficiency than four other popular pruning algorithms.
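The abstract does not spell out the exact pruning criterion, so the sketch below only illustrates the general idea for a single linear layer: maintain an inverse input autocorrelation matrix with standard RLS updates, then score input channels by combining that matrix with the layer's weight matrix. The function names (rls_update, channel_importance, prune_layer), the forgetting factor lam, and the specific scoring rule (squared outgoing weights divided by the corresponding diagonal entry of the inverse autocorrelation matrix) are illustrative assumptions, not the paper's method.

```python
import numpy as np

def rls_update(P, w, x, d, lam=0.99):
    """One standard recursive-least-squares step for a linear layer.

    P  : inverse input autocorrelation matrix, shape (n, n)
    w  : weight matrix, shape (n, m)
    x  : input column vector, shape (n, 1)
    d  : desired output column vector, shape (m, 1)
    lam: forgetting factor
    """
    Px = P @ x
    k = Px / (lam + x.T @ Px)        # gain vector, shape (n, 1)
    e = d - w.T @ x                  # a-priori output error, shape (m, 1)
    w = w + k @ e.T                  # weight update
    P = (P - k @ Px.T) / lam         # rank-1 update of the inverse autocorrelation
    return P, w

def channel_importance(P, w):
    """Hypothetical importance score per input channel/node: squared outgoing
    weights scaled by 1 / P[i, i] (a large diagonal entry of P suggests the
    channel carried little input energy)."""
    return np.sum(w ** 2, axis=1) / np.diag(P)

def prune_layer(P, w, prune_ratio=0.2):
    """Drop the lowest-scoring fraction of input channels of one layer."""
    scores = channel_importance(P, w)
    keep = np.argsort(scores)[int(prune_ratio * len(scores)):]
    keep.sort()
    return P[np.ix_(keep, keep)], w[keep, :], keep

# Toy usage: a 4-input, 2-output linear layer trained with RLS, then pruned.
n, m = 4, 2
P = np.eye(n) * 1e2                  # common RLS initialization: large multiple of identity
w = np.zeros((n, m))
rng = np.random.default_rng(0)
for _ in range(200):
    x = rng.normal(size=(n, 1))
    d = rng.normal(size=(m, 1))
    P, w = rls_update(P, w, x, d)
P, w, kept = prune_layer(P, w, prune_ratio=0.25)
```

In the paper's setting this scoring and pruning would be applied layer by layer, alternating with further RLS training until the pruned network recovers the original performance; the scoring rule above is only one plausible way to combine P and w.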

Related research:

Layer Pruning via Fusible Residual Convolutional Block for Deep Neural Networks (11/29/2020)
RANP: Resource Aware Neuron Pruning at Initialization for 3D CNNs (10/06/2020)
Layer-compensated Pruning for Resource-constrained Convolutional Neural Networks (10/01/2018)
A Taxonomy of Channel Pruning Signals in CNNs (06/11/2019)
Enabling On-Device CNN Training by Self-Supervised Instance Filtering and Error Map Pruning (07/07/2020)
LeGR: Filter Pruning via Learned Global Ranking (04/28/2019)
A novel channel pruning method for deep neural network compression (05/29/2018)
