RANP: Resource Aware Neuron Pruning at Initialization for 3D CNNs

10/06/2020
by   Zhiwei Xu, et al.

Although 3D Convolutional Neural Networks (CNNs) are essential for most learning-based applications involving dense 3D data, their applicability is limited by excessive memory and computational requirements. Compressing such networks by pruning therefore becomes highly desirable. However, pruning 3D CNNs is largely unexplored, possibly because of the complex nature of typical pruning algorithms that embed pruning into an iterative optimization paradigm. In this work, we introduce a Resource Aware Neuron Pruning (RANP) algorithm that prunes 3D CNNs at initialization to high sparsity levels. Specifically, the core idea is to obtain an importance score for each neuron based on its sensitivity to the loss function. This neuron importance is then reweighted according to the neuron's resource consumption in terms of FLOPs or memory. We demonstrate the effectiveness of our pruning method on 3D semantic segmentation with widely used 3D-UNets on ShapeNet and BraTS'18, as well as on video classification with MobileNetV2 and I3D on the UCF101 dataset. In these experiments, our RANP leads to roughly 50-95% reduction in FLOPs and 35-80% reduction in memory with negligible loss in accuracy compared to the unpruned networks. This significantly reduces the computational resources required to train 3D CNNs. The pruned network obtained by our algorithm can also be easily scaled up and transferred to another dataset for training.
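The two-step idea in the abstract — score each neuron by its loss sensitivity, then reweight that score by the neuron's FLOP (or memory) cost before pruning — can be sketched in a few lines. This is a minimal illustration, not the paper's exact formulation: the function name `ranp_prune`, the inputs, and the particular reweighting (dividing saliency by normalized cost) are all assumptions made for the example.

```python
import numpy as np

def ranp_prune(saliency, resource_cost, sparsity):
    """Sketch of resource-aware neuron pruning at initialization.

    saliency:      per-neuron sensitivity of the loss (e.g. |dL/dm| for a
                   neuron mask m), computed at initialization
    resource_cost: FLOPs (or memory) attributed to each neuron
    sparsity:      fraction of neurons to remove
    Returns a boolean keep-mask over neurons.
    """
    saliency = np.asarray(saliency, dtype=float)
    cost = np.asarray(resource_cost, dtype=float)
    # Reweight importance so that resource-hungry neurons need
    # proportionally higher saliency to survive. This is one plausible
    # reweighting choice, assumed for illustration.
    score = saliency / (cost / cost.mean())
    # Keep the top-(1 - sparsity) fraction of neurons by reweighted score.
    k = int(round(len(score) * (1.0 - sparsity)))
    keep = np.zeros(len(score), dtype=bool)
    keep[np.argsort(score)[-k:]] = True
    return keep

# Toy example: 6 neurons, prune half. Cheap neurons with moderate
# saliency can outrank expensive ones with higher raw saliency.
sal = [0.9, 0.1, 0.8, 0.4, 0.05, 0.7]
cost = [4.0, 1.0, 4.0, 1.0, 1.0, 2.0]
mask = ranp_prune(sal, cost, sparsity=0.5)
print(mask)  # neuron 3 (cheap, saliency 0.4) survives; neuron 2 (costly, 0.8) does not
```

Note how the reweighting changes the ranking: by raw saliency the survivors would be neurons 0, 2, and 5, but after dividing by cost the cheap neuron 3 displaces the expensive neuron 2.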

Related research:
- 01/13/2022 — Recursive Least Squares for Training and Pruning Convolutional Neural Networks
- 08/07/2022 — N2NSkip: Learning Highly Sparse Networks using Neuron-to-Neuron Skip Connections
- 11/16/2017 — NISP: Pruning Networks using Neuron Importance Score Propagation
- 06/11/2019 — A Taxonomy of Channel Pruning Signals in CNNs
- 07/25/2018 — Crossbar-aware neural network pruning
- 09/08/2020 — CNNPruner: Pruning Convolutional Neural Networks with Visual Analytics
- 03/17/2021 — CAP: Context-Aware Pruning for Semantic Segmentation
