CUP: Cluster Pruning for Compressing Deep Neural Networks

11/19/2019
by Rahul Duggal, et al.

We propose Cluster Pruning (CUP) for compressing and accelerating deep neural networks. Our approach prunes similar filters by clustering them based on features derived from both the incoming and outgoing weight connections. With CUP, we overcome two limitations of prior work: (1) Non-uniform pruning: CUP can efficiently determine the ideal number of filters to prune in each layer of a neural network. This is in contrast to prior methods that either prune all layers uniformly or rely on resource-intensive methods such as manual sensitivity analysis or reinforcement learning to determine the ideal number. (2) Single-shot operation: we extend CUP to CUP-SS (for CUP single shot), whereby pruning is integrated into the initial training phase itself. This yields large savings in training time compared to traditional pruning pipelines. Through extensive evaluation on multiple datasets (MNIST, CIFAR-10, and Imagenet) and models (VGG-16, Resnets-18/34/56), we show that CUP outperforms the recent state of the art. Specifically, CUP-SS achieves a 2.2x FLOPs reduction for a Resnet-50 model trained on Imagenet while staying within 0.9% of the baseline top-1 accuracy. It saves over 14 hours in training time with respect to the original Resnet-50. The code to reproduce our results is available.
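The core mechanism, clustering filters on features built from each filter's incoming and outgoing weights and keeping one representative per cluster, can be sketched as follows. This is a minimal illustration only, assuming PyTorch-style 4-D convolution weight tensors and off-the-shelf agglomerative clustering from SciPy; the helper names (`filter_features`, `cluster_prune`) and the norm-based choice of cluster representative are illustrative assumptions, not the paper's exact procedure.

```python
import torch
from scipy.cluster.hierarchy import linkage, fcluster

def filter_features(w_in, w_out):
    """Per-filter features from incoming and outgoing weights.

    w_in:  conv weights producing the filters, shape [C_out, C_in, k, k]
    w_out: next layer's weights consuming them, shape [C_next, C_out, k, k]
    """
    f_in = w_in.reshape(w_in.shape[0], -1)                     # [C_out, C_in*k*k]
    f_out = w_out.transpose(0, 1).reshape(w_out.shape[1], -1)  # [C_out, C_next*k*k]
    return torch.cat([f_in, f_out], dim=1).detach().cpu().numpy()

def cluster_prune(w_in, w_out, n_clusters):
    """Agglomeratively cluster filters; keep one representative per cluster."""
    feats = filter_features(w_in, w_out)
    labels = fcluster(linkage(feats, method="ward"),
                      t=n_clusters, criterion="maxclust")
    keep = []
    for c in range(1, n_clusters + 1):
        idx = (labels == c).nonzero()[0]
        norms = (feats[idx] ** 2).sum(axis=1)  # pick the highest-energy filter
        keep.append(int(idx[norms.argmax()]))
    return sorted(keep)
```

For two adjacent Conv2d layers `conv1` and `conv2`, a call such as `cluster_prune(conv1.weight, conv2.weight, n_clusters=32)` (hypothetical usage) would return the indices of the 32 filters to retain in `conv1`; all other filters, being near-duplicates of a kept representative, are candidates for removal.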


Related research:

- 09/10/2020: OrthoReg: Robust Network Pruning Using Orthonormality Regularization
- 11/24/2021: Accelerating Deep Learning with Dynamic Data Pruning
- 10/01/2018: Layer-compensated Pruning for Resource-constrained Convolutional Neural Networks
- 12/15/2018: A Low Effort Approach to Structured CNN Design Using PCA
- 04/06/2023: NTK-SAP: Improving neural network pruning by aligning training dynamics
- 06/22/2020: Slimming Neural Networks using Adaptive Connectivity Scores
- 11/30/2019: Pruning at a Glance: Global Neural Pruning for Model Compression
