Data-Efficient Structured Pruning via Submodular Optimization

03/09/2022
by   Marwa El Halabi, et al.

Structured pruning, which removes redundant regular regions of weights, is an effective approach for compressing large pre-trained neural networks without significantly affecting their performance. However, current structured pruning methods are largely empirical, provide no theoretical guarantees, and often require fine-tuning, which makes them inapplicable in the limited-data regime. We propose a principled, data-efficient structured pruning method based on submodular optimization. In particular, for a given layer, we select which neurons/channels to prune, together with corresponding new weights for the next layer, so as to minimize the change in the next layer's input induced by pruning. We show that this selection problem is a weakly submodular maximization problem, so it can be provably approximated using an efficient greedy algorithm. Our method is one of the few in the literature that uses only a limited number of training samples and no labels. Our experimental results demonstrate that our method outperforms popular baseline methods in various one-shot pruning settings.
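
To make the selection step concrete, the following is a minimal sketch (not the authors' implementation) of greedy neuron selection for a single layer: given activations A of the layer being pruned, computed on a small unlabeled batch, and the next layer's weight matrix W, it greedily keeps the k neurons whose least-squares-reweighted contribution best reconstructs the next layer's original input A @ W. The function name, variable names, and the brute-force gain evaluation are illustrative assumptions, not the paper's exact algorithm.

import numpy as np

def greedy_prune_layer(A, W, k):
    """Greedily pick k neurons to keep and refit the next layer's weights.

    A: (n_samples, n_neurons) activations of the layer being pruned.
    W: (n_neurons, n_out) weights of the next layer.
    Returns the indices of kept neurons and refitted weights W_new such that
    A[:, kept] @ W_new approximates the original next-layer input A @ W.
    """
    target = A @ W                       # next layer's original input
    selected, W_new = [], None
    for _ in range(k):
        best_j, best_err = None, np.inf
        for j in range(A.shape[1]):
            if j in selected:
                continue
            cols = selected + [j]
            # Refit the next layer's weights on the candidate set (least squares).
            W_fit, *_ = np.linalg.lstsq(A[:, cols], target, rcond=None)
            err = np.linalg.norm(target - A[:, cols] @ W_fit)
            if err < best_err:
                best_j, best_err, W_new = j, err, W_fit
        selected.append(best_j)
    return selected, W_new

# Tiny usage example with random data (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 16))        # activations on a small unlabeled batch
W = rng.standard_normal((16, 8))         # next layer's weights
kept, W_new = greedy_prune_layer(A, W, k=4)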


Related research

10/12/2022 · Pruning Pre-trained Language Models Without Fine-Tuning
To overcome the overparameterized problem in Pre-trained Language Models...

06/02/2020 · Shapley Value as Principled Metric for Structured Network Pruning
Structured pruning is a well-known technique to reduce the storage size...

10/01/2021 · Learning Compact Representations of Neural Networks using DiscriminAtive Masking (DAM)
A central goal in deep learning is to learn compact representations of f...

03/03/2020 · Good Subnetworks Provably Exist: Pruning via Greedy Forward Selection
Recent empirical works show that large deep neural networks are often hi...

10/11/2019 · SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks
We introduce a pruning algorithm that provably sparsifies the parameters...

04/13/2022 · Receding Neuron Importances for Structured Pruning
Structured pruning efficiently compresses networks by identifying and re...

08/19/2020 · Data-Independent Structured Pruning of Neural Networks via Coresets
Model compression is crucial for deployment of neural networks on device...