Weight Reparametrization for Budget-Aware Network Pruning

07/08/2021 · by Robin Dupont, et al.

Pruning seeks to design lightweight architectures by removing redundant weights from overparameterized networks. Most existing techniques first remove structured sub-networks (filters, channels, etc.) and then fine-tune the resulting networks to recover accuracy. However, removing a whole structure imposes a strong topological prior, and recovering accuracy through fine-tuning is highly cumbersome. In this paper, we introduce an "end-to-end" lightweight network design that performs training and pruning simultaneously, without fine-tuning. Our method relies on a reparametrization that learns not only the weights but also the topological structure of the lightweight sub-network. This reparametrization acts as a prior (or regularizer) that defines pruning masks implicitly from the weights of the underlying network, without increasing the number of training parameters. Sparsity is induced by a budget loss that yields accurate pruning. Extensive experiments on the CIFAR10 and TinyImageNet datasets, using standard architectures (namely Conv4, VGG19 and ResNet18), show compelling results without fine-tuning.
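To make the mechanism concrete, below is a minimal PyTorch-style sketch of the general idea: a convolution whose soft pruning mask is computed from its own weights, combined with a budget term that drives the kept fraction of weights toward a target rate. The sigmoid gate, the `beta`/`tau` hyperparameters, the 0.5 removal threshold and the squared-error budget loss are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReparamConv2d(nn.Module):
    """Convolution whose pruning mask is derived from its own weights.

    Hypothetical sketch: the soft mask sigmoid(beta * (|w| - tau)) is an
    assumed smooth gate; it adds no trainable parameters beyond the weights.
    """
    def __init__(self, in_ch, out_ch, k, beta=50.0, tau=1e-2):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.1)
        self.beta, self.tau = beta, tau  # gate sharpness and magnitude threshold
        self.pad = k // 2                # keep spatial size for odd kernels

    def soft_mask(self):
        # Smooth, weight-dependent mask in [0, 1]; the mask is implicit in the
        # weights, so learning them also learns which connections survive.
        return torch.sigmoid(self.beta * (self.weight.abs() - self.tau))

    def forward(self, x):
        return F.conv2d(x, self.weight * self.soft_mask(), padding=self.pad)

def budget_loss(layers, target_rate):
    # Penalize deviation of the expected kept fraction from the budget.
    kept = torch.cat([l.soft_mask().flatten() for l in layers]).mean()
    return (kept - target_rate) ** 2

# Toy usage: combine a task loss with the budget term in one backward pass.
layer = ReparamConv2d(3, 16, 3)
x = torch.randn(8, 3, 32, 32)
task_loss = layer(x).mean()                      # stand-in for a real objective
loss = task_loss + 10.0 * budget_loss([layer], target_rate=0.10)
loss.backward()
```

Under this sketch, training the weights simultaneously shapes the masks; at convergence, entries whose gate falls below 0.5 can be removed outright, which is how the sketch dispenses with a separate fine-tuning stage.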

Related research

03/05/2020 · Comparing Rewinding and Fine-tuning in Neural Network Pruning
Many neural network pruning algorithms proceed in three steps: train the...

02/25/2022 · Extracting Effective Subnetworks with Gumbel-Softmax
Large and performant neural networks are often overparameterized and can...

05/28/2023 · Pruning Meets Low-Rank Parameter-Efficient Fine-Tuning
Large pre-trained models (LPMs), such as LLaMA and ViT-G, have shown exc...

06/02/2020 · Shapley Value as Principled Metric for Structured Network Pruning
Structured pruning is a well-known technique to reduce the storage size ...

10/22/2021 · Probabilistic fine-tuning of pruning masks and PAC-Bayes self-bounded learning
We study an approach to learning pruning masks by optimizing the expecte...

02/17/2023 · A New Baseline for GreenAI: Finding the Optimal Sub-Network via Layer and Channel Pruning
The concept of Green AI has been gaining attention within the deep learn...

07/05/2021 · One-Cycle Pruning: Pruning ConvNets Under a Tight Training Budget
Introducing sparsity in a neural network has been an efficient way to re...
