An Experimental Study of the Impact of Pre-training on the Pruning of a Convolutional Neural Network

12/15/2021
by Nathan Hubens, et al.

In recent years, deep neural networks have achieved wide success across various application domains. However, they require substantial computational and memory resources, which severely hinders their deployment, notably on mobile devices and in real-time applications. Neural networks usually involve a large number of parameters, corresponding to the weights of the network. These parameters, obtained through a training process, determine the performance of the network, but they are also highly redundant. Pruning methods attempt to reduce the size of the parameter set by identifying and removing irrelevant weights. In this paper, we examine the impact of the training strategy on pruning efficiency. Two training modalities are considered and compared: (1) fine-tuned and (2) trained from scratch. The experimental results obtained on four datasets (CIFAR10, CIFAR100, SVHN and Caltech101) and for two different CNNs (VGG16 and MobileNet) demonstrate that a network that has been pre-trained on a large corpus (e.g. ImageNet) and then fine-tuned on a particular dataset can be pruned much more efficiently (up to 80% of parameters removed) than the same network trained from scratch.
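To make the pruning setting concrete, here is a minimal sketch of the "pre-trained then fine-tuned" modality in PyTorch: one-shot L1 magnitude pruning applied to an ImageNet pre-trained VGG16. The use of torch.nn.utils.prune, the 80% sparsity level and the CIFAR10-sized classifier head are illustrative assumptions, not the authors' exact pipeline.

    # Sketch: prune a pre-trained VGG16, then fine-tune on a target dataset.
    # Assumptions: L1 unstructured pruning at 80% sparsity, CIFAR10 (10 classes).
    import torch
    import torch.nn as nn
    import torch.nn.utils.prune as prune
    import torchvision

    # Start from ImageNet pre-trained weights (the paper's fine-tuned modality).
    model = torchvision.models.vgg16(weights="IMAGENET1K_V1")

    # Replace the classifier head to match the target dataset.
    model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, 10)

    # Remove the 80% of weights with smallest L1 magnitude in each conv layer.
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            prune.l1_unstructured(module, name="weight", amount=0.8)

    # ... fine-tune on the target dataset here; pruned weights stay masked at zero ...

    # Fold the pruning masks into the weights to make the sparsity permanent.
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            prune.remove(module, "weight")

    # Report the resulting global sparsity.
    total = sum(p.numel() for p in model.parameters())
    zeros = sum((p == 0).sum().item() for p in model.parameters())
    print(f"Global sparsity: {zeros / total:.1%}")

The from-scratch modality studied in the paper would differ only in the starting point: random initialization and full training on the target dataset before the same pruning step.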


