Can Unstructured Pruning Reduce the Depth in Deep Neural Networks?

08/12/2023
by   Zhu Liao, et al.

Pruning is a widely used technique for reducing the size of deep neural networks while maintaining their performance. However, despite massively compressing deep models, such techniques are hardly able to remove entire layers from a model (even when structured): is this an addressable task? In this study, we introduce EGP, an innovative Entropy-Guided Pruning algorithm aimed at reducing the size of deep neural networks while preserving their performance. The key focus of EGP is to prioritize pruning connections in layers with low entropy, ultimately leading to their complete removal. Through extensive experiments conducted on popular models such as ResNet-18 and Swin-T, our findings demonstrate that EGP effectively compresses deep neural networks while maintaining competitive performance levels. Our results not only shed light on the underlying mechanism behind the advantages of unstructured pruning, but also pave the way for further investigations into the intricate relationship between entropy, pruning techniques, and deep learning performance. The EGP algorithm and its insights hold great promise for advancing the field of network compression and optimization. The source code for EGP is released as open source.
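The core idea described above — prune harder in layers whose entropy is low, so that such layers can eventually be emptied and removed entirely — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the entropy estimate (mean Shannon entropy of each unit's post-ReLU on/off state), the sparsity-allocation rule, and all function names (`layer_entropy`, `entropy_guided_sparsities`, `magnitude_prune`) are hypothetical.

```python
import numpy as np

def layer_entropy(activations):
    """Mean Shannon entropy (bits) of each unit's on/off post-ReLU state,
    estimated over a batch (shape: batch x units). Near-zero entropy means
    units are almost always on or almost always off -- a candidate layer
    for aggressive pruning."""
    p_on = np.clip((activations > 0).mean(axis=0), 1e-12, 1.0 - 1e-12)
    h = -(p_on * np.log2(p_on) + (1.0 - p_on) * np.log2(1.0 - p_on))
    return float(h.mean())

def entropy_guided_sparsities(entropies, global_sparsity=0.5):
    """Allocate per-layer sparsity inversely to entropy: low-entropy layers
    receive a larger share of the pruning budget (and may be emptied
    entirely), while the average sparsity stays near global_sparsity."""
    h = np.asarray(entropies, dtype=float)
    inv = 1.0 - h / (h.max() + 1e-12)   # low entropy -> large share
    inv += 1e-3                          # keep every layer minimally prunable
    s = global_sparsity * inv * len(inv) / inv.sum()
    return np.clip(s, 0.0, 1.0)

def magnitude_prune(w, sparsity):
    """Unstructured pruning: zero out the smallest-magnitude fraction
    `sparsity` of the weights in w."""
    if sparsity <= 0.0:
        return w.copy()
    thresh = np.quantile(np.abs(w), sparsity)
    return np.where(np.abs(w) >= thresh, w, 0.0)
```

In this sketch, a layer whose units are almost always active (or inactive) scores near-zero entropy and is therefore assigned a per-layer sparsity close to 1, driving it toward complete removal, while high-entropy layers are largely spared.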


