EvoPruneDeepTL: An Evolutionary Pruning Model for Transfer Learning based Deep Neural Networks

02/08/2022
by Javier Poyatos, et al.

In recent years, Deep Learning models have shown great performance in complex optimization problems. They generally require large training datasets, which is a limitation in most practical cases. Transfer learning allows importing the first layers of a pre-trained architecture and connecting them to fully-connected layers to adapt them to a new problem. Consequently, the configuration of these layers becomes crucial for the performance of the model. Unfortunately, optimizing these models is usually a computationally demanding task. One strategy to optimize Deep Learning models is pruning. Pruning methods focus on reducing the complexity of the network, accepting an expected performance penalty once the model is pruned. However, pruning could also be used to improve performance, by letting an optimization algorithm identify and remove unnecessary connections among neurons. This work proposes EvoPruneDeepTL, an evolutionary pruning model for Transfer Learning based Deep Neural Networks which replaces the last fully-connected layers with sparse layers optimized by a genetic algorithm. Depending on its solution encoding strategy, the proposed model can either perform optimized pruning or feature selection over the densely connected part of the neural network. We carry out experiments with several datasets to assess the benefits of our proposal. Results show the contribution of EvoPruneDeepTL and feature selection to the overall computational efficiency of the network as a result of the optimization process. In particular, accuracy is improved while the number of active neurons in the final layers is reduced.
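To make the general idea concrete, below is a minimal NumPy sketch of the kind of approach the abstract describes: a bit-string genetic algorithm evolves a binary mask over the inputs of a fully-connected layer (the feature-selection style encoding) and scores each mask by validation accuracy. The function names, the single-hidden-layer setup, and the GA operators (elitism, 2-way tournament selection, uniform crossover, bit-flip mutation) are illustrative assumptions, not the exact EvoPruneDeepTL implementation described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def masked_forward(x, mask, W_hid, b_hid, W_out, b_out):
    """Forward pass with a binary mask zeroing input features
    before the fully-connected (ReLU) layer."""
    h = np.maximum(0.0, (x * mask) @ W_hid + b_hid)
    return h @ W_out + b_out

def fitness(mask, x_val, y_val, params):
    """Fitness of a mask = validation accuracy of the masked network."""
    logits = masked_forward(x_val, mask, *params)
    return np.mean(np.argmax(logits, axis=1) == y_val)

def evolve_mask(x_val, y_val, params, n_features,
                pop_size=20, generations=50, p_mut=0.05):
    """Bit-string GA with elitism, tournament selection,
    uniform crossover and bit-flip mutation."""
    pop = rng.integers(0, 2, size=(pop_size, n_features))
    for _ in range(generations):
        scores = np.array([fitness(m, x_val, y_val, params) for m in pop])
        new_pop = [pop[scores.argmax()].copy()]               # keep the best mask
        while len(new_pop) < pop_size:
            t1 = rng.integers(0, pop_size, 2)                 # tournament 1
            t2 = rng.integers(0, pop_size, 2)                 # tournament 2
            p1 = pop[t1[scores[t1].argmax()]]
            p2 = pop[t2[scores[t2].argmax()]]
            child = np.where(rng.random(n_features) < 0.5, p1, p2)  # crossover
            flips = rng.random(n_features) < p_mut                  # mutation
            new_pop.append(np.where(flips, 1 - child, child))
        pop = np.array(new_pop)
    scores = np.array([fitness(m, x_val, y_val, params) for m in pop])
    return pop[scores.argmax()]
```

A connection-level pruning encoding would instead evolve a mask with the same shape as the layer's weight matrix; the surrounding optimization loop stays the same.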

Related research

02/20/2023  Multiobjective Evolutionary Pruning of Deep Neural Networks with Transfer Learning for improving their Performance and Robustness
07/22/2015  Data-free parameter pruning for Deep Neural Networks
08/12/2023  Can Unstructured Pruning Reduce the Depth in Deep Neural Networks?
11/19/2018  An Efficient Transfer Learning Technique by Using Final Fully-Connected Layer Output Features of Deep Networks
06/27/2019  On improving deep learning generalization with adaptive sparse connectivity
03/12/2018  FeTa: A DCA Pruning Algorithm with Generalization Error Guarantees
01/21/2019  Partition Pruning: Parallelization-Aware Pruning for Deep Neural Networks