Multiobjective Evolutionary Pruning of Deep Neural Networks with Transfer Learning for improving their Performance and Robustness

02/20/2023
by Javier Poyatos, et al.

Evolutionary Computation algorithms have been used to solve optimization problems related to architectural, hyper-parameter or training configurations, forging the field known today as Neural Architecture Search. These algorithms have been combined with other techniques such as the pruning of Neural Networks, which reduces the complexity of the network, and Transfer Learning, which allows the import of knowledge from another problem related to the one at hand. The use of several criteria to evaluate the quality of the evolutionary proposals is also common, with the performance and complexity of the network being the most frequently used criteria. This work proposes MO-EvoPruneDeepTL, a multi-objective evolutionary pruning algorithm. MO-EvoPruneDeepTL uses Transfer Learning to adapt the last layers of Deep Neural Networks, replacing them with sparse layers evolved by a genetic algorithm, which guides the evolution based on the performance, complexity and robustness of the network, with robustness being a strong quality indicator for the evolved models. We carry out different experiments with several datasets to assess the benefits of our proposal. Results show that our proposal achieves promising results in all the objectives, and direct relations among them are presented. The experiments also show that the most influential neurons help explain which parts of the input images are the most relevant for the predictions of the pruned neural network. Lastly, by virtue of the diversity within the Pareto front of pruning patterns produced by the proposal, it is shown that an ensemble of differently pruned models improves the overall performance and robustness of the trained networks.
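The abstract describes the approach at a high level: a frozen pre-trained backbone extracts features, a binary genotype switches connections of the replaced final layers on or off, and a genetic algorithm evolves these sparse layers against three objectives. Below is a minimal NumPy sketch of that three-objective evaluation only; the synthetic data, variable names and the noise-based robustness proxy are illustrative assumptions, not the authors' implementation (the full text gives the exact encoding and objectives).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: features come from a frozen, pre-trained backbone
# (Transfer Learning); a dense classification head follows, and a binary
# pruning mask (the genotype) keeps or removes whole input neurons.
n_features, n_classes, n_samples = 512, 10, 1000
X = rng.normal(size=(n_samples, n_features))              # stand-in for extracted features
y = rng.integers(0, n_classes, size=n_samples)            # stand-in labels
W = rng.normal(scale=0.1, size=(n_features, n_classes))   # head weights (assumed trained)

def accuracy(mask, X, y):
    """Accuracy of the head when only the masked neurons are kept."""
    logits = X @ (W * mask[:, None])
    return float(np.mean(logits.argmax(axis=1) == y))

def evaluate(mask, noise_std=0.5):
    """Three objectives per individual: performance, complexity, robustness.

    Robustness here is a simple proxy (accuracy under Gaussian input noise);
    the paper defines its own robustness measure.
    """
    perf = accuracy(mask, X, y)                            # to be maximised
    complexity = int(mask.sum())                           # active neurons, to be minimised
    X_noisy = X + rng.normal(scale=noise_std, size=X.shape)
    robustness = accuracy(mask, X_noisy, y)                # to be maximised
    return perf, complexity, robustness

# A multi-objective genetic algorithm would evolve a population of such masks
# and keep the non-dominated ones (the Pareto front) under the three objectives.
population = rng.integers(0, 2, size=(20, n_features))
fitnesses = [evaluate(m) for m in population]
```

In such a scheme, the diverse pruning masks that survive on the Pareto front can then be combined into an ensemble, for instance by averaging class probabilities, which is the idea behind the ensemble results mentioned in the abstract.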


Related research

EvoPruneDeepTL: An Evolutionary Pruning Model for Transfer Learning based Deep Neural Networks (02/08/2022)
In recent years, Deep Learning models have shown a great performance in ...

Sub-network Multi-objective Evolutionary Algorithm for Filter Pruning (10/22/2022)
Filter pruning is a common method to achieve model compression and accel...

Sampled Training and Node Inheritance for Fast Evolutionary Neural Architecture Search (03/07/2020)
The performance of a deep neural network is heavily dependent on its arc...

Experiments on Properties of Hidden Structures of Sparse Neural Networks (07/27/2021)
Sparsity in the structure of Neural Networks can lead to less energy con...

Concurrent Neural Tree and Data Preprocessing AutoML for Image Classification (05/25/2022)
Deep Neural Networks (DNN's) are a widely-used solution for a variety of...

PathNet: Evolution Channels Gradient Descent in Super Neural Networks (01/30/2017)
For artificial general intelligence (AGI) it would be efficient if multi...

Efficient and Sparse Neural Networks by Pruning Weights in a Multiobjective Learning Approach (08/31/2020)
Overparameterization and overfitting are common concerns when designing ...
