Gradual Tuning: a better way of Fine Tuning the parameters of a Deep Neural Network

11/28/2017
by Guglielmo Montone, et al.

In this paper we present an alternative strategy for fine-tuning the parameters of a network. We call the technique Gradual Tuning. Once trained on a first task, the network is fine-tuned on a second task by modifying a progressively larger set of the network's parameters. We test Gradual Tuning on different transfer learning tasks, using networks of different sizes trained with different regularization techniques. The results show that, compared to the usual fine-tuning, our approach significantly reduces catastrophic forgetting of the initial task while retaining comparable, if not better, performance on the new task.

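The abstract does not spell out the unfreezing schedule. As a rough illustration, here is a minimal PyTorch sketch of the progressive-unfreezing idea: fine-tuning on the second task starts with only the topmost layer trainable and unfreezes one additional layer at each stage. All names here (gradual_tune, epochs_per_stage, the classification loss) are assumptions made for illustration, not the paper's code.

```python
# Minimal sketch of progressive unfreezing, assuming a PyTorch
# nn.Sequential model and a classification task; the schedule and
# hyperparameters are illustrative, not the paper's.
import torch
import torch.nn as nn

def gradual_tune(model: nn.Sequential, loader, epochs_per_stage: int = 2):
    """Fine-tune on a new task with a progressively larger parameter set."""
    layers = list(model.children())
    for p in model.parameters():          # freeze everything first
        p.requires_grad = False
    criterion = nn.CrossEntropyLoss()
    for stage in range(1, len(layers) + 1):
        for layer in layers[-stage:]:     # unfreeze one more layer, top-down
            for p in layer.parameters():
                p.requires_grad = True
        params = [p for p in model.parameters() if p.requires_grad]
        if not params:                    # e.g. the newest layer is an activation
            continue
        optimizer = torch.optim.SGD(params, lr=1e-3)
        for _ in range(epochs_per_stage): # train at this stage before growing
            for x, y in loader:
                optimizer.zero_grad()
                loss = criterion(model(x), y)
                loss.backward()
                optimizer.step()
```

Rebuilding the optimizer at each stage keeps still-frozen weights out of the update entirely; early layers, which carry most of the first task's knowledge, are only touched in the final stages, which is one plausible reading of how the approach limits catastrophic forgetting.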
Related research

AdaFilter: Adaptive Filter Fine-tuning for Deep Transfer Learning (11/21/2019)
There is an increasing number of pre-trained deep neural network models....

Avoiding catastrophic forgetting in mitigating model biases in sentence-pair classification with elastic weight consolidation (04/29/2020)
The biases present in training datasets have been shown to be affecting ...

Side-Tuning: Network Adaptation via Additive Side Networks (12/31/2019)
When training a neural network for a desired task, one may prefer to ada...

Pretraining a Neural Network before Knowing Its Architecture (07/20/2022)
Training large neural networks is possible by training a smaller hyperne...

Learning without Forgetting (06/29/2016)
When building a unified vision system or gradually adding new capabiliti...

Efficient Neural Task Adaptation by Maximum Entropy Initialization (05/25/2019)
Transferring knowledge from one neural network to another has been shown...

Incorrect by Construction: Fine Tuning Neural Networks for Guaranteed Performance on Finite Sets of Examples (08/03/2020)
There is great interest in using formal methods to guarantee the reliabi...
