RIFLE: Backpropagation in Depth for Deep Transfer Learning through Re-Initializing the Fully-connected LayEr

07/07/2020
by Xingjian Li, et al.

Fine-tuning a deep convolutional neural network (CNN) from a pre-trained model helps transfer knowledge learned on larger datasets to the target task. While accuracy can be largely improved even when the training dataset is small, the transfer learning outcome is usually constrained by the pre-trained model, whose CNN weights remain close to their initial values (Liu et al., 2019), as backpropagation brings only small updates to the deeper CNN layers. In this work, we propose RIFLE, a simple yet effective strategy that deepens backpropagation in transfer learning settings by periodically Re-Initializing the Fully-connected LayEr from random scratch during the fine-tuning procedure. RIFLE brings meaningful updates to the weights of deep CNN layers and improves low-level feature learning, while the effects of the randomization are easily absorbed over the course of the overall learning procedure. Experiments show that RIFLE significantly improves deep transfer learning accuracy on a wide range of datasets, outperforming known tricks for a similar purpose, such as Dropout, DropConnect, Stochastic Depth, Disturb Label and Cyclic Learning Rate, under the same settings with 0.5%-2% higher testing accuracy. Empirical cases and ablation studies further indicate that RIFLE brings meaningful updates to deep CNN layers, with accuracy improved.
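The idea described above can be illustrated with a short fine-tuning loop. What follows is a minimal sketch, not the authors' implementation, assuming a PyTorch setup with a ResNet-50 backbone; the cycle count, learning rate, and re-initialization schedule are illustrative placeholders rather than values taken from the paper.

    # Hypothetical sketch of RIFLE-style fine-tuning (illustrative only).
    # Assumptions: ResNet-50 backbone, 3 re-initialization cycles, plain SGD;
    # all hyper-parameters below are placeholders, not the paper's settings.
    import torch
    import torch.nn as nn
    from torchvision import models

    def build_model(num_classes):
        model = models.resnet50(pretrained=True)                  # pre-trained CNN backbone
        model.fc = nn.Linear(model.fc.in_features, num_classes)   # fresh task head
        return model

    def rifle_finetune(model, train_loader, total_epochs=30, num_cycles=3,
                       lr=0.01, device="cpu"):
        model.to(device)
        criterion = nn.CrossEntropyLoss()
        optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
        epochs_per_cycle = total_epochs // num_cycles

        for epoch in range(total_epochs):
            # RIFLE: at the start of each new cycle, discard the fully-connected
            # layer and re-initialize it from random scratch.
            if epoch > 0 and epoch % epochs_per_cycle == 0:
                model.fc.reset_parameters()

            model.train()
            for inputs, targets in train_loader:
                inputs, targets = inputs.to(device), targets.to(device)
                optimizer.zero_grad()
                loss = criterion(model(inputs), targets)
                loss.backward()   # the freshly randomized head yields larger errors,
                optimizer.step()  # pushing stronger gradient updates into deep CNN layers
        return model

In this sketch, the randomly re-initialized head produces large errors at the start of each cycle, which drives larger gradients back into the convolutional layers, the "deepened backpropagation" effect the abstract describes; by the end of training the head has re-converged, so the randomization leaves no residual cost.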


