RIFLE: Backpropagation in Depth for Deep Transfer Learning through Re-Initializing the Fully-connected LayEr
Fine-tuning a deep convolutional neural network (CNN) using a pre-trained model helps transfer knowledge learned from larger datasets to the target task. While accuracy can be largely improved even when the training dataset is small, the transfer learning outcome is usually constrained by the pre-trained model with close CNN weights (Liu et al., 2019), as the backpropagation here brings smaller updates to deeper CNN layers. In this work, we propose RIFLE - a simple yet effective strategy that deepens backpropagation in transfer learning settings, by periodically Re-Initializing the Fully-connected LayEr with random weights from scratch during the fine-tuning procedure. RIFLE brings meaningful updates to the weights of deep CNN layers and improves low-level feature learning, while the effects of the randomization can easily be absorbed over the course of the overall learning procedure. The experiments show that the use of RIFLE significantly improves deep transfer learning accuracy on a wide range of datasets, outperforming known tricks for similar purposes, such as Dropout, DropConnect, Stochastic Depth, Disturb Label and Cyclic Learning Rate, under the same settings with 0.5%-2% higher testing accuracy. Ablation studies further indicate that RIFLE brings meaningful updates to deep CNN layers, with accuracy improved.
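The core mechanism (periodically discarding and randomly re-initializing the final fully-connected layer while fine-tuning continues on the rest of the network) can be sketched in plain Python. This is a minimal, hypothetical skeleton: the function names, layer sizes, and re-initialization period are illustrative and not taken from the paper's implementation.

```python
import random

def init_fc(n_in, n_out, rng):
    # Draw fresh random weights "from scratch" for the final FC layer.
    return [[rng.gauss(0.0, 0.01) for _ in range(n_out)] for _ in range(n_in)]

def fine_tune_with_rifle(num_epochs, reinit_period, n_in=4, n_out=2, seed=0):
    """Skeleton fine-tuning loop: every `reinit_period` epochs the FC head
    is thrown away and re-initialized, while backbone weights persist.
    Returns the epochs at which re-initialization happened."""
    rng = random.Random(seed)
    fc = init_fc(n_in, n_out, rng)
    reinit_epochs = []
    for epoch in range(num_epochs):
        if epoch > 0 and epoch % reinit_period == 0:
            # RIFLE step: re-initialize only the FC layer; the backbone
            # keeps its fine-tuned weights, so subsequent backpropagation
            # must push larger corrective updates into deeper layers.
            fc = init_fc(n_in, n_out, rng)
            reinit_epochs.append(epoch)
        # ... one epoch of gradient descent over backbone + fc would run here ...
    return reinit_epochs

# Example: 30 epochs of fine-tuning with a re-init every 10 epochs.
print(fine_tune_with_rifle(num_epochs=30, reinit_period=10))  # [10, 20]
```

The key design point is that only the classifier head is reset; because the backbone retains its weights, the loss spike caused by the fresh random head forces gradients of meaningful magnitude back into the deeper convolutional layers.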