Partial Is Better Than All: Revisiting Fine-tuning Strategy for Few-shot Learning

02/08/2021
by   Zhiqiang Shen, et al.

The goal of few-shot learning is to learn a classifier that can recognize unseen classes from limited labeled support data. A common practice for this task is to train a model on the base set first and then transfer it to novel classes through fine-tuning (here the fine-tuning procedure is defined as transferring knowledge from base to novel data, i.e., learning to transfer in the few-shot scenario) or meta-learning. However, as the base classes have no overlap with the novel set, simply transferring all knowledge from the base data is not optimal, since some knowledge in the base model may be biased toward the base classes or even harmful to the novel classes. In this paper, we propose to transfer partial knowledge by freezing or fine-tuning particular layer(s) in the base model. Specifically, layers chosen to be fine-tuned are assigned different learning rates to control the extent of preserved transferability. To determine which layers to recast and what learning rates to assign them, we introduce an evolutionary-search-based method that efficiently locates the target layers and determines their individual learning rates at the same time. We conduct extensive experiments on CUB and mini-ImageNet to demonstrate the effectiveness of the proposed method. It achieves state-of-the-art performance on both meta-learning and non-meta based frameworks. Furthermore, we extend our method to the conventional pre-training + fine-tuning paradigm and obtain consistent improvement.
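The evolutionary search the abstract describes can be illustrated with a minimal sketch: each candidate is a vector of per-layer learning rates (0 meaning the layer stays frozen), and a population evolves by selection, crossover, and mutation toward configurations with higher validation fitness. The layer count, learning-rate choices, and the fitness function below are hypothetical stand-ins; in the paper, fitness would come from fine-tuning on novel-class support data and measuring validation accuracy.

```python
import random

# Candidate per-layer learning rates; 0.0 means the layer is frozen.
LR_CHOICES = [0.0, 1e-4, 1e-3, 1e-2]
NUM_LAYERS = 6  # hypothetical depth of the base model

def random_config():
    """A candidate: one learning rate per layer."""
    return [random.choice(LR_CHOICES) for _ in range(NUM_LAYERS)]

def mutate(config, p=0.2):
    """Resample each layer's learning rate with probability p."""
    return [random.choice(LR_CHOICES) if random.random() < p else lr
            for lr in config]

def crossover(a, b):
    """Single-point crossover between two parent configurations."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def evolutionary_search(fitness, pop_size=20, generations=10, elite=5):
    """Evolve per-layer learning-rate configs, keeping the top `elite`
    each generation and filling the rest with mutated offspring."""
    population = [random_config() for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(population, key=fitness, reverse=True)[:elite]
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - elite)]
        population = parents + children
    return max(population, key=fitness)

# Mock fitness standing in for "fine-tune with these per-layer rates and
# report validation accuracy": it rewards fine-tuning later layers and
# freezing earlier ones, a pattern the paper's search could discover.
def mock_fitness(config):
    depth = len(config)
    return sum((2 * i / (depth - 1) - 1) * lr
               for i, lr in enumerate(config))

best = evolutionary_search(mock_fitness)
```

The search treats layer selection and learning-rate assignment jointly, since a frozen layer is just the special case of a zero learning rate.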


12/16/2019

A New Benchmark for Evaluation of Cross-Domain Few-Shot Learning

Recent progress on few-shot learning has largely relied on annotated da...
01/28/2021

COMPAS: Representation Learning with Compositional Part Sharing for Few-Shot Classification

Few-shot image classification consists of two consecutive learning proce...
05/30/2021

Knowledge Transfer for Few-shot Segmentation of Novel White Matter Tracts

Convolutional neural networks (CNNs) have achieved state-of-the-art perfo...
03/04/2022

Voice-Face Homogeneity Tells Deepfake

Detecting forgery videos is highly desirable due to the abuse of deepfak...
04/15/2022

Pushing the Limits of Simple Pipelines for Few-Shot Learning: External Data and Fine-Tuning Make a Difference

Few-shot learning (FSL) is an important and topical problem in computer ...
09/12/2021

Knowledge Enhanced Fine-Tuning for Better Handling Unseen Entities in Dialogue Generation

Although pre-training models have achieved great success in dialogue gen...
08/11/2021

Prototype Completion for Few-Shot Learning

Few-shot learning aims to recognize novel classes with few examples. Pre...