To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks

03/14/2019
by Matthew Peters, et al.

While most previous work has focused on different pretraining objectives and architectures for transfer learning, we ask how best to adapt a pretrained model to a given target task. We focus on the two most common forms of adaptation: feature extraction, where the pretrained weights are frozen, and direct fine-tuning of the pretrained model. Our empirical results across diverse NLP tasks with two state-of-the-art models show that the relative performance of fine-tuning vs. feature extraction depends on the similarity of the pretraining and target tasks. We explore possible explanations for this finding and provide a set of adaptation guidelines for the NLP practitioner.
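To make the two adaptation strategies concrete, here is a minimal sketch in PyTorch with the Hugging Face transformers library. This is illustrative, not the authors' code: the "bert-base-uncased" checkpoint, the two-class head, and the learning rates are placeholder assumptions.

# Sketch of the two adaptation strategies compared in the paper.
# Assumptions: PyTorch + Hugging Face transformers; checkpoint name,
# head size, and learning rates are illustrative only.
import torch
import torch.nn as nn
from transformers import AutoModel

encoder = AutoModel.from_pretrained("bert-base-uncased")
classifier = nn.Linear(encoder.config.hidden_size, 2)  # hypothetical 2-class task head

# Feature extraction: freeze the pretrained weights; only the head is trained.
for param in encoder.parameters():
    param.requires_grad = False
feature_extraction_optimizer = torch.optim.AdamW(classifier.parameters(), lr=1e-3)

# Fine-tuning: unfreeze the encoder and update all weights, typically with
# a smaller learning rate on the pretrained encoder than on the new head.
for param in encoder.parameters():
    param.requires_grad = True
fine_tuning_optimizer = torch.optim.AdamW([
    {"params": encoder.parameters(), "lr": 2e-5},
    {"params": classifier.parameters(), "lr": 1e-3},
])

In feature extraction only the task head receives gradient updates, so the pretrained representations stay fixed; in fine-tuning every weight moves, usually with a reduced encoder learning rate so the pretrained features are not destroyed early in training.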

Related research

10/24/2018 | Universal Language Model Fine-Tuning with Subword Tokenization for Polish
Universal Language Model for Fine-tuning [arXiv:1801.06146] (ULMFiT) is ...

04/27/2020 | Recall and Learn: Fine-tuning Deep Pretrained Language Models with Less Forgetting
Deep pretrained language models have achieved great success in the way o...

12/12/2022 | Parameter-Efficient Finetuning of Transformers for Source Code
Pretrained Transformers achieve state-of-the-art performance in various ...

04/13/2023 | Lossless Adaptation of Pretrained Vision Models For Robotic Manipulation
Recent works have shown that large models pretrained on common visual le...

02/11/2023 | Cross-Modal Fine-Tuning: Align then Refine
Fine-tuning large-scale pretrained models has led to tremendous progress...

05/02/2018 | Unsupervised Learning using Pretrained CNN and Associative Memory Bank
Deep Convolutional features extracted from a comprehensive labeled datas...

11/14/2021 | Time Waits for No One! Analysis and Challenges of Temporal Misalignment
When an NLP model is trained on text data from one time period and teste...
