On the Effectiveness of Adapter-based Tuning for Pretrained Language Model Adaptation

06/06/2021
by Ruidan He, et al.

Adapter-based tuning has recently emerged as an alternative to fine-tuning. It works by adding lightweight adapter modules to a pretrained language model (PrLM) and updating only the adapter parameters when learning a downstream task, while the PrLM weights stay frozen. As such, it adds only a few trainable parameters per new task, allowing a high degree of parameter sharing. Prior studies have shown that adapter-based tuning often achieves results comparable to fine-tuning. However, existing work focuses only on the parameter-efficiency of adapter-based tuning and does not further investigate its effectiveness. In this paper, we study the latter. We first show that adapter-based tuning mitigates forgetting better than fine-tuning, since it yields representations that deviate less from those produced by the initial PrLM. We then empirically compare the two tuning methods on several downstream NLP tasks and settings, and demonstrate that 1) adapter-based tuning outperforms fine-tuning on low-resource and cross-lingual tasks, and 2) it is more robust to overfitting and less sensitive to changes in the learning rate.
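For concreteness, the bottleneck adapter design this line of work builds on (Houlsby et al., 2019) fits in a few lines of PyTorch. The sketch below is a minimal illustration under that assumption, not the authors' released code; the names `Adapter`, `bottleneck_dim`, and `mark_only_adapters_trainable` are ours, and the zero-initialized up-projection is one common way of making each adapter start out as an identity map.

```python
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project,
    wrapped in a residual connection (Houlsby-style sketch)."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        # Zero-initialize the up-projection so the adapter is an exact
        # identity map at the start of training: the model begins from
        # the unmodified PrLM representations.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(self.act(self.down(hidden_states)))


def mark_only_adapters_trainable(model: nn.Module) -> None:
    """Freeze every parameter whose name does not mark it as part of
    an adapter; only the adapters receive gradient updates."""
    for name, param in model.named_parameters():
        param.requires_grad = "adapter" in name
```

In a transformer, one such module is typically inserted after the attention and feed-forward sublayers of each block; freezing everything else is what keeps the per-task trainable parameter count small, and the near-identity start is consistent with the observation above that adapter tuning yields representations deviating less from the initial PrLM. A quick sanity check:

```python
adapter = Adapter(hidden_dim=768)    # e.g. BERT-base hidden size
x = torch.randn(2, 16, 768)          # (batch, seq_len, hidden)
assert torch.equal(adapter(x), x)    # exact identity at initialization
```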

Related research

- Know Where You're Going: Meta-Learning for Parameter-Efficient Fine-tuning (05/25/2022)
  A recent family of techniques, dubbed as lightweight fine-tuning methods...
- Unfreeze with Care: Space-Efficient Fine-Tuning of Semantic Parsing Models (03/05/2022)
  Semantic parsing is a key NLP task that maps natural language to structu...
- AutoPEFT: Automatic Configuration Search for Parameter-Efficient Fine-Tuning (01/28/2023)
  Large pretrained language models have been widely used in downstream NLP...
- Lossless Adaptation of Pretrained Vision Models For Robotic Manipulation (04/13/2023)
  Recent works have shown that large models pretrained on common visual le...
- Parameter-Efficient Sparse Retrievers and Rerankers using Adapters (03/23/2023)
  Parameter-Efficient transfer learning with Adapters have been studied in...
- How to prepare your task head for finetuning (02/11/2023)
  In deep learning, transferring information from a pretrained network to ...
- SAM-PARSER: Fine-tuning SAM Efficiently by Parameter Space Reconstruction (08/28/2023)
  Segment Anything Model (SAM) has received remarkable attention as it off...
