Ahead-of-Time P-Tuning

05/18/2023
by Daniil Gavrilov, et al.

In this paper, we propose Ahead-of-Time (AoT) P-Tuning, a novel parameter-efficient fine-tuning method for pre-trained Language Models (LMs) that adds an input-dependent bias before each Transformer layer. We evaluate AoT P-Tuning on the GLUE and SuperGLUE benchmark datasets using RoBERTa and DeBERTa models, showing that it outperforms BitFit and performs comparably to or better than other baseline methods for efficient fine-tuning. We also measure the inference overhead of AoT P-Tuning and show that it is negligible compared to established baselines. Our method enables multi-task inference with a single backbone LM, making it a practical solution for real-world applications.
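The core idea, an input-dependent bias added to the hidden states before a Transformer layer, can be sketched as a lookup table indexed by input token ids. The sketch below is a minimal, hypothetical illustration (the class name, shapes, and single-layer scope are assumptions, not the authors' implementation); the paper's full method trains one such table per layer while keeping the backbone frozen, and the biases can be gathered ahead of time, which is why inference overhead stays negligible.

```python
import numpy as np

class AoTPTuningBias:
    """Hypothetical sketch of an input-dependent bias for one layer.

    A learned table maps each vocabulary token id to a bias vector;
    the bias is added to the hidden states entering the layer. Only
    this table would be trained; the backbone LM stays frozen.
    """

    def __init__(self, vocab_size: int, hidden_dim: int, rng: np.random.Generator):
        # Small random init, standing in for learned parameters.
        self.table = rng.normal(scale=0.02, size=(vocab_size, hidden_dim))

    def __call__(self, token_ids: np.ndarray, hidden_states: np.ndarray) -> np.ndarray:
        # token_ids: (seq_len,), hidden_states: (seq_len, hidden_dim).
        # The bias depends only on the input token at each position,
        # so it can be looked up (precomputed) ahead of time.
        return hidden_states + self.table[token_ids]
```

Because the bias is a pure table lookup on the input tokens, it adds no extra attention or projection computation at inference time, and swapping tables per task allows multi-task serving from one backbone.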


Related research:

- BitFit: Simple Parameter-efficient Fine-tuning for Transformer-based Masked Language-models (06/18/2021). "We show that with small-to-medium training data, fine-tuning only the bi..."
- Visual Prompt Tuning (03/23/2022). "The current modus operandi in adapting pre-trained models involves updat..."
- Parameter-Efficient Tuning on Layer Normalization for Pre-trained Language Models (11/16/2022). "Conventional fine-tuning encounters increasing difficulties given the si..."
- Towards Adaptive Prefix Tuning for Parameter-Efficient Language Model Fine-tuning (05/24/2023). "Fine-tuning large pre-trained language models on various downstream task..."
- Single-Layer Vision Transformers for More Accurate Early Exits with Less Overhead (05/19/2021). "Deploying deep learning models in time-critical applications with limite..."
- P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks (10/14/2021). "Prompt tuning, which only tunes continuous prompts with a frozen languag..."
- Improving Pre-trained Language Models' Generalization (07/19/2023). "The reusability of state-of-the-art Pre-trained Language Models (PLMs) i..."
