Structured Prompt Tuning

05/24/2022
by Chi-Liang Liu, et al.

We propose structured prompt tuning, a simple and effective method for improving prompt tuning. Instead of prepending a sequence of tunable embeddings to the input, we generate the soft prompt embeddings through a hypernetwork. Our approach subsumes standard prompt tuning, allows more flexibility in model design, and can be applied to both single-task and multi-task training settings. Empirically, structured prompt tuning shows a gain of +1.2 to 1.5 points on the GLUE benchmark and is less sensitive to changes in learning rate, compared to standard prompt tuning.
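For a concrete picture, below is a minimal PyTorch sketch of the idea described above: rather than learning the soft prompt embeddings directly, a small hypernetwork maps a low-dimensional tunable seed vector to the prompt, which is then prepended to the input embeddings of a frozen model. The class name, MLP layout, and all dimensions here are illustrative assumptions, not the paper's exact architecture.

import torch
import torch.nn as nn

class HyperPromptGenerator(nn.Module):
    """Generates soft prompt embeddings via a hypernetwork (illustrative sketch)."""

    def __init__(self, prompt_len: int, d_model: int, d_seed: int = 64):
        super().__init__()
        self.prompt_len = prompt_len
        self.d_model = d_model
        # Low-dimensional tunable input to the hypernetwork (assumed design choice).
        self.seed = nn.Parameter(torch.randn(d_seed))
        # The hypernetwork: a small MLP that emits the whole prompt at once.
        self.hypernet = nn.Sequential(
            nn.Linear(d_seed, 256),
            nn.ReLU(),
            nn.Linear(256, prompt_len * d_model),
        )

    def forward(self, batch_size: int) -> torch.Tensor:
        # (prompt_len * d_model,) -> (prompt_len, d_model)
        prompt = self.hypernet(self.seed).view(self.prompt_len, self.d_model)
        # Broadcast the same soft prompt to every example in the batch.
        return prompt.unsqueeze(0).expand(batch_size, -1, -1)

# Usage: prepend the generated prompt to the frozen model's token embeddings.
gen = HyperPromptGenerator(prompt_len=20, d_model=768)
token_embeds = torch.randn(4, 128, 768)            # (batch, seq, d_model)
inputs = torch.cat([gen(4), token_embeds], dim=1)  # (4, 148, 768)

Standard prompt tuning corresponds to the degenerate case where the hypernetwork is the identity over a full prompt_len x d_model parameter, which is why this formulation subsumes it.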
