Learning to Transfer Prompts for Text Generation

05/03/2022
by Junyi Li, et al.

Pretrained language models (PLMs) have made remarkable progress on text generation tasks via fine-tuning. However, fine-tuning PLMs in data-scarce situations is challenging, so it is non-trivial to develop a general and lightweight model that can adapt to various text generation tasks based on PLMs. Recent prompt-based learning offers a potential solution. In this paper, we improve this technique and propose a novel prompt-based method (PTG) for text generation in a transferable setting. First, PTG learns a set of source prompts for various source generation tasks, and then transfers these prompts as target prompts to perform target generation tasks. To consider both task- and instance-level information, we design an adaptive attention mechanism to derive the target prompts: for each data instance, PTG learns a specific target prompt by attending to highly relevant source prompts. In extensive experiments, PTG yields results competitive with or better than fine-tuning methods. We release our source prompts as an open resource that users can add to or reuse to improve new text generation tasks in future research. Code and data are available at https://github.com/RUCAIBox/Transfer-Prompts-for-Text-Generation.
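The abstract's adaptive attention mechanism can be illustrated with a minimal sketch. This is not the authors' implementation (PTG operates on learned prompt embeddings inside a PLM; the function names, the single-vector prompts, and the `alpha` blend of task- and instance-level queries here are all simplifying assumptions), but it shows the core idea: score each source prompt against a query built from both levels of information, then combine the sources by their attention weights to form an instance-specific target prompt.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def derive_target_prompt(instance_repr, task_repr, source_prompts, alpha=0.5):
    """Hypothetical sketch of attention-based prompt transfer.

    source_prompts: (num_sources, dim) -- one learned prompt per source task
    alpha: assumed blend weight between task- and instance-level queries
    """
    query = alpha * task_repr + (1 - alpha) * instance_repr   # both levels of information
    scores = source_prompts @ query / np.sqrt(len(query))     # scaled dot-product attention
    weights = softmax(scores)                                 # relevance of each source task
    return weights @ source_prompts                           # target prompt, shape (dim,)

# toy usage: 3 source tasks, 4-dimensional prompts
rng = np.random.default_rng(0)
prompts = rng.normal(size=(3, 4))
target = derive_target_prompt(rng.normal(size=4), rng.normal(size=4), prompts)
print(target.shape)  # (4,)
```

Because the weights come from a softmax, the target prompt is a convex combination of the source prompts, so highly relevant source tasks dominate while irrelevant ones are softly suppressed rather than hard-selected.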

