PLATO: Pre-trained Dialogue Generation Model with Discrete Latent Variable

10/17/2019
by Siqi Bao, et al.

Pre-trained models have proved effective for a wide range of natural language processing tasks. Inspired by this, we propose a novel dialogue generation pre-training framework to support various kinds of conversations, including chit-chat, knowledge-grounded dialogue, and conversational question answering. In this framework, we adopt flexible attention mechanisms to fully leverage the bi-directional context and the uni-directional characteristic of language generation. We also introduce discrete latent variables to tackle the inherent one-to-many mapping problem in response generation. Two reciprocal tasks, response generation and latent act recognition, are designed and carried out simultaneously within a shared network. Comprehensive experiments on three publicly available datasets verify the effectiveness and superiority of the proposed framework.
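As a concrete illustration of the flexible attention mechanism, the sketch below builds a UniLM-style self-attention mask that is bi-directional over the dialogue context and causal over the response. This is a minimal sketch of the general technique under stated assumptions, not code from the paper; the function name and the simple context/response split are illustrative. In PLATO itself, the embedding of the sampled discrete latent variable is prepended to the context span, so it sits in the bi-directionally visible region.

```python
import numpy as np

def build_attention_mask(context_len: int, response_len: int) -> np.ndarray:
    """Self-attention mask: bi-directional over the context,
    uni-directional (causal) over the response.

    mask[i, j] == 1 means position i may attend to position j.
    """
    total = context_len + response_len
    mask = np.zeros((total, total), dtype=np.int8)

    # Context positions see every context position (bi-directional).
    mask[:context_len, :context_len] = 1

    # Response positions see the whole context...
    mask[context_len:, :context_len] = 1

    # ...but only themselves and earlier response positions (causal).
    mask[context_len:, context_len:] = np.tril(
        np.ones((response_len, response_len), dtype=np.int8)
    )
    return mask

if __name__ == "__main__":
    # 3 context tokens (e.g. latent token + dialogue history), 4 response tokens.
    print(build_attention_mask(context_len=3, response_len=4))
```

With such a mask, each response token conditions on the full context and on previously generated tokens only, while the context encoding itself remains fully bi-directional; this is what lets one shared network serve both understanding-style and generation-style objectives.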


Related research

09/19/2022 · Semantic-based Pre-training for Dialogue Understanding
Pre-trained language models have made great progress on dialogue tasks. ...

09/20/2021 · PLATO-XL: Exploring the Large-scale Pre-training of Dialogue Generation
To explore the limit of dialogue generation pre-training, we present the...

01/26/2020 · ERNIE-GEN: An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation
Current pre-training works in natural language generation pay little att...

04/27/2022 · DialogVED: A Pre-trained Latent Variable Encoder-Decoder Model for Dialog Response Generation
Dialog response generation in open domain is an important research topic...

04/14/2020 · PALM: Pre-training an Autoencoding Autoregressive Language Model for Context-conditioned Generation
Self-supervised pre-training has emerged as a powerful technique for nat...

02/22/2023 · Guiding Large Language Models via Directional Stimulus Prompting
We introduce a new framework, Directional Stimulus Prompting, that uses ...

07/01/2023 · BatGPT: A Bidirectional Autoregessive Talker from Generative Pre-trained Transformer
BatGPT is a large-scale language model designed and trained jointly by W...
