CoNT: Contrastive Neural Text Generation

05/29/2022
by Chenxin An et al.

Recently, contrastive learning has attracted increasing interest in neural text generation as a new way to alleviate the exposure bias problem. It introduces a sequence-level training signal, which is crucial for generation tasks that rely on auto-regressive decoding. However, previous methods that apply contrastive learning to neural text generation usually yield inferior performance. In this paper, we analyse the underlying reasons and propose a new Contrastive Neural Text generation framework, CoNT. CoNT addresses the bottlenecks that prevent contrastive learning from being widely adopted in generation tasks from three aspects: the construction of contrastive examples, the choice of the contrastive loss, and the decoding strategy. We validate CoNT on five generation tasks with ten benchmarks, covering machine translation, summarization, code comment generation, data-to-text generation, and commonsense generation. Experimental results show that CoNT clearly outperforms the conventional training framework on all ten benchmarks by a convincing margin. In particular, CoNT surpasses the previous most competitive contrastive learning method for text generation by 1.50 BLEU on machine translation and 1.77 ROUGE-1 on summarization. It achieves a new state of the art on summarization, code comment generation (without external data), and data-to-text generation.
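Since the full paper sits behind the link below, the following is only a minimal sketch of what a sequence-level contrastive objective of this general kind might look like. The function name, the rank-scaled margin scheme, and the use of cosine similarity over pooled encoder/decoder states are illustrative assumptions for this sketch, not CoNT's exact formulation.

```python
import torch
import torch.nn.functional as F

def sequence_contrastive_loss(src_emb: torch.Tensor,
                              cand_embs: torch.Tensor,
                              margin: float = 0.01) -> torch.Tensor:
    """Pairwise margin ranking loss over candidate sequences.

    src_emb:   (d,) pooled encoder representation of the source.
    cand_embs: (k, d) pooled decoder representations of k candidate
               sequences (e.g., the gold reference plus beam-search
               samples), assumed pre-sorted from best to worst by a
               sequence-level metric such as BLEU.
    """
    # Similarity between the source and each candidate sequence.
    sims = F.cosine_similarity(src_emb.unsqueeze(0), cand_embs, dim=-1)  # (k,)

    loss = src_emb.new_zeros(())
    k = sims.size(0)
    for i in range(k):
        for j in range(i + 1, k):
            # A better-ranked candidate should sit closer to the source;
            # the required margin grows with the rank gap (j - i).
            loss = loss + F.relu(sims[j] - sims[i] + margin * (j - i))
    return loss
```

Under the same assumptions, the decoding-strategy change mentioned in the abstract could amount to re-ranking beam candidates by this source-candidate similarity (possibly combined with the model's log-likelihood) rather than by likelihood alone.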

READ FULL TEXT

Related research

- Modern Methods for Text Generation (09/10/2020): Synthetic text generation is challenging and has limited success. Recent...
- Contrastive Learning with Adversarial Perturbations for Conditional Text Generation (12/14/2020): Recently, sequence-to-sequence (seq2seq) models with the Transformer arc...
- SimCLS: A Simple Framework for Contrastive Learning of Abstractive Summarization (06/03/2021): In this paper, we present a conceptually simple while empirically powerf...
- Semantic-aware Contrastive Learning for Electroencephalography-to-Text Generation with Curriculum Learning (01/23/2023): Electroencephalography-to-Text generation (EEG-to-Text), which aims to d...
- Click: Controllable Text Generation with Sequence Likelihood Contrastive Learning (06/06/2023): It has always been an important yet challenging problem to control langu...
- Text Generation by Learning from Off-Policy Demonstrations (09/16/2020): Current approaches to text generation largely rely on autoregressive mod...
- Residual Energy-Based Models for Text Generation (04/22/2020): Text generation is ubiquitous in many NLP tasks, from summarization, to ...
