Sequence Level Training with Recurrent Neural Networks

11/20/2015
by Marc'Aurelio Ranzato, et al.

Many natural language processing applications use language models to generate text. These models are typically trained to predict the next word in a sequence, given the previous words and some context such as an image. However, at test time the model is expected to generate the entire sequence from scratch. This discrepancy makes generation brittle, as errors may accumulate along the way. We address this issue by proposing a novel sequence level training algorithm that directly optimizes the metric used at test time, such as BLEU or ROUGE. On three different tasks, our approach outperforms several strong baselines for greedy generation. The method is also competitive when these baselines employ beam search, while being several times faster.
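The idea described above can be sketched as a REINFORCE-style surrogate loss: sample a sequence from the model, score the whole sequence with the test-time metric, and scale the sequence's log-probability by the (baseline-subtracted) reward. This is a minimal toy sketch, not the paper's full algorithm (MIXER additionally mixes in cross-entropy training); `toy_reward` is a stand-in unigram-overlap score, not actual BLEU or ROUGE.

```python
import math
import random

def toy_reward(candidate, reference):
    # Stand-in sequence-level metric: fraction of candidate tokens
    # that appear in the reference (a crude proxy for BLEU/ROUGE).
    if not candidate:
        return 0.0
    matches = sum(1 for tok in candidate if tok in reference)
    return matches / len(candidate)

def reinforce_loss(log_probs, reward, baseline=0.0):
    # REINFORCE surrogate: minimizing this increases the log-probability
    # of sampled sequences whose reward exceeds the baseline, and
    # decreases it for sequences below the baseline.
    advantage = reward - baseline
    return -advantage * sum(log_probs)

# Toy "model": a fixed categorical distribution over a tiny vocabulary.
random.seed(0)
vocab_probs = {"the": 0.4, "cat": 0.3, "sat": 0.2, "dog": 0.1}
reference = ["the", "cat", "sat"]

sampled, log_probs = [], []
for _ in range(3):
    tok = random.choices(list(vocab_probs), weights=vocab_probs.values())[0]
    sampled.append(tok)
    log_probs.append(math.log(vocab_probs[tok]))

r = toy_reward(sampled, reference)
loss = reinforce_loss(log_probs, r, baseline=0.5)
```

The key contrast with next-word (cross-entropy) training is that the reward here is computed on the entire generated sequence, so the training signal matches what is measured at test time.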


Related research

06/09/2016  Sequence-to-Sequence Learning as Beam-Search Optimization
Sequence-to-Sequence (seq2seq) modeling has rapidly become an important ...

02/04/2021  Incremental Beam Manipulation for Natural Language Generation
The performance of natural language generation systems has improved subs...

08/31/2018  Nightmare at test time: How punctuation prevents parsers from generalizing
Punctuation is a strong indicator of syntactic structure, and parsers tr...

05/26/2019  TIGS: An Inference Algorithm for Text Infilling with Gradient Search
Text infilling is defined as a task for filling in the missing part of a...

01/22/2021  k-Neighbor Based Curriculum Sampling for Sequence Prediction
Multi-step ahead prediction in language models is challenging due to the...

06/11/2019  Parallel Scheduled Sampling
Auto-regressive models are widely used in sequence generation problems. ...

09/16/2018  Curriculum-Based Neighborhood Sampling For Sequence Prediction
The task of multi-step ahead prediction in language models is challengin...
