Sentence-wise Smooth Regularization for Sequence to Sequence Learning

12/12/2018
by Chengyue Gong et al.

Maximum-likelihood estimation (MLE) is widely used to train sequence-to-sequence models. It treats the generation/prediction of every target token uniformly as multi-class classification, and yields non-smooth prediction probabilities: within a target sequence, some tokens are predicted with small probabilities while others are predicted with large probabilities. Our empirical study finds that this non-smoothness of the probabilities lowers the quality of the generated sequences. In this paper, we propose a sentence-wise regularization method that aims to output smooth prediction probabilities for all the tokens in the target sequence. The method automatically adjusts the weight and gradient of each token in a sentence so that the predictions across the sequence are uniformly good. Experiments on three neural machine translation tasks and one text summarization task show that our method outperforms the conventional MLE loss on all of them and achieves promising BLEU scores on the WMT14 English-German and WMT17 Chinese-English translation tasks.
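The abstract does not give the exact form of the regularizer, so the following is only a minimal sketch of the idea, assuming the smoothness term penalizes the within-sentence variance of the target-token probabilities on top of the standard token-level MLE loss. The function name sentence_smooth_loss and the weight lam are hypothetical, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def sentence_smooth_loss(logits, targets, pad_id, lam=1.0):
    """Hypothetical sketch: token-level MLE plus a penalty on the
    within-sentence variance of the gold-token probabilities, so the
    model is pushed to predict every token in a sequence comparably well.

    logits:  (batch, seq_len, vocab) raw decoder scores
    targets: (batch, seq_len) gold token ids, padded with pad_id
    lam:     weight of the smoothness term (assumed hyperparameter)
    """
    log_probs = F.log_softmax(logits, dim=-1)
    # Log-probability the model assigns to each gold token.
    tok_logp = log_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    mask = (targets != pad_id).float()
    n_tok = mask.sum(dim=1).clamp(min=1.0)

    # Standard per-sentence negative log-likelihood (MLE).
    nll = -(tok_logp * mask).sum(dim=1) / n_tok

    # Smoothness penalty: variance of per-token probabilities in a sentence.
    tok_p = tok_logp.exp() * mask
    mean_p = tok_p.sum(dim=1) / n_tok
    var_p = (((tok_p - mean_p.unsqueeze(1)) ** 2) * mask).sum(dim=1) / n_tok

    return (nll + lam * var_p).mean()
```

Because the penalty depends on each token's probability relative to the sentence mean, its gradient effectively up-weights tokens predicted with unusually low probability, which matches the paper's stated goal of adjusting per-token weights and gradients within a sentence.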


