Confidence Based Bidirectional Global Context Aware Training Framework for Neural Machine Translation

02/28/2022
by   Chulun Zhou, et al.

Most dominant neural machine translation (NMT) models are restricted to making predictions based only on the local context of preceding words, in a left-to-right manner. Although many previous studies have tried to incorporate global information into NMT models, limitations remain in how to effectively exploit bidirectional global context. In this paper, we propose a Confidence Based Bidirectional Global Context Aware (CBBGCA) training framework for NMT, in which the NMT model is jointly trained with an auxiliary conditional masked language model (CMLM). The training consists of two stages: (1) multi-task joint training and (2) confidence-based knowledge distillation. In the first stage, by sharing encoder parameters, the NMT model receives additional supervision from the CMLM decoder, which captures bidirectional global context. In the second stage, using the CMLM as teacher, we further incorporate bidirectional global context into the NMT model, specifically targeting its unconfidently-predicted target words via knowledge distillation. Experimental results show that our proposed CBBGCA training framework significantly improves the NMT model, by +1.02, +1.30 and +0.57 BLEU on three large-scale translation datasets, namely WMT'14 English-to-German, WMT'19 Chinese-to-English and WMT'14 English-to-French, respectively.
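The second stage described above can be illustrated with a minimal sketch of a confidence-based distillation loss. This is an assumption-laden illustration, not the paper's actual implementation: the confidence `threshold`, the use of KL divergence for the teacher term, and the simple per-position switch between cross-entropy and distillation are all hypothetical choices standing in for details the abstract does not specify.

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the vocabulary dimension.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cbbgca_loss_sketch(nmt_logits, cmlm_logits, targets, threshold=0.5):
    """Hedged sketch of confidence-based knowledge distillation.

    Positions where the NMT model assigns the gold token a probability
    below `threshold` (i.e. unconfident predictions) are trained toward
    the CMLM teacher distribution via KL divergence; confident positions
    keep the standard cross-entropy loss.  `threshold` and the exact loss
    mixing are illustrative assumptions, not values from the paper.

    Shapes: nmt_logits/cmlm_logits (seq_len, vocab), targets (seq_len,).
    """
    p_nmt = softmax(nmt_logits)
    p_cmlm = softmax(cmlm_logits)
    positions = np.arange(len(targets))
    gold_prob = p_nmt[positions, targets]
    unconfident = gold_prob < threshold
    # Standard negative log-likelihood per position.
    ce = -np.log(gold_prob + 1e-9)
    # KL(teacher || student) per position, encouraging the NMT model to
    # match the bidirectional CMLM distribution where it is unconfident.
    kd = (p_cmlm * (np.log(p_cmlm + 1e-9) - np.log(p_nmt + 1e-9))).sum(axis=-1)
    per_position = np.where(unconfident, kd, ce)
    return per_position.mean()
```

As a usage example, calling `cbbgca_loss_sketch` with a high `threshold` routes every position through the distillation term, while a low threshold reduces it to ordinary cross-entropy training.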


Related research

- 04/14/2017, Exploiting Cross-Sentence Context for Neural Machine Translation: In translation, considering the document as a whole can help to resolve ...
- 01/16/2018, Asynchronous Bidirectional Decoding for Neural Machine Translation: The dominant neural machine translation (NMT) models apply unified atten...
- 12/16/2019, Iterative Dual Domain Adaptation for Neural Machine Translation: Previous studies on the domain adaptation for neural machine translation...
- 10/04/2017, Enhanced Neural Machine Translation by Learning from Draft: Neural machine translation (NMT) has recently achieved impressive result...
- 06/12/2021, Guiding Teacher Forcing with Seer Forcing for Neural Machine Translation: Although teacher forcing has become the main training paradigm for neura...
- 11/25/2019, Learning to Reuse Translations: Guiding Neural Machine Translation with Examples: In this paper, we study the problem of enabling neural machine translati...
- 03/06/2022, Conditional Bilingual Mutual Information Based Adaptive Training for Neural Machine Translation: Token-level adaptive training approaches can alleviate the token imbalan...
