Attention Forcing for Machine Translation

04/02/2021
by Qingyun Dou, et al.

Auto-regressive sequence-to-sequence models with attention mechanisms have achieved state-of-the-art performance in various tasks, including Text-To-Speech (TTS) and Neural Machine Translation (NMT). The standard training approach, teacher forcing, guides a model with the reference output history. At the inference stage, the generated output history must be used instead. This mismatch between training and inference can degrade performance. However, training the model directly on its generated output is highly challenging. Several approaches have been proposed to address this problem, normally by selectively using the generated output history; to keep training stable, these approaches often require a heuristic schedule or an auxiliary classifier. This paper introduces attention forcing for NMT. This approach guides the model with the generated output history and the reference attention, and can reduce the training-inference mismatch without a schedule or a classifier. Attention forcing has been successful in TTS, but its application to NMT is more challenging, due to the discrete and multi-modal nature of the output space. To tackle this problem, this paper adds a selection scheme to vanilla attention forcing, which automatically selects a suitable training approach for each training pair. Experiments show that attention forcing can improve both the overall translation quality and the diversity of the translations.
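
To make the training procedure concrete, below is a minimal sketch of one attention-forcing step in PyTorch. All class and method names (encode, init_state, decode_step, the forced_attention argument) are hypothetical stand-ins, not the paper's code; the sketch assumes a decoder that exposes its per-step attention weights and can accept externally supplied attention weights when computing its context vector.

```python
# A hedged sketch of one attention-forcing training step, assuming a
# generic attention-based seq2seq model. Not the paper's implementation.
import torch
import torch.nn.functional as F

def attention_forcing_step(model_af, model_tf, src, tgt, gamma=1.0):
    """One training step of attention forcing (illustrative sketch).

    model_tf: frozen teacher-forcing model supplying reference attention
    model_af: the model being trained with attention forcing
    src: source token ids, shape (batch, src_len)
    tgt: reference target ids, shape (batch, tgt_len), tgt[:, 0] = BOS
    gamma: weight of the attention-alignment loss
    """
    enc_tf = model_tf.encode(src)
    enc_af = model_af.encode(src)
    state_tf = model_tf.init_state(enc_tf)
    state_af = model_af.init_state(enc_af)

    y_prev = tgt[:, 0]          # AF feeds back its own generated history
    nll, attn_loss = 0.0, 0.0
    for t in range(1, tgt.size(1)):
        # Teacher-forcing pass: reference history yields reference attention.
        with torch.no_grad():
            _, attn_ref, state_tf = model_tf.decode_step(
                tgt[:, t - 1], enc_tf, state_tf)

        # Attention-forcing pass: generated history as input, while the
        # reference attention is used to form the context vector; the
        # model's own attention (attn_af) is still produced for the loss.
        logits, attn_af, state_af = model_af.decode_step(
            y_prev, enc_af, state_af, forced_attention=attn_ref)

        # Token loss against the reference output.
        nll = nll + F.cross_entropy(logits, tgt[:, t])
        # KL term aligning the trained model's attention with the reference.
        attn_loss = attn_loss + F.kl_div(
            attn_af.clamp_min(1e-8).log(), attn_ref, reduction="batchmean")

        y_prev = logits.argmax(dim=-1)   # free-running output history

    return nll + gamma * attn_loss
```

The selection scheme described in the abstract would additionally decide, for each training pair, whether to apply this attention-forcing loss or fall back to standard teacher forcing; that per-pair selection logic is omitted from the sketch.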

Related research

09/26/2019  Attention Forcing for Sequence-to-sequence Model Training
11/06/2022  Parallel Attention Forcing for Machine Translation
02/06/2018  Decoding-History-Based Adaptive Control of Attention for Neural Machine Translation
04/08/2022  C-NMT: A Collaborative Inference Framework for Neural Machine Translation
11/06/2022  Deliberation Networks and How to Train Them
02/10/2015  Show, Attend and Tell: Neural Image Caption Generation with Visual Attention
11/26/2017  Learning to Remember Translation History with a Continuous Cache
