Generating Diverse Translation by Manipulating Multi-Head Attention

11/21/2019
by   Zewei Sun, et al.

The Transformer model has been widely used for machine translation tasks and has obtained state-of-the-art results. In this paper, we report an interesting phenomenon in its encoder-decoder multi-head attention: different attention heads of the final decoder layer align to different word translation candidates. We empirically verify this discovery and propose a method to generate diverse translations by manipulating heads. Furthermore, we make use of these diverse translations with the back-translation technique for better data augmentation. Experimental results show that our method generates diverse translations without a severe drop in translation quality. Experiments also show that back-translation with these diverse translations brings significant performance improvements on translation tasks. An auxiliary experiment on a conversation response generation task demonstrates the effect of diversity as well.
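The core observation above — that individual cross-attention heads in the final decoder layer can align to different translation candidates, and that zeroing out all but one head steers the output — can be illustrated with a toy numpy sketch. This is not the paper's actual code; the source sentence, attention weights, and helper names (`head_alignments`, `mask_heads`) are hypothetical, chosen only to show the mechanism.

```python
import numpy as np

# Toy sketch (hypothetical data, not from the paper): each of H cross-attention
# heads in the final decoder layer yields its own alignment distribution over
# the source tokens at a given target step. Different heads may prefer
# different source words, i.e. different translation candidates.

def head_alignments(attn, src_tokens):
    """attn: (H, src_len) attention weights for one target step, one row per head.
    Returns the source token each head attends to most strongly."""
    return [src_tokens[int(np.argmax(a))] for a in attn]

def mask_heads(attn, keep):
    """Zero out every head except `keep`, then renormalise the surviving row.
    This mimics manipulating heads so the decoder follows one head's alignment."""
    masked = np.zeros_like(attn)
    masked[keep] = attn[keep]
    totals = masked.sum(axis=-1, keepdims=True)
    return np.where(totals > 0, masked / np.where(totals == 0, 1.0, totals), masked)

src = ["das", "Haus", "ist", "gross"]          # hypothetical German source
attn = np.array([[0.1, 0.7, 0.1, 0.1],        # head 0 mostly attends to "Haus"
                 [0.1, 0.1, 0.2, 0.6]])       # head 1 mostly attends to "gross"

print(head_alignments(attn, src))   # each head's preferred source word
print(mask_heads(attn, keep=0))     # only head 0 survives; its row is renormalised
```

Decoding once per retained head, under this view, yields a set of translations that differ where the heads disagree, which is what makes the outputs useful as diverse back-translation data.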

Related research

- 12/12/2019 — Two Way Adversarial Unsupervised Word Translation: Word translation is a problem in machine translation that seeks to build...
- 09/08/2021 — Mixup Decoding for Diverse Machine Translation: Diverse machine translation aims at generating various target language t...
- 03/16/2021 — Gumbel-Attention for Multi-modal Machine Translation: Multi-modal machine translation (MMT) improves translation quality by in...
- 10/16/2020 — Generating Diverse Translation from Model Distribution with Dropout: Despite the improvement of translation quality, neural machine translati...
- 12/03/2018 — Learning Multimodal Graph-to-Graph Translation for Molecular Optimization: We view molecular optimization as a graph-to-graph translation problem. ...
- 02/04/2023 — Greedy Ordering of Layer Weight Matrices in Transformers Improves Translation: Prior work has attempted to understand the internal structures and funct...
- 04/22/2020 — DeepSubQE: Quality estimation for subtitle translations: Quality estimation (QE) for tasks involving language data is hard owing ...
