Incorporating BERT into Neural Machine Translation

02/17/2020
by Jinhua Zhu, et al.

The recently proposed BERT has shown great power on a variety of natural language understanding tasks, such as text classification and reading comprehension. However, how to effectively apply BERT to neural machine translation (NMT) remains insufficiently explored. While BERT is more commonly fine-tuned for downstream language understanding tasks rather than used as a source of contextual embeddings, our preliminary experiments on NMT show that using BERT as a contextual embedding outperforms fine-tuning it. This motivates us to explore how to better leverage BERT for NMT along this direction. We propose a new algorithm, named the BERT-fused model, in which we first use BERT to extract representations for an input sequence, and then fuse these representations into each layer of the NMT model's encoder and decoder through attention mechanisms. We conduct experiments on supervised (including sentence-level and document-level), semi-supervised, and unsupervised machine translation, and achieve state-of-the-art results on seven benchmark datasets. Our code is available at <https://github.com/bert-nmt/bert-nmt>.
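The fusion described above can be sketched as a single encoder layer in PyTorch: each NMT layer runs its usual self-attention and, in parallel, an extra attention whose keys and values come from the frozen BERT output, then combines the two branches. This is a minimal illustrative sketch, not the paper's exact implementation; the dimensions, the equal-weight averaging of the two branches, and all layer names are assumptions.

```python
import torch
import torch.nn as nn

class BertFusedEncoderLayer(nn.Module):
    """One encoder layer that fuses BERT representations via an extra
    BERT-encoder attention branch, averaged with standard self-attention.
    Hyperparameters here are illustrative, not the paper's configuration."""

    def __init__(self, d_model=512, bert_dim=768, nhead=8, d_ff=2048):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        # BERT-encoder attention: queries come from the NMT hidden states,
        # keys/values come from the (differently sized) BERT output.
        self.bert_attn = nn.MultiheadAttention(d_model, nhead,
                                               kdim=bert_dim, vdim=bert_dim,
                                               batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(),
                                 nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, h, bert_out):
        s, _ = self.self_attn(h, h, h)                # standard self-attention
        b, _ = self.bert_attn(h, bert_out, bert_out)  # attend to BERT output
        h = self.norm1(h + 0.5 * (s + b))             # average the two branches
        return self.norm2(h + self.ffn(h))            # position-wise FFN

# Toy usage: batch of 2, source length 5, BERT sequence length 7.
layer = BertFusedEncoderLayer()
h = torch.randn(2, 5, 512)        # NMT encoder hidden states
bert_out = torch.randn(2, 7, 768) # precomputed BERT representations
out = layer(h, bert_out)
print(out.shape)  # torch.Size([2, 5, 512])
```

The decoder side would add the same BERT-attention branch alongside its encoder-decoder attention; BERT itself stays fixed, so its representations can be computed once per input sentence.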


