Scheduled Multi-Task Learning: From Syntax to Translation

04/24/2018
by Eliyahu Kiperwasser, et al.

Neural encoder-decoder models of machine translation have achieved impressive results, learning linguistic knowledge of both the source and target languages in an implicit, end-to-end manner. We propose a scheduled multi-task learning framework in which the model begins by learning syntax and translation in an interleaved fashion and gradually shifts its focus to translation alone. Using this approach, we achieve considerable BLEU improvements both on a relatively large parallel corpus (WMT14 English-to-German) and in a low-resource setup (WIT German-to-English).
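The scheduled interleaving described in the abstract can be sketched in a few lines. The code below is a minimal illustration under stated assumptions, not the authors' implementation: the linear decay schedule, the parsing_loss/translation_loss methods, and the optimizer_step wrapper are all hypothetical names introduced for the example; the paper's point is only that updates shift gradually from the auxiliary syntax task to translation.

```python
import random

def syntax_share(step, total_steps, decay_frac=0.5):
    """Fraction of updates devoted to the auxiliary syntax task.

    Starts near 1.0 and decays linearly to 0.0 over the first
    `decay_frac` of training; afterwards every update trains
    translation. (The exact schedule shape is an assumption made
    for this sketch.)
    """
    horizon = decay_frac * total_steps
    return max(0.0, 1.0 - step / horizon)

def train(model, parse_batches, mt_batches, total_steps):
    """Scheduled multi-task loop: interleave source-side parsing and
    translation updates, gradually devoting all updates to translation."""
    for step in range(total_steps):
        if random.random() < syntax_share(step, total_steps):
            loss = model.parsing_loss(next(parse_batches))       # hypothetical API
        else:
            loss = model.translation_loss(next(mt_batches))      # hypothetical API
        loss.backward()          # assumes a PyTorch-style autograd interface
        model.optimizer_step()   # hypothetical optimizer wrapper
```

Because the syntax and translation tasks share the encoder, the early parsing updates shape source-side representations that the translation decoder can later exploit; the schedule only controls how often each objective is sampled.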


Related research

01/05/2016 - Multi-Source Neural Translation
We build a multi-source machine translation model and train it to maximi...

09/11/2015 - Verbs Taking Clausal and Non-Finite Arguments as Signals of Modality - Revisiting the Issue of Meaning Grounded in Syntax
We revisit Levin's theory about the correspondence of verb meaning and s...

10/22/2020 - CUNI Systems for the Unsupervised and Very Low Resource Translation Task in WMT20
This paper presents a description of CUNI systems submitted to the WMT20...

02/13/2018 - Structured-based Curriculum Learning for End-to-end English-Japanese Speech Translation
Sequence-to-sequence attentional-based neural network architectures have...

01/06/2016 - Incorporating Structural Alignment Biases into an Attentional Neural Translation Model
Neural encoder-decoder models of machine translation have achieved impre...

02/06/2020 - Compositional Neural Machine Translation by Removing the Lexicon from Syntax
The meaning of a natural language utterance is largely determined from i...
