Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations

05/08/2019
by Meishan Zhang, et al.

Syntax has been demonstrated to be highly effective in neural machine translation (NMT). Previous NMT models integrate syntax by explicitly representing the 1-best tree outputs of a well-trained parsing system, e.g., via the representative Tree-RNN and Tree-Linearization methods, which may suffer from error propagation. In this work, we propose a novel method to integrate source-side syntax implicitly into NMT. The basic idea is to use the intermediate hidden representations of a well-trained end-to-end dependency parser, which we refer to as syntax-aware word representations (SAWRs). We then simply concatenate these SAWRs with ordinary word embeddings to enhance basic NMT models. The method can be straightforwardly integrated into the widely used sequence-to-sequence (Seq2Seq) NMT models. We start with a representative RNN-based Seq2Seq baseline system and test the effectiveness of the proposed method on two benchmark datasets for the Chinese-English and English-Vietnamese translation tasks, respectively. Experimental results show that the proposed approach brings significant BLEU score improvements over the baseline on both datasets: 1.74 points for Chinese-English translation and 0.80 points for English-Vietnamese translation. In addition, the approach also outperforms the explicit Tree-RNN and Tree-Linearization methods.
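
The abstract describes the core idea only in prose, so the minimal PyTorch sketch below illustrates how syntax-aware word representations (SAWRs) could be concatenated with ordinary word embeddings before the NMT encoder. It is written under my own assumptions: the abstract only specifies an RNN-based Seq2Seq baseline, and the names here (SAWREncoder, parser_encoder, sawr_dim) are hypothetical, not the authors' code.

```python
# Minimal sketch of the SAWR idea: hidden states of a pre-trained dependency
# parser are concatenated with word embeddings and fed to the NMT encoder.
# All module and dimension names are illustrative assumptions.
import torch
import torch.nn as nn

class SAWREncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, sawr_dim, hidden_dim, parser_encoder):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Pre-trained parser encoder (e.g., the BiLSTM of a dependency parser).
        # Frozen here so only its representations are reused (an assumption;
        # the abstract does not say whether the parser is fine-tuned).
        self.parser_encoder = parser_encoder
        for p in self.parser_encoder.parameters():
            p.requires_grad = False
        # Encoder RNN of the Seq2Seq model, taking the concatenated input.
        self.rnn = nn.GRU(emb_dim + sawr_dim, hidden_dim,
                          batch_first=True, bidirectional=True)

    def forward(self, src_tokens):
        word_emb = self.embed(src_tokens)                # (batch, seq_len, emb_dim)
        with torch.no_grad():
            sawr = self.parser_encoder(src_tokens)       # (batch, seq_len, sawr_dim)
        enc_input = torch.cat([word_emb, sawr], dim=-1)  # simple concatenation
        outputs, final_state = self.rnn(enc_input)
        return outputs, final_state
```

Because the syntax signal enters only through the parser's continuous hidden states rather than a discrete 1-best tree, no explicit tree structure is consumed at translation time, which is the sense in which the integration is "implicit."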

Related Research

Towards String-to-Tree Neural Machine Translation (04/16/2017)
We present a simple method to incorporate syntactic information about th...

Modeling Source Syntax for Neural Machine Translation (05/02/2017)
Even though a linguistics-free sequence to sequence model in neural mach...

Top-down Tree Structured Decoding with Syntactic Connections for Neural Machine Translation and Parsing (09/06/2018)
The addition of syntax-aware decoding in Neural Machine Translation (NMT...

Modeling Voting for System Combination in Machine Translation (07/14/2020)
System combination is an important technique for combining the hypothese...

Encodings of Source Syntax: Similarities in NMT Representations Across Target Languages (05/17/2020)
We train neural machine translation (NMT) models from English to six tar...

Multi-representation Ensembles and Delayed SGD Updates Improve Syntax-based NMT (05/01/2018)
We explore strategies for incorporating target syntax into Neural Machin...

A Tree-based Decoder for Neural Machine Translation (08/28/2018)
Recent advances in Neural Machine Translation (NMT) show that adding syn...
