Recurrent Graph Syntax Encoder for Neural Machine Translation

08/19/2019
by Liang Ding, et al.

Syntax-incorporated machine translation models have proven successful in improving a model's reasoning and meaning-preservation abilities. In this paper, we propose a simple yet effective graph-structured encoder, the Recurrent Graph Syntax Encoder, dubbed RGSE, which enhances the ability to capture useful syntactic information. RGSE operates on top of a standard encoder (recurrent or self-attentional), treating recurrent network units as graph nodes and injecting syntactic dependencies as edges, so that it models syntactic dependencies and sequential information (i.e., word order) simultaneously. Our approach achieves considerable improvements over several syntax-aware NMT models on English→German and English→Czech translation tasks, and an RGSE-equipped big model obtains results competitive with the state-of-the-art model on the WMT14 En-De task. Extensive analysis further verifies that RGSE benefits long-sentence modeling and produces better translations.
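To make the architecture concrete, here is a minimal PyTorch sketch of one way to realize the idea the abstract describes: a standard recurrent encoder supplies word-order information, its hidden states serve as graph nodes, and dependency arcs inject syntactic context. This is a hypothetical illustration, not the authors' code; the class name `RecurrentGraphSyntaxEncoder`, the mean aggregation over dependency neighbours, and the gated fusion are all assumptions made for this sketch.

```python
# Hypothetical sketch of the idea in the abstract, NOT the authors' RGSE:
# run a recurrent encoder, treat its hidden states as graph nodes, and
# let syntactic dependency edges inject extra context into each node.
import torch
import torch.nn as nn

class RecurrentGraphSyntaxEncoder(nn.Module):
    def __init__(self, vocab_size, d_model=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Standard bidirectional recurrent encoder captures word order.
        self.rnn = nn.GRU(d_model, d_model // 2, batch_first=True,
                          bidirectional=True)
        # Gate deciding how much syntactic context to mix in per position.
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, tokens, dep_adj):
        # tokens:  (batch, seq_len) word ids
        # dep_adj: (batch, seq_len, seq_len) float 0/1 dependency adjacency
        h, _ = self.rnn(self.embed(tokens))           # (B, T, d_model)
        # Each node averages the states of its dependency neighbours.
        deg = dep_adj.sum(-1, keepdim=True).clamp(min=1)
        syn = dep_adj @ h / deg                       # (B, T, d_model)
        g = torch.sigmoid(self.gate(torch.cat([h, syn], dim=-1)))
        return g * h + (1 - g) * syn                  # fused representation

# Example usage with a random batch and a single dependency arc:
enc = RecurrentGraphSyntaxEncoder(vocab_size=1000)
tokens = torch.randint(0, 1000, (2, 8))
dep_adj = torch.zeros(2, 8, 8)
dep_adj[:, 1, 3] = dep_adj[:, 3, 1] = 1.0   # an arc between words 1 and 3
out = enc(tokens, dep_adj)                   # (2, 8, 256)
```

The gate lets the model weigh sequential against syntactic evidence at each position, which mirrors the abstract's claim that RGSE models both signals simultaneously.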


Related research:

- 09/06/2018: Top-down Tree Structured Decoding with Syntactic Connections for Neural Machine Translation and Parsing. "The addition of syntax-aware decoding in Neural Machine Translation (NMT..."
- 07/18/2017: Improved Neural Machine Translation with a Syntax-Aware Encoder and Decoder. "Most neural machine translation (NMT) models are based on the sequential..."
- 04/15/2017: Graph Convolutional Encoders for Syntax-aware Neural Machine Translation. "We present a simple and effective approach to incorporating syntactic st..."
- 02/06/2020: Compositional Neural Machine Translation by Removing the Lexicon from Syntax. "The meaning of a natural language utterance is largely determined from i..."
- 07/08/2019: An Intrinsic Nearest Neighbor Analysis of Neural Machine Translation Architectures. "Earlier approaches indirectly studied the information captured by the hi..."
- 10/24/2019: Promoting the Knowledge of Source Syntax in Transformer NMT Is Not Needed. "The utility of linguistic annotation in neural machine translation seeme..."
- 05/23/2022: Neural Subgraph Explorer: Reducing Noisy Information via Target-Oriented Syntax Graph Pruning. "Recent years have witnessed the emerging success of leveraging syntax gr..."
