Incorporating Syntactic Uncertainty in Neural Machine Translation with Forest-to-Sequence Model

11/19/2017
by   Poorya Zaremoodi, et al.

Incorporating syntactic information into Neural Machine Translation models is one way to compensate for their need for large amounts of parallel training text, especially for low-resource language pairs. Previous work using syntactic information provided by (inevitably error-prone) parsers has been promising. In this paper, we propose a forest-to-sequence attentional Neural Machine Translation model that makes use of exponentially many parse trees of the source sentence to compensate for parser errors. Our method represents the collection of parse trees as a packed forest and learns a neural attentional transduction model from the forest to the target sentence. Experiments on English-to-German, English-to-Chinese, and English-to-Persian translation show the superiority of our method over tree-to-sequence and vanilla sequence-to-sequence neural translation models.
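To illustrate the key data structure, here is a minimal sketch of a packed forest: each node spans part of the source sentence and may have several alternative decompositions (hyperedges), so exponentially many parse trees share subtrees in one compact structure. The names (`ForestNode`, `count_trees`) and the toy PP-attachment example are illustrative assumptions, not taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class ForestNode:
    label: str
    # Each hyperedge is one alternative way to build this node from children.
    hyperedges: list = field(default_factory=list)  # list[list[ForestNode]]

def count_trees(node: ForestNode) -> int:
    """Number of distinct parse trees rooted at `node`."""
    if not node.hyperedges:  # leaf node (a source word)
        return 1
    # Sum over alternative decompositions; multiply over children within one.
    total = 0
    for children in node.hyperedges:
        prod = 1
        for child in children:
            prod *= count_trees(child)
        total += prod
    return total

# Toy ambiguity in the style of "saw the man with a telescope":
w = [ForestNode(t) for t in ["saw", "man", "with", "telescope"]]
np1 = ForestNode("NP", [[w[1]]])
pp = ForestNode("PP", [[w[2], w[3]]])
np2 = ForestNode("NP", [[np1, pp]])        # "man with telescope"
vp = ForestNode("VP", [[w[0], np2],        # PP attaches to the noun
                       [w[0], np1, pp]])   # PP attaches to the verb
print(count_trees(vp))  # → 2 trees packed in one forest
```

An attentional forest-to-sequence decoder would then attend over encodings of all forest nodes rather than committing to a single 1-best tree, letting the model weigh both attachments.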


