Compositional Neural Machine Translation by Removing the Lexicon from Syntax

02/06/2020
by Tristan Thrush, et al.

The meaning of a natural language utterance is largely determined by its syntax and its words. Moreover, there is evidence that humans process an utterance by separating knowledge about the lexicon from knowledge about syntax. Theories from semantics and neuroscience claim that complete word meanings are not encoded in the representation of syntax. In this paper, we propose neural units that can enforce this constraint over an LSTM encoder and decoder. We demonstrate that our model achieves competitive performance across a variety of domains, including semantic parsing, syntactic parsing, and English-to-Mandarin-Chinese translation; in these settings, it outperforms the standard LSTM encoder-decoder architecture on many or all of our metrics. To show that our model achieves the desired separation between the lexicon and syntax, we analyze its weights and explore its behavior when different neural modules are damaged. When damaged, the model displays the knowledge distortions that are observed in human aphasics.
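The separation the abstract describes can be illustrated with a deliberately simplified toy sketch (not the paper's actual LSTM model): translation is factored into a "syntax" channel that only ever sees word categories, and a "lexicon" channel that maps individual words. All names, the category tags, and the mock Spanish-like lexicon below are hypothetical and exist only to make the idea concrete.

```python
# Hypothetical toy sketch of lexicon/syntax separation, NOT the paper's model.
# The "syntax" channel sees only abstract category symbols, never word
# identities; the "lexicon" channel maps words independently of structure.

# Toy lexicon channel: source content words -> target words (invented pairs).
LEXICON = {"cat": "gato", "dog": "perro", "sees": "ve", "chases": "persigue"}

# Closed-class function words, handled outside the content lexicon.
FUNCTION_WORDS = {"the": "el"}

# Toy tagger: content words are abstracted to category symbols before the
# syntax channel ever sees them.
CATEGORIES = {"cat": "N", "dog": "N", "sees": "V", "chases": "V", "the": "DET"}

# Toy "syntax model": maps a source category frame to a target frame.
# A real model would learn this mapping with an LSTM; here it is a lookup.
SYNTAX_RULES = {("DET", "N", "V", "DET", "N"): ("DET", "N", "V", "DET", "N")}

def translate(sentence):
    words = sentence.split()
    frame = tuple(CATEGORIES[w] for w in words)
    target_frame = SYNTAX_RULES[frame]  # syntax channel: categories only
    # Fill each target slot from the appropriate channel: function words
    # from the closed class, content words from the lexicon channel.
    out = []
    for cat, w in zip(target_frame, words):
        out.append(FUNCTION_WORDS[w] if cat == "DET" else LEXICON[w])
    return " ".join(out)

print(translate("the dog chases the cat"))  # -> el perro persigue el gato
```

Because word identity never reaches the syntax channel, "damaging" one channel in this sketch distorts only one kind of knowledge: corrupting LEXICON yields fluent but wrong words, while corrupting SYNTAX_RULES yields the right words in broken structure, loosely mirroring the aphasia analogy in the abstract.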

Related research

- Improved Neural Machine Translation with a Syntax-Aware Encoder and Decoder (07/18/2017)
  Most neural machine translation (NMT) models are based on the sequential...

- Semantic Parsing: Syntactic Assurance to Target Sentence Using LSTM Encoder CFG-Decoder (07/18/2018)
  Semantic parsing can be defined as the process of mapping natural langua...

- Recurrent Graph Syntax Encoder for Neural Machine Translation (08/19/2019)
  Syntax-incorporated machine translation models have been proven successf...

- Graph Convolutional Encoders for Syntax-aware Neural Machine Translation (04/15/2017)
  We present a simple and effective approach to incorporating syntactic st...

- Neural Machine Translation with Dynamic Graph Convolutional Decoder (05/28/2023)
  Existing wisdom demonstrates the significance of syntactic knowledge for...

- Scheduled Multi-Task Learning: From Syntax to Translation (04/24/2018)
  Neural encoder-decoder models of machine translation have achieved impre...

- Learn to Compose Syntactic and Semantic Representations Appropriately for Compositional Generalization (05/20/2023)
  Recent studies have shown that sequence-to-sequence (Seq2Seq) models are...