
Assessing the Unitary RNN as an End-to-End Compositional Model of Syntax

08/11/2022
by Jean-Philippe Bernardy, et al.

We show that both an LSTM and a unitary-evolution recurrent neural network (URN) can achieve encouraging accuracy on two types of syntactic patterns: context-free long-distance agreement, and mildly context-sensitive cross-serial dependencies. This work extends recent experiments on deeply nested context-free long-distance dependencies, with similar results. URNs differ from LSTMs in that they avoid non-linear activation functions, and they apply matrix multiplication to word embeddings encoded as unitary matrices. This permits them to retain all information in the processing of an input string over arbitrary distances. It also causes them to satisfy strict compositionality. URNs constitute a significant advance in the search for explainable models in deep learning applied to NLP.

