Assessing the Unitary RNN as an End-to-End Compositional Model of Syntax

08/11/2022
by Jean-Philippe Bernardy, et al.

We show that both an LSTM and a unitary-evolution recurrent neural network (URN) can achieve encouraging accuracy on two types of syntactic pattern: context-free long-distance agreement, and mildly context-sensitive cross-serial dependencies. This work extends recent experiments on deeply nested context-free long-distance dependencies, with similar results. URNs differ from LSTMs in that they avoid non-linear activation functions and instead apply matrix multiplication to word embeddings encoded as unitary matrices. Because unitary matrices preserve norms, URNs retain all information when processing an input string over arbitrary distances. This also causes them to satisfy strict compositionality. URNs constitute a significant advance in the search for explainable models in deep learning applied to NLP.
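The core mechanism is simple enough to sketch. Below is a minimal illustration, not the authors' implementation: the toy vocabulary, the dimension, and the random initialization are placeholders. It shows how word embeddings realized as unitary matrices compose by matrix multiplication, preserve the norm of the hidden state exactly, and make a string's representation the product of the representations of its parts.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n: int) -> np.ndarray:
    """Draw a Haar-random n x n unitary matrix via QR decomposition."""
    z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))  # fix column phases for Haar uniformity

dim = 8
# Hypothetical toy vocabulary: each word is embedded as a unitary matrix.
vocab = {w: random_unitary(dim) for w in ["the", "dog", "chases", "cats"]}

def encode(sentence, state):
    """Evolve the state by multiplying in each word's unitary embedding.

    There is no non-linear activation: the update is pure matrix
    multiplication, as in the URN.
    """
    for word in sentence:
        state = vocab[word] @ state
    return state

h0 = np.zeros(dim, dtype=complex)
h0[0] = 1.0
h = encode(["the", "dog", "chases", "cats"], h0)

# Unitary evolution preserves the norm exactly, so no information about
# earlier words is squashed away, however long the string is.
assert np.isclose(np.linalg.norm(h), 1.0)

# Strict compositionality: the representation of the whole string is the
# product of the representations of its parts.
m = vocab["cats"] @ vocab["chases"] @ vocab["dog"] @ vocab["the"]
assert np.allclose(m @ h0, h)
```

In the model itself the unitary embeddings are trained rather than drawn at random; the sketch is only meant to show the algebra of composition and why norm preservation entails lossless processing over arbitrary distances.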


