What do RNN Language Models Learn about Filler-Gap Dependencies?

08/31/2018
by Ethan Wilcox, et al.

RNN language models have achieved state-of-the-art perplexity results and have proven useful in a suite of NLP tasks, but it is as yet unclear what syntactic generalizations they learn. Here we investigate whether state-of-the-art RNN language models represent long-distance filler-gap dependencies and constraints on them. Examining RNN behavior on experimentally controlled sentences designed to expose filler-gap dependencies, we show that RNNs can represent the relationship in multiple syntactic positions and over large spans of text. Furthermore, we show that RNNs learn a subset of the known restrictions on filler-gap dependencies, the so-called island constraints: RNNs show evidence for wh-islands, adjunct islands, and complex NP islands. These studies demonstrate that state-of-the-art RNN models are able to learn and generalize about empty syntactic positions.
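To make the experimental paradigm concrete, the sketch below computes a wh-licensing interaction from sentence surprisals in a 2x2 design crossing the presence of a wh-filler with the presence of a gap. It is a minimal illustration, not the paper's code: it assumes the Hugging Face transformers library with GPT-2 as a stand-in for the RNN language models actually evaluated, uses toy example items, and sums surprisal over the whole sentence rather than measuring it only in the post-gap region as the paper does.

```python
# Minimal sketch of the 2x2 filler-gap surprisal paradigm.
# Assumptions (not from the paper): GPT-2 via Hugging Face `transformers`
# stands in for the RNN LMs studied; the sentences are toy items.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def total_surprisal(sentence: str) -> float:
    """Sum of per-token surprisals (in bits) for a single sentence."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    # Token t is predicted from tokens < t: shift logits left, targets right.
    log_probs = torch.log_softmax(logits[:, :-1, :], dim=-1)
    target_log_probs = log_probs.gather(-1, ids[:, 1:].unsqueeze(-1)).squeeze(-1)
    return float(-(target_log_probs / torch.log(torch.tensor(2.0))).sum())

# Four conditions crossing a wh-filler with a gap (illustrative items only).
conditions = {
    ("+filler", "+gap"): "I know what the lion devoured at sunrise.",
    ("+filler", "-gap"): "I know what the lion devoured a gazelle at sunrise.",
    ("-filler", "+gap"): "I know that the lion devoured at sunrise.",
    ("-filler", "-gap"): "I know that the lion devoured a gazelle at sunrise.",
}
s = {cond: total_surprisal(sent) for cond, sent in conditions.items()}

# A positive interaction indicates the model expects a gap when a filler is
# present and an overt object when no filler is present, i.e. it has learned
# the filler-gap dependency.
interaction = (s[("-filler", "+gap")] - s[("+filler", "+gap")]) \
            - (s[("-filler", "-gap")] - s[("+filler", "-gap")])
print(f"wh-licensing interaction (bits): {interaction:.2f}")
```

The same comparison can be embedded inside island environments (e.g. wh-islands or adjunct clauses); if the model has learned the corresponding constraint, the licensing interaction should shrink toward zero there.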

Related research:

- 05/24/2019: What Syntactic Structures block Dependencies in RNN Language Models?
- 11/05/2018: Do RNNs learn human-like abstract word order preferences?
- 09/05/2018: RNNs as psycholinguistic subjects: Syntactic state and grammatical dependency
- 06/12/2017: Exploring the Syntactic Abilities of RNNs with Multi-task Learning
- 09/10/2019: Representation of Constituents in Neural Language Models: Coordination Phrase as a Case Study
- 11/05/2016: TopicRNN: A Recurrent Neural Network with Long-Range Semantic Dependency
- 10/21/2022: Do Vision-and-Language Transformers Learn Grounded Predicate-Noun Dependencies?