Finding Syntactic Representations in Neural Stacks

06/04/2019
by William Merrill, et al.

Neural network architectures have been augmented with differentiable stacks in order to introduce a bias toward learning hierarchy-sensitive regularities. It has, however, proven difficult to assess the degree to which such a bias is effective, as the operation of the differentiable stack is not always interpretable. In this paper, we attempt to detect the presence of latent representations of hierarchical structure through an exploration of the unsupervised learning of constituency structure. Using a technique due to Shen et al. (2018a,b), we extract syntactic trees from the pushing behavior of stack RNNs trained on language modeling and classification objectives. We find that our models produce parses that reflect natural language syntactic constituencies, demonstrating that stack RNNs do indeed infer linguistically relevant hierarchical structure.
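The abstract refers to the tree-induction technique of Shen et al. (2018a,b), which the paper applies to the pushing behavior of stack RNNs. As a rough illustration of that family of methods, the sketch below recursively splits a sentence at the position with the largest score to produce an unlabeled binary constituency tree. It is a minimal sketch only: the function names, the use of one score per gap between adjacent tokens, and the greedy argmax split are assumptions of this illustration, not the paper's exact procedure.

```python
def distances_to_tree(tokens, dists):
    """Greedy top-down tree induction in the style of Shen et al. (2018).

    `tokens` is a list of words; `dists` holds one score per gap between
    adjacent tokens (len(dists) == len(tokens) - 1). The sentence is split
    at the largest score, and each half is parsed recursively, yielding an
    unlabeled binary constituency tree as nested tuples.
    """
    if len(tokens) == 1:
        return tokens[0]
    # Place the constituent boundary at the gap with the highest score.
    split = max(range(len(dists)), key=dists.__getitem__)
    left = distances_to_tree(tokens[: split + 1], dists[:split])
    right = distances_to_tree(tokens[split + 1:], dists[split + 1:])
    return (left, right)


if __name__ == "__main__":
    tokens = ["the", "cat", "sat", "on", "the", "mat"]
    # Hypothetical per-gap scores; in the paper's setting these would be
    # derived from a trained stack RNN's pushing strengths.
    dists = [0.1, 0.9, 0.3, 0.2, 0.05]
    print(distances_to_tree(tokens, dists))
    # (('the', 'cat'), ('sat', ('on', ('the', 'mat'))))
```

In the paper's setting, the scores would be read off a trained stack RNN's push behavior rather than supplied by hand, and the induced trees are then compared against natural language constituency structure.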


research
09/05/2021

Learning Hierarchical Structures with Differentiable Nondeterministic Stacks

Learning hierarchical structures in sequential data – from simple algori...
research
09/08/2018

Context-Free Transductions with Neural Stacks

This paper analyzes the behavior of stack-augmented recurrent neural net...
research
04/25/2023

Nondeterministic Stacks in Neural Networks

Human language is full of compositional syntactic structures, and althou...
research
10/09/2020

Learning Context-Free Languages with Nondeterministic Stack RNNs

We present a differentiable stack data structure that simultaneously and...
research
09/20/2019

A Critical Analysis of Biased Parsers in Unsupervised Parsing

A series of recent papers has used a parsing algorithm due to Shen et al...
research
06/05/2020

Provably Stable Interpretable Encodings of Context Free Grammars in RNNs with a Differentiable Stack

Given a collection of strings belonging to a context free grammar (CFG) ...
