Learning Hierarchical Structures with Differentiable Nondeterministic Stacks

09/05/2021
by Brian DuSell, et al.

Learning hierarchical structures in sequential data – from simple algorithmic patterns to natural language – in a reliable, generalizable way remains a challenging problem for neural language models. Past work has shown that recurrent neural networks (RNNs) struggle to generalize on held-out algorithmic or syntactic patterns without supervision or some inductive bias. To remedy this, many papers have explored augmenting RNNs with various differentiable stacks, by analogy with finite automata and pushdown automata. In this paper, we present a stack RNN model based on the recently proposed Nondeterministic Stack RNN (NS-RNN) that achieves lower cross-entropy than all previous stack RNNs on five context-free language modeling tasks (within 0.05 nats of the information-theoretic lower bound), including a task in which the NS-RNN previously failed to outperform a deterministic stack RNN baseline. Our model assigns arbitrary positive weights instead of probabilities to stack actions, and we provide an analysis of why this improves training. We also propose a restricted version of the NS-RNN that makes it practical to use for language modeling on natural language and present results on the Penn Treebank corpus.
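Since the abstract's key architectural change is assigning arbitrary positive weights rather than probabilities to stack actions, a minimal sketch may help locate where that change sits. The code below is an illustration only: the superposition-style stack is a simplification in the spirit of the deterministic stack RNN baselines, not the paper's NS-RNN (which simulates a weighted nondeterministic pushdown automaton), and all names here are hypothetical. It contrasts softmax-normalized action probabilities with unnormalized positive weights.

```python
import torch

def differentiable_stack_step(stack, push_val, action_logits, normalize=True):
    """One step of a superposition-style differentiable stack (sketch).

    stack:         (depth, dim) tensor of stack cells, top at index 0.
    push_val:      (dim,) vector that a push places on top.
    action_logits: (3,) scores for [push, pop, no-op].
    """
    if normalize:
        # Earlier stack RNNs: actions compete through a softmax.
        w = torch.softmax(action_logits, dim=0)
    else:
        # The relaxation the abstract describes: arbitrary positive weights.
        w = torch.exp(action_logits)
    push, pop, noop = w[0], w[1], w[2]

    # Push: shift every cell down one slot and write push_val on top.
    pushed = torch.cat([push_val.unsqueeze(0), stack[:-1]])
    # Pop: shift every cell up one slot, padding the bottom with zeros.
    popped = torch.cat([stack[1:], torch.zeros_like(stack[:1])])
    # The new stack is a weighted superposition of the three outcomes.
    return push * pushed + pop * popped + noop * stack

if __name__ == "__main__":
    stack = torch.zeros(8, 4)  # empty stack: depth 8, cell dimension 4
    logits = torch.tensor([2.0, -1.0, 0.0])
    stack = differentiable_stack_step(stack, torch.ones(4), logits,
                                      normalize=False)
    print(stack[0])  # top cell is dominated by the pushed vector
```

The sketch only shows mechanically where normalization is dropped; the paper's analysis of why this change improves training is in the full text.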

Related research:

- Learning Context-Free Languages with Nondeterministic Stack RNNs (10/09/2020): We present a differentiable stack data structure that simultaneously and...
- Finding Syntactic Representations in Neural Stacks (06/04/2019): Neural network architectures have been augmented with differentiable stacks...
- SoPa: Bridging CNNs, RNNs, and Weighted Finite-State Machines (05/15/2018): Recurrent and convolutional neural networks comprise two distinct families...
- Nondeterministic Stacks in Neural Networks (04/25/2023): Human language is full of compositional syntactic structures, and although...
- Provably Stable Interpretable Encodings of Context Free Grammars in RNNs with a Differentiable Stack (06/05/2020): Given a collection of strings belonging to a context free grammar (CFG)...
- Modeling Local Dependence in Natural Language with Multi-channel Recurrent Neural Networks (11/13/2018): Recurrent Neural Networks (RNNs) have been widely used in processing natural...
- RNNs can generate bounded hierarchical languages with optimal memory (10/15/2020): Recurrent neural networks empirically generate natural language with high...
