Nondeterministic Stacks in Neural Networks

04/25/2023
by Brian DuSell, et al.

Human language is full of compositional syntactic structures, and although neural networks have contributed to groundbreaking improvements in computer systems that process language, widely used neural network architectures still exhibit limitations in their ability to process syntax. To address this issue, prior work has proposed adding stack data structures to neural networks, drawing inspiration from theoretical connections between syntax and stacks. However, these methods employ deterministic stacks that are designed to track one parse at a time, whereas syntactic ambiguity, which requires a nondeterministic stack to parse, is extremely common in language. In this dissertation, we remedy this discrepancy by proposing a method of incorporating nondeterministic stacks into neural networks. We develop a differentiable data structure that efficiently simulates a nondeterministic pushdown automaton, representing an exponential number of computations with a dynamic programming algorithm. We incorporate this module into two predominant architectures: recurrent neural networks (RNNs) and transformers. We show that this raises their formal recognition power to arbitrary context-free languages, and also aids training, even on deterministic context-free languages. Empirically, neural networks with nondeterministic stacks learn context-free languages much more effectively than prior stack-augmented models, including a language with theoretically maximal parsing difficulty. We also show that an RNN augmented with a nondeterministic stack is capable of surprisingly powerful behavior, such as learning cross-serial dependencies, a well-known non-context-free pattern. We demonstrate improvements on natural language modeling and provide analysis on a syntactic generalization benchmark. This work represents an important step toward building systems that learn to use syntax in a more human-like fashion.
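To make the dynamic-programming intuition concrete, the toy sketch below (hypothetical PyTorch code, not the dissertation's actual module) maintains a probability distribution over a small set of pushdown-automaton configurations and updates it with softmax-normalized transition weights emitted by a neural controller. A single matrix product marginalizes over all nondeterministic transitions at once, which is how exponentially many runs can be summarized without enumeration; the actual method described in the abstract goes much further and simulates full stack contents, whereas this sketch only tracks (state, top-of-stack) pairs. All class and parameter names here are illustrative assumptions.

import torch
import torch.nn as nn

class ToyNondeterministicStackCell(nn.Module):
    """Toy illustration only: tracks a distribution over (state, top-of-stack)
    configurations of a small pushdown automaton and updates it with
    differentiable transition weights produced by the controller."""
    def __init__(self, hidden_size, num_states=2, stack_symbols=3):
        super().__init__()
        self.n = num_states * stack_symbols
        # The controller's hidden state is mapped to unnormalized scores for
        # every (configuration -> next configuration) transition.
        self.transition = nn.Linear(hidden_size, self.n * self.n)

    def forward(self, hidden, config_dist):
        # hidden: (batch, hidden_size); config_dist: (batch, n), sums to 1.
        scores = self.transition(hidden).view(-1, self.n, self.n)
        trans = torch.softmax(scores, dim=2)  # row-stochastic transition weights
        # One dynamic-programming step: a single matrix product sums over all
        # nondeterministic choices instead of enumerating runs one by one.
        return torch.bmm(config_dist.unsqueeze(1), trans).squeeze(1)

# Example update with a uniform starting distribution over 6 configurations.
cell = ToyNondeterministicStackCell(hidden_size=16)
hidden = torch.randn(4, 16)
dist = torch.full((4, 6), 1.0 / 6)
dist = cell(hidden, dist)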


