What Do Recurrent Neural Network Grammars Learn About Syntax?

11/17/2016
by Adhiguna Kuncoro et al.

Recurrent neural network grammars (RNNGs) are a recently proposed family of probabilistic generative models for natural language that achieve state-of-the-art language modeling and parsing performance. We investigate what information they learn, from a linguistic perspective, through various ablations to the model and the data, and by augmenting the model with an attention mechanism (GA-RNNG) to enable closer inspection. We find that explicit modeling of composition is crucial for achieving the best performance. Through the attention mechanism, we find that headedness plays a central role in phrasal representation, with the model's latent attention largely agreeing with predictions made by hand-crafted head rules, albeit with some important differences. By training grammars without nonterminal labels, we find that phrasal representations depend minimally on nonterminals, providing support for the endocentricity hypothesis.
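
The attention-based composition described in the abstract can be pictured with a minimal sketch. The Python snippet below is not the authors' implementation; the parameter names, shapes, and exact gating form are illustrative assumptions. It composes the child vectors of a phrase into a single representation using attention weights computed against the nonterminal embedding, then gates the result:

import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def ga_compose(children, nonterminal, W, V, G):
    """Compose k child vectors (each d-dim) into one phrase vector.

    Attention scores come from each child's affinity with the nonterminal
    embedding; the weighted child summary is then mixed with the
    nonterminal via an elementwise sigmoid gate. Parameter shapes (W, V,
    G all d-by-d) are assumptions for this sketch.
    """
    C = np.stack(children)            # (k, d) matrix of child vectors
    scores = C @ W @ nonterminal      # (k,) child-nonterminal affinities
    alpha = softmax(scores)           # attention weights over children
    m = alpha @ C                     # (d,) attention-weighted child summary
    gate = 1.0 / (1.0 + np.exp(-(V @ m + G @ nonterminal)))  # sigmoid gate
    return gate * m + (1.0 - gate) * nonterminal  # gated combination

children = [rng.normal(size=d) for _ in range(3)]  # e.g. vectors for a DT, JJ, NN under an NP
nt = rng.normal(size=d)                            # embedding of the NP nonterminal
W, V, G = (rng.normal(size=(d, d)) for _ in range(3))
phrase = ga_compose(children, nt, W, V, G)
print(phrase.shape)                                # -> (8,)

It is attention weights of this kind that the paper's analysis compares against predictions from hand-crafted head rules.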


Related research

Recurrent Neural Network Grammars (02/25/2016)
We introduce recurrent neural network grammars, probabilistic models of ...

Composition, Attention, or Both? (10/24/2022)
In this paper, we propose a novel architecture called Composition Attent...

Unsupervised Recurrent Neural Network Grammars (04/07/2019)
Recurrent neural network grammars (RNNG) are generative models of langua...

Repulsive Attention: Rethinking Multi-head Attention as Bayesian Inference (09/20/2020)
The neural attention mechanism plays an important role in many natural l...

What's Going On in Neural Constituency Parsers? An Analysis (04/20/2018)
A number of differences have emerged between modern and classic approach...

Let's do it "again": A First Computational Approach to Detecting Adverbial Presupposition Triggers (06/11/2018)
We introduce the task of predicting adverbial presupposition triggers su...

Uncovering the Origins of Instability in Dynamical Systems: How Attention Mechanism Can Help? (12/19/2022)
The behavior of the network and its stability are governed by both dynam...
