Unsupervised Recurrent Neural Network Grammars

04/07/2019
by Yoon Kim, et al.

Recurrent neural network grammars (RNNG) are generative models of language which jointly model syntax and surface structure by incrementally generating a syntax tree and sentence in a top-down, left-to-right order. Supervised RNNGs achieve strong language modeling and parsing performance, but require an annotated corpus of parse trees. In this work, we experiment with unsupervised learning of RNNGs. Since directly marginalizing over the space of latent trees is intractable, we instead apply amortized variational inference. To maximize the evidence lower bound, we develop an inference network parameterized as a neural CRF constituency parser. On language modeling, unsupervised RNNGs perform as well as their supervised counterparts on benchmarks in English and Chinese. On constituency grammar induction, they are competitive with recent neural language models that induce tree structures from words through attention mechanisms.
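The objective sketched in the abstract can be illustrated with a toy numeric example. The evidence lower bound (ELBO) used to train an unsupervised RNNG lower-bounds the intractable log marginal likelihood log p(x) = log Σ_z p(x, z), where z ranges over latent parse trees and q(z | x) is the approximate posterior produced by the inference network. The tree identifiers and probabilities below are made-up placeholders, not values from the paper; the toy tree space is small enough that the exact marginal is computable for comparison.

```python
import math

# Hypothetical joint log-probabilities log p(x, z) for three candidate
# parse trees of a single sentence (placeholder numbers).
log_joint = {"t1": -4.0, "t2": -5.0, "t3": -7.0}

# A hypothetical approximate posterior q(z | x), as an inference network
# (in the paper, a neural CRF constituency parser) might produce.
q = {"t1": 0.6, "t2": 0.3, "t3": 0.1}

# Exact marginal log-likelihood -- tractable only because this toy space
# has three trees; for real sentences the tree space is exponential.
log_px = math.log(sum(math.exp(lj) for lj in log_joint.values()))

# ELBO = E_q[log p(x, z) - log q(z | x)]
elbo = sum(q[z] * (log_joint[z] - math.log(q[z])) for z in q)

print(f"log p(x) = {log_px:.4f}")
print(f"ELBO     = {elbo:.4f}")  # always <= log p(x), with equality iff q is the true posterior
```

Maximizing the ELBO with respect to both the generative model and q tightens this bound while improving the likelihood, which is what makes the latent-tree marginalization tractable in practice.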


Related research:

- Recurrent Neural Network Grammars (02/25/2016)
- Grammar Induction with Neural Language Models: An Unusual Replication (08/29/2018)
- A Compositional Approach to Language Modeling (04/01/2016)
- A Generative Parser with a Discriminative Recognition Algorithm (08/01/2017)
- What Do Recurrent Neural Network Grammars Learn About Syntax? (11/17/2016)
- PaLM: A Hybrid Parser and Language Model (09/04/2019)
- Unsupervised Neural Word Segmentation for Chinese via Segmental Language Modeling (10/07/2018)
