Unsupervised Learning of Syntactic Structure with Invertible Neural Projections

08/28/2018
by Junxian He, et al.

Unsupervised learning of syntactic structure is typically performed using generative models with discrete latent variables and multinomial parameters. In most cases, these models have not leveraged continuous word representations. In this work, we propose a novel generative model that jointly learns discrete syntactic structure and continuous word representations in an unsupervised fashion by cascading an invertible neural network with a structured generative prior. We show that the invertibility condition allows for efficient exact inference and marginal likelihood computation in our model so long as the prior is well-behaved. In experiments we instantiate our approach with both Markov and tree-structured priors, evaluating on two tasks: part-of-speech (POS) induction, and unsupervised dependency parsing without gold POS annotation. On the Penn Treebank, our Markov-structured model surpasses state-of-the-art results on POS induction. Similarly, we find that our tree-structured model achieves state-of-the-art performance on unsupervised dependency parsing for the difficult training condition where neither gold POS annotation nor punctuation-based constraints are available.
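To make the invertibility argument concrete, the following is a minimal numpy sketch of exact marginal likelihood computation under a Markov-structured prior. It substitutes a single invertible linear map for the paper's multi-layer invertible neural network, and assumes unit-variance Gaussian emissions per latent tag; all dimensions, names, and parameter values are toy illustrations, not the authors' implementation. The key point it demonstrates is the change of variables: log p(x) decomposes into the structured prior's marginal over z = f^{-1}(x) (computed exactly with the HMM forward algorithm) plus a log-determinant term from the projection.

```python
import numpy as np

rng = np.random.default_rng(0)
d, K, T = 4, 3, 5  # embedding dim, number of latent tags, sentence length (toy)

# Invertible projection f(z) = W z: a hypothetical linear stand-in for the
# paper's invertible neural network. Its inverse and Jacobian are exact.
W = rng.normal(size=(d, d)) + 2 * np.eye(d)
W_inv = np.linalg.inv(W)
log_det_inv = np.linalg.slogdet(W_inv)[1]  # log|det J_{f^{-1}}|

# Markov-structured prior: an HMM over tags with Gaussian emissions in z-space.
log_pi = np.log(np.full(K, 1.0 / K))   # initial tag distribution
log_A = np.log(rng.dirichlet(np.ones(K), size=K))  # tag transition matrix
mu = rng.normal(size=(K, d))           # per-tag Gaussian emission means

def log_emit(z):
    # log N(z; mu_k, I) for every tag k -> shape (K,)
    diff = z[None, :] - mu
    return -0.5 * (d * np.log(2 * np.pi) + np.sum(diff ** 2, axis=1))

def logsumexp(v):
    # Numerically stable log-sum-exp for a 1-D array.
    m = v.max()
    return m + np.log(np.exp(v - m).sum())

def marginal_log_likelihood(X):
    """Exact log p(x_1..x_T): change of variables + HMM forward algorithm.
    X: (T, d) array of observed word embeddings."""
    Z = X @ W_inv.T  # z_t = f^{-1}(x_t), exact thanks to invertibility
    alpha = log_pi + log_emit(Z[0])
    for t in range(1, len(Z)):
        alpha = np.array([logsumexp(alpha + log_A[:, k]) for k in range(K)]) \
                + log_emit(Z[t])
    # One log-det Jacobian term per token from the invertible projection.
    return logsumexp(alpha) + len(Z) * log_det_inv

X = rng.normal(size=(T, d))
print(marginal_log_likelihood(X))
```

Because the forward recursion sums over all K^T tag sequences in O(TK^2) time, the marginal likelihood stays tractable; a tree-structured prior would swap the forward algorithm for the inside algorithm while the log-determinant term is unchanged.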


Related research:

- Word Representations, Tree Models and Syntactic Functions (08/31/2015): Word representations induced from models with discrete latent variables ...
- Neural Language Modeling by Jointly Learning Syntax and Lexicon (11/02/2017): We propose a neural language model capable of unsupervised syntactic str...
- A* CCG Parsing with a Supertag and Dependency Factored Model (04/23/2017): We propose a new A* CCG parsing model in which the probability of a tree...
- Improved Latent Tree Induction with Distant Supervision via Span Constraints (09/10/2021): For over thirty years, researchers have developed and analyzed methods f...
- Combining Generative and Discriminative Approaches to Unsupervised Dependency Parsing via Dual Decomposition (08/02/2017): Unsupervised dependency parsing aims to learn a dependency parser from u...
- Second-Order Unsupervised Neural Dependency Parsing (10/28/2020): Most of the unsupervised dependency parsers are based on first-order pro...
- Differentiable Perturb-and-Parse: Semi-Supervised Parsing with a Structured Variational Autoencoder (07/25/2018): Human annotation for syntactic parsing is expensive, and large resources...
