Unsupervised Dependency Parsing: Let's Use Supervised Parsers

04/18/2015
by Phong Le, et al.

We present a self-training approach to unsupervised dependency parsing that reuses existing supervised and unsupervised parsing algorithms. Our approach, called 'iterated reranking' (IR), starts with dependency trees generated by an unsupervised parser, and iteratively improves these trees using the richer probability models used in supervised parsing that are in turn trained on these trees. Our system achieves 1.8% higher accuracy than the state-of-the-art parser of Spitkovsky et al. (2013) on the WSJ corpus.
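
Conceptually, IR is a self-training loop: bootstrap trees with an unsupervised parser, train a richer supervised model on them, rerank candidate parses with that model, and repeat. The following is a minimal Python sketch of that loop under stated assumptions; the helpers unsupervised_parser, train_supervised, kbest_parse, and model.score are hypothetical placeholders, not names from the paper:

    def iterated_reranking(sentences, unsupervised_parser,
                           train_supervised, kbest_parse,
                           n_iters=5, k=10):
        """Sketch of iterated reranking: self-train a supervised
        probability model on trees from an unsupervised parser."""
        # Step 0: bootstrap with trees from the unsupervised parser.
        trees = [unsupervised_parser(sent) for sent in sentences]
        for _ in range(n_iters):
            # Train the richer supervised model on the current trees.
            model = train_supervised(sentences, trees)
            # Rerank: for each sentence, keep the candidate tree the
            # new model scores highest among its k-best parses.
            trees = [max(kbest_parse(model, sent, k),
                         key=lambda tree: model.score(sent, tree))
                     for sent in sentences]
        return trees

Each iteration trains on the trees produced by the previous one, so the supervised model and the training trees can improve each other over successive rounds.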

Related research

10/04/2020 · A Survey of Unsupervised Dependency Parsing
Syntactic dependency parsing is an important task in natural language pr...

01/12/2022 · Biaffine Discourse Dependency Parsing
We provide a study of using the biaffine model for neural discourse depe...

02/27/2015 · Parsing as Reduction
We reduce phrase-representation parsing to dependency parsing. Our reduc...

02/13/2019 · Leveraging Newswire Treebanks for Parsing Conversational Data with Argument Scrambling
We investigate the problem of parsing conversational data of morphologic...

10/19/2020 · Heads-up! Unsupervised Constituency Parsing via Self-Attention Heads
Transformer-based pre-trained language models (PLMs) have dramatically i...

10/06/2020 · On the Role of Supervision in Unsupervised Constituency Parsing
We analyze several recent unsupervised constituency parsing models, whic...

03/24/2022 · Probing for Labeled Dependency Trees
Probing has become an important tool for analyzing representations in Na...
