Structured Training for Neural Network Transition-Based Parsing

06/19/2015
by David Weiss, et al.

We present structured perceptron training for neural network transition-based dependency parsing. We learn the neural network representation using a gold corpus augmented by a large number of automatically parsed sentences. Given this fixed network representation, we learn a final layer using the structured perceptron with beam-search decoding. On the Penn Treebank, our parser reaches 94.26% unlabeled attachment accuracy, which is the best accuracy on Stanford Dependencies to date. We also provide in-depth ablative analysis to determine which aspects of our model provide the largest gains in accuracy.
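
The recipe in the abstract (freeze the learned representation, then train only a final scoring layer with the structured perceptron, decoding with beam search) lends itself to a compact sketch. The following is a minimal illustration under stated assumptions, not the authors' implementation: the toy transition system (phi, legal_actions, apply_action, is_final, GOLD) is a hypothetical stand-in for an arc-standard parser over real sentences, phi plays the role of the frozen hidden activations of the pretrained network, and early updates in the style of Collins and Roark (2004) are one common way to handle the gold sequence falling off the beam.

```python
import numpy as np

# Hypothetical toy transition system standing in for arc-standard parsing.
# In the paper's setting, states would be parser configurations and phi()
# the frozen hidden activations of the pretrained network.
N_ACTIONS, FEAT_DIM = 2, 4
GOLD = [1, 0, 1]          # oracle transition sequence for one toy "sentence"

def phi(state):
    """Fixed (non-learned) representation of a state: one-hot of the step index."""
    v = np.zeros(FEAT_DIM)
    v[len(state) % FEAT_DIM] = 1.0
    return v

def legal_actions(state):
    return range(N_ACTIONS)

def apply_action(state, a):
    return state + (a,)

def is_final(state):
    return len(state) == len(GOLD)

def _update(weights, gold_steps, pred_steps):
    """Perceptron update: w <- w + Phi(gold prefix) - Phi(predicted prefix)."""
    for a, feats in gold_steps:
        weights[a] += feats
    for a, feats in pred_steps:
        weights[a] -= feats

def train_step(weights, init_state, gold_actions, beam_size):
    """One structured-perceptron step for one sentence, with an early update
    (Collins & Roark, 2004) when the gold prefix falls off the beam.

    Beam items are (score, state, steps); steps is the list of
    (action, features-at-that-step) pairs taken to reach the state."""
    beam = [(0.0, init_state, [])]
    gold_state, gold_steps = init_state, []

    for gold_a in gold_actions:
        # Expand every beam item with every legal action; keep the top-k.
        candidates = []
        for score, state, steps in beam:
            feats = phi(state)
            for a in legal_actions(state):
                candidates.append((score + float(weights[a] @ feats),
                                   apply_action(state, a),
                                   steps + [(a, feats)]))
        candidates.sort(key=lambda c: -c[0])
        beam = candidates[:beam_size]

        # Advance the gold derivation in lockstep with the beam.
        gold_steps = gold_steps + [(gold_a, phi(gold_state))]
        gold_state = apply_action(gold_state, gold_a)

        gold_prefix = [a for a, _ in gold_steps]
        if not any([a for a, _ in steps] == gold_prefix for _, _, steps in beam):
            _update(weights, gold_steps, beam[0][2])   # early update
            return

    best_steps = beam[0][2]
    if [a for a, _ in best_steps] != list(gold_actions):
        _update(weights, gold_steps, best_steps)       # full update at the end

def decode(weights, init_state, beam_size):
    """Beam-search decoding with the learned final layer."""
    beam = [(0.0, init_state, [])]
    while not all(is_final(s) for _, s, _ in beam):
        candidates = []
        for score, state, actions in beam:
            if is_final(state):
                candidates.append((score, state, actions))
                continue
            feats = phi(state)
            for a in legal_actions(state):
                candidates.append((score + float(weights[a] @ feats),
                                   apply_action(state, a), actions + [a]))
        candidates.sort(key=lambda c: -c[0])
        beam = candidates[:beam_size]
    return beam[0][2]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    weights = rng.normal(scale=0.01, size=(N_ACTIONS, FEAT_DIM))
    for _ in range(5):
        train_step(weights, (), GOLD, beam_size=2)
    print(decode(weights, (), beam_size=2))            # expected: [1, 0, 1]
```

Note that the only learned parameters are the per-action weight rows over the fixed features, mirroring the fixed-representation, final-layer-only training described in the abstract; the beam size here is arbitrary for the toy example.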

Related research

01/22/2020  Transition-Based Dependency Parsing using Perceptron Learner
Syntactic parsing using dependency structures has become a standard tech...

03/02/2017  Lock-Free Parallel Perceptron for Graph-based Dependency Parsing
Dependency parsing is an important NLP task. A popular approach for depe...

11/28/2017  Hybrid Oracle: Making Use of Ambiguity in Transition-based Chinese Dependency Parsing
In the training of transition-based dependency parsers, an oracle is use...

04/22/2016  Dependency Parsing with LSTMs: An Empirical Evaluation
We propose a transition-based dependency parser using Recurrent Neural N...

05/09/2023  Structured Sentiment Analysis as Transition-based Dependency Parsing
Structured sentiment analysis (SSA) aims to automatically extract people...

03/19/2016  A Fast Unified Model for Parsing and Sentence Understanding
Tree-structured neural networks exploit valuable syntactic parse informa...

07/30/2021  Extracting Grammars from a Neural Network Parser for Anomaly Detection in Unknown Formats
Reinforcement learning has recently shown promise as a technique for tra...
