Improving a Strong Neural Parser with Conjunction-Specific Features

02/22/2017
by Jessica Ficler, et al.

While dependency parsers reach very high overall accuracy, some dependency relations are much harder than others. In particular, dependency parsers perform poorly on coordination constructions (i.e., correctly attaching the "conj" relation). We extend a state-of-the-art dependency parser with conjunction-specific features, focusing on the similarity between the conjuncts' head words. Training the extended parser yields an improvement in "conj" attachment as well as in overall dependency parsing accuracy on the Stanford dependency conversion of the Penn Treebank.
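To illustrate the kind of signal such features capture, here is a minimal sketch of a conjunction-specific feature function based on the similarity of the conjuncts' head words. This is not the authors' exact feature set or parser integration; the embedding lookup, feature names, and dimensionality are hypothetical, chosen only to show how head-word similarity could be exposed to an arc-scoring model.

import numpy as np

def cosine_similarity(u, v):
    # Cosine similarity between two embedding vectors (0 if either is all zeros).
    nu, nv = np.linalg.norm(u), np.linalg.norm(v)
    if nu == 0.0 or nv == 0.0:
        return 0.0
    return float(np.dot(u, v) / (nu * nv))

def conjunction_features(head_a, head_b, embeddings, dim=100):
    # Toy conjunction-specific features for a candidate "conj" attachment.
    # head_a, head_b: head words of the two candidate conjuncts.
    # embeddings:     dict mapping word -> np.ndarray (hypothetical lookup table).
    # Returns a small feature dict that could be concatenated to whatever
    # representation the parser already uses when scoring the candidate arc.
    zero = np.zeros(dim)
    va = embeddings.get(head_a.lower(), zero)
    vb = embeddings.get(head_b.lower(), zero)
    return {
        "conj_head_cosine": cosine_similarity(va, vb),        # semantic similarity of the heads
        "conj_heads_identical": float(head_a.lower() == head_b.lower()),
    }

# Example: for "apples and oranges", score the candidate conj arc apples -> oranges.
emb = {"apples": np.random.rand(100), "oranges": np.random.rand(100)}
print(conjunction_features("apples", "oranges", emb))

The intuition, as stated in the abstract, is that true conjuncts tend to have semantically similar heads ("apples" and "oranges") while incorrect attachments often do not, so a similarity feature gives the parser a direct handle on that regularity.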

Related research

Fast semantic parsing with well-typedness guarantees (09/15/2020)
AM dependency parsing is a linguistically principled method for neural s...

Hierarchical Human Parsing with Typed Part-Relation Reasoning (03/10/2020)
Human parsing is for pixel-wise human semantic understanding. As human b...

Yara Parser: A Fast and Accurate Dependency Parser (03/23/2015)
Dependency parsers are among the most crucial tools in natural language ...

Syntactic Nuclei in Dependency Parsing – A Multilingual Exploration (01/28/2021)
Standard models for syntactic dependency parsing take words to be the el...

Reduction of Parameter Redundancy in Biaffine Classifiers with Symmetric and Circulant Weight Matrices (10/18/2018)
Currently, the biaffine classifier has been attracting attention as a me...

Approximation-Aware Dependency Parsing by Belief Propagation (08/10/2015)
We show how to train the fast dependency parser of Smith and Eisner (200...

Hexatagging: Projective Dependency Parsing as Tagging (06/08/2023)
We introduce a novel dependency parser, the hexatagger, that constructs ...
