Second-Order Neural Dependency Parsing with Message Passing and End-to-End Training

10/10/2020
by Xinyu Wang, et al.

In this paper, we propose second-order graph-based neural dependency parsing using message passing and end-to-end neural networks. We empirically show that our approaches match the accuracy of very recent state-of-the-art second-order graph-based neural dependency parsers while being significantly faster in both training and testing. We also empirically show the advantage of second-order parsing over first-order parsing, and observe that the usefulness of the head-selection structured constraint vanishes when using BERT embeddings.
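As a rough illustration of how second-order scores can be folded into end-to-end inference, the sketch below runs a few mean-field-style updates that combine first-order arc scores with sibling (second-order) scores. This is a minimal, assumption-laden toy: the score tensors, their layout, and the update rule are illustrative choices, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis):
    """Numerically stable softmax along an axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mean_field_parse(s1, s2, iters=3):
    """Toy mean-field inference for second-order dependency parsing.

    s1: (n, n) first-order scores, s1[h, d] = score of arc h -> d.
    s2: (n, n, n) sibling scores, s2[h, d, d2] = score for h heading
        both d and d2. (Hypothetical layout for this sketch.)
    Returns (n, n) posterior head probabilities q[h, d], each column
    a distribution over candidate heads (head selection).
    """
    n = s1.shape[0]
    s2 = s2.copy()
    # a word is not its own sibling: zero out the d == d2 entries
    for d in range(n):
        s2[:, d, d] = 0.0
    # initialize arc posteriors from first-order scores alone
    q = softmax(s1, axis=0)
    for _ in range(iters):
        # message to arc (h, d): expected sibling score under current q,
        # msg[h, d] = sum_{d2} s2[h, d, d2] * q[h, d2]
        msg = np.einsum('hde,he->hd', s2, q)
        q = softmax(s1 + msg, axis=0)
    return q

# toy usage on random scores for a 4-word sentence
rng = np.random.default_rng(0)
q = mean_field_parse(rng.normal(size=(4, 4)), rng.normal(size=(4, 4, 4)))
```

Because each update is differentiable (softmax and einsum), unrolling a fixed number of iterations lets the whole inference procedure be trained end-to-end with backpropagation, which is the general idea behind message-passing layers in this line of work.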


Related research

06/19/2019  Second-Order Semantic Dependency Parsing with End-to-End Neural Networks
Semantic dependency parsing aims to identify semantic relationships betw...

10/28/2020  Second-Order Unsupervised Neural Dependency Parsing
Most of the unsupervised dependency parsers are based on first-order pro...

03/02/2017  Lock-Free Parallel Perceptron for Graph-based Dependency Parsing
Dependency parsing is an important NLP task. A popular approach for depe...

05/01/2017  Dependency Parsing with Dilated Iterated Graph CNNs
Dependency parses are an effective way to inject linguistic knowledge in...

06/02/2020  Enhanced Universal Dependency Parsing with Second-Order Inference and Mixture of Training Data
This paper presents the system used in our submission to the IWPT 2020 S...

02/25/2015  Web-scale Surface and Syntactic n-gram Features for Dependency Parsing
We develop novel first- and second-order features for dependency parsing...

04/07/2022  Modeling Label Correlations for Second-Order Semantic Dependency Parsing with Mean-Field Inference
Second-order semantic parsing with end-to-end mean-field inference has b...
