Is It Really Useful to Jointly Parse Constituency and Dependency Trees? A Revisit

09/21/2023
by   Yanggang Gu, et al.

This work revisits the topic of jointly parsing constituency and dependency trees, i.e., simultaneously producing compatible constituency and dependency trees for input sentences, which is attractive because the two types of trees are complementary in representing syntax. Compared with previous works, we make progress in four aspects: (1) adopting a much more efficient decoding algorithm; (2) exploring joint modeling at the training phase, instead of only at the inference phase; (3) proposing high-order scoring components for constituent-dependency interaction; (4) gaining more insights via in-depth experiments and analysis.
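The notion of "compatible" trees can be illustrated by head-child selection (lexicalization): once each constituent designates one child as its head, the constituency tree fully determines a dependency tree. The sketch below is illustrative only; the tree encoding, head rules, and function names are assumptions, not the paper's actual algorithm.

```python
# Minimal sketch: derive a dependency tree from a constituency tree via
# head-child selection. A tree is (label, children), where a leaf's
# "children" field is just the word index in the sentence.
def to_dependencies(tree, head_rules, deps):
    """Recursively attach each non-head child's head word to the
    head child's head word; return this constituent's head word index."""
    label, children = tree
    if isinstance(children, int):          # leaf: word index
        return children
    heads = [to_dependencies(c, head_rules, deps) for c in children]
    h = heads[head_rules.get(label, 0)]    # head child chosen per label
    for child_head in heads:
        if child_head != h:
            deps.append((child_head, h))   # (dependent, head) word indices
    return h

# "She reads books": (S (NP She) (VP (V reads) (NP books)))
sent = ["She", "reads", "books"]
tree = ("S", [("NP", 0), ("VP", [("V", 1), ("NP", 2)])])
rules = {"S": 1, "VP": 0, "NP": 0}         # head-child index per label

deps = []
root = to_dependencies(tree, rules, deps)
print(root)           # 1  ("reads" heads the sentence)
print(sorted(deps))   # [(0, 1), (2, 1)]: "She" and "books" depend on "reads"
```

A joint parser scores both structures at once, so the dependency arcs it outputs must be exactly those derivable from its constituency tree in this sense.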


Related research

02/14/2015 · Probabilistic Models for High-Order Projective Dependency Parsing
This paper presents generalized probabilistic models for high-order proj...

02/27/2019 · Viable Dependency Parsing as Sequence Labeling
We recast dependency parsing as a sequence labeling problem, exploring s...

10/06/2020 · Please Mind the Root: Decoding Arborescences for Dependency Parsing
The connection between dependency trees and spanning trees is exploited ...

03/04/2016 · Getting More Out Of Syntax with PropS
Semantic NLP applications often rely on dependency trees to recognize ma...

05/25/2022 · Unbiased and Efficient Sampling of Dependency Trees
Distributions over spanning trees are the most common way of computation...

10/16/2021 · Back to Reality: Leveraging Pattern-driven Modeling to Enable Affordable Sentiment Dependency Learning
Aspect-based Sentiment Classification (ABSC) is a challenging sub-task o...

11/30/2021 · Minor changes make a difference: a case study on the consistency of UD-based dependency parsers
Many downstream applications are using dependency trees, and are thus re...
