Syntactic Dependency Representations in Neural Relation Classification

05/28/2018
by Farhad Nooralahzadeh, et al.
We investigate the use of different syntactic dependency representations in a neural relation classification task and compare the CoNLL, Stanford Basic and Universal Dependencies schemes. We further compare with a syntax-agnostic approach and perform an error analysis in order to gain a better understanding of the results.
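A common way to inject a syntactic dependency representation into a neural relation classifier is to feed the model the shortest dependency path between the two entity mentions. A minimal stdlib-only sketch of that path extraction, where the sentence, head indices, and relation labels are hand-annotated illustrations (not output from any particular parser or scheme):

```python
# Shortest dependency path between two tokens, given a head-indexed tree.
# Hand-annotated UD-style parse of "Smith works for Acme" (illustrative only).
tokens = ["Smith", "works", "for", "Acme"]
heads  = [1, -1, 3, 1]          # index of each token's head; -1 marks the root
labels = ["nsubj", "root", "case", "obl"]

def ancestors(i):
    """Return the chain of indices from token i up to the root (inclusive)."""
    chain = [i]
    while heads[i] != -1:
        i = heads[i]
        chain.append(i)
    return chain

def shortest_dependency_path(a, b):
    """Tokens on the tree path from a to b via their lowest common ancestor."""
    up_a, up_b = ancestors(a), ancestors(b)
    common = next(i for i in up_a if i in up_b)   # lowest common ancestor
    path = up_a[:up_a.index(common) + 1] + list(reversed(up_b[:up_b.index(common)]))
    return [tokens[i] for i in path]

print(shortest_dependency_path(0, 3))  # path between "Smith" and "Acme"
```

Because CoNLL, Stanford Basic, and Universal Dependencies attach function words differently (e.g. prepositions as heads vs. dependents), the same sentence yields different head arrays and therefore different paths, which is exactly the variation the paper evaluates.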
