Multilingual Irony Detection with Dependency Syntax and Neural Models

This paper presents an in-depth investigation of the effectiveness of dependency-based syntactic features for irony detection from a multilingual perspective (English, Spanish, French, and Italian). It focuses on the contribution of syntactic knowledge, exploiting linguistic resources in which syntax is annotated according to the Universal Dependencies scheme. Three experimental settings are explored. In the first, a variety of dependency-based syntactic features combined with classical machine learning classifiers are evaluated. In the second, two well-known types of word embeddings are trained on parsed data and tested against gold-standard datasets. In the third, dependency-based syntactic features are incorporated into the Multilingual BERT architecture. The results suggest that fine-grained dependency-based syntactic information is informative for the detection of irony.
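The first experimental setting can be illustrated with a minimal sketch: represent each sentence by the counts of its Universal Dependencies relation labels and feed those counts to a classical classifier. The sentences, relation sequences, and irony labels below are illustrative toy data, not the paper's actual datasets or feature set.

```python
from collections import Counter

from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# Each sentence is represented by the multiset of its UD dependency
# relation labels, as a dependency parser would produce them.
parsed = [
    ["nsubj", "root", "obj", "punct"],                  # plain statement
    ["nsubj", "root", "advmod", "discourse", "punct"],  # ironic remark
    ["nsubj", "root", "obj", "amod", "punct"],
    ["root", "discourse", "advmod", "punct"],
]
labels = [0, 1, 0, 1]  # 1 = ironic, 0 = not ironic (toy labels)

# Turn the relation counts into a sparse feature matrix.
vec = DictVectorizer()
X = vec.fit_transform(Counter(rels) for rels in parsed)

# Fit a classical classifier on the dependency-based features.
clf = LogisticRegression()
clf.fit(X, labels)

print(clf.predict(X))
```

In the paper's setup, richer features (e.g. relation bigrams along the tree or tree-depth statistics) could be added to the same feature dictionary; this sketch only shows the bag-of-relations baseline.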

