Multi-Task Bidirectional Transformer Representations for Irony Detection

09/08/2019
by   Chiyu Zhang, et al.

Supervised deep learning requires large amounts of training data. In the context of the FIRE2019 Arabic irony detection shared task (IDAT@FIRE2019), we show how we mitigate this need by fine-tuning the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model on gold data in a multi-task setting. We further improve our models by additionally pre-training BERT on "in-domain" data, thus alleviating an issue of dialect mismatch in the Google-released BERT model. Our best model achieves an 82.4 macro F1 score, and has the unique advantage of being feature-engineering free (i.e., based exclusively on deep learning).
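To make the multi-task setup concrete, the following is a minimal sketch of fine-tuning a shared BERT encoder with per-task classification heads, in the spirit of the approach described above. The model name, auxiliary task names, label counts, and hyperparameters are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch: multi-task fine-tuning of a shared BERT encoder (assumed setup).
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class MultiTaskBert(nn.Module):
    def __init__(self, model_name="bert-base-multilingual-cased",
                 task_num_labels=None):
        super().__init__()
        task_num_labels = task_num_labels or {"irony": 2, "sentiment": 3}
        # Shared encoder: all tasks fine-tune the same BERT weights.
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # One lightweight classification head per task.
        self.heads = nn.ModuleDict({
            task: nn.Linear(hidden, n_labels)
            for task, n_labels in task_num_labels.items()
        })

    def forward(self, task, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # Sentence-level classification from the [CLS] token representation.
        cls = out.last_hidden_state[:, 0]
        return self.heads[task](cls)

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = MultiTaskBert()
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# Toy batch for the irony task; in practice batches alternate across tasks
# so the auxiliary tasks regularize the shared encoder.
batch = tokenizer(["example tweet 1", "example tweet 2"],
                  padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])
logits = model("irony", batch["input_ids"], batch["attention_mask"])
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
```

The in-domain adaptation step mentioned in the abstract would precede this stage: the same encoder can be further pre-trained with the masked language modeling objective on dialectal Arabic text before the multi-task fine-tuning shown here.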


