NoPropaganda at SemEval-2020 Task 11: A Borrowed Approach to Sequence Tagging and Text Classification

07/25/2020
by   Ilya Dimov, et al.

This paper describes our contribution to SemEval-2020 Task 11: Detection of Propaganda Techniques in News Articles. For the first subtask, span identification, we start with simple LSTM baselines and move to an autoregressive transformer decoder to predict long, continuous propaganda spans. For the second subtask, propaganda technique classification, we adopt an approach from relation extraction, enveloping the detected spans with special tokens. Our models achieve an F-score of 44.6.
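The span-enveloping idea borrowed from relation extraction can be sketched in a few lines: the candidate span is wrapped in marker tokens so the classifier can attend to it directly. The marker strings (`[SPAN]` / `[/SPAN]`) and the example sentence below are illustrative assumptions, not the authors' exact tokens.

```python
# Minimal sketch of span enveloping, as used in relation-extraction-style
# classifiers. Marker tokens here are placeholders; a real system would
# register them as special tokens in its tokenizer vocabulary.

def wrap_span(text: str, start: int, end: int,
              open_tok: str = "[SPAN]", close_tok: str = "[/SPAN]") -> str:
    """Envelop the character span text[start:end] with marker tokens."""
    return (text[:start] + open_tok + " "
            + text[start:end] + " " + close_tok + text[end:])

sentence = "They are traitors who sold out the country."
# Suppose the annotated propaganda span is "traitors" (characters 9-17).
marked = wrap_span(sentence, 9, 17)
print(marked)
# -> They are [SPAN] traitors [/SPAN] who sold out the country.
```

The marked sentence is then fed to the classifier, which predicts the propaganda technique for the enclosed span.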


