Transfer Learning from Transformers to Fake News Challenge Stance Detection (FNC-1) Task

10/31/2019
by Valeriya Slovikovskaya, et al.

In this paper, we report improved results on the Fake News Challenge Stage 1 (FNC-1) stance detection task. This gain in performance is due to the generalization power of large language models based on the Transformer architecture, invented, trained, and publicly released over the last two years. Specifically, (1) we improved the best-performing FNC-1 model by adding BERT sentence embeddings of the input sequences as a model feature, and (2) we fine-tuned the BERT, XLNet, and RoBERTa transformers on the extended FNC-1 dataset and obtained state-of-the-art results on the FNC-1 task.
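The first contribution described above, appending sentence embeddings to a hand-crafted feature vector, can be sketched as follows. This is a minimal illustration, not the paper's implementation: `bert_embed` is a hypothetical stub standing in for a real BERT sentence encoder (e.g., mean-pooled token vectors), and the single word-overlap feature stands in for the richer feature set used by FNC-1 systems.

```python
import zlib
import numpy as np

# FNC-1 stance labels: each (headline, body) pair gets one of these.
LABELS = ["agree", "disagree", "discuss", "unrelated"]

def bert_embed(text, dim=8):
    """Hypothetical stand-in for a BERT sentence embedding.

    A real system would mean-pool the encoder's token vectors; here we
    return a deterministic pseudo-random vector keyed on the text so the
    sketch is self-contained.
    """
    rng = np.random.default_rng(zlib.crc32(text.encode("utf-8")))
    return rng.standard_normal(dim)

def stance_features(headline, body):
    """Build a feature vector for one headline/body pair.

    FNC-1 feature-based models used hand-crafted signals such as word
    overlap; the paper's first contribution appends sentence embeddings
    of both inputs to that feature vector.
    """
    h = set(headline.lower().split())
    b = set(body.lower().split())
    overlap = np.array([len(h & b) / max(len(h | b), 1)])
    return np.concatenate([overlap, bert_embed(headline), bert_embed(body)])

features = stance_features("Robot dog runs", "A robot dog was seen running")
print(features.shape)  # (17,) = 1 overlap feature + two 8-dim embeddings
```

The resulting vector would then be fed to the downstream classifier (an MLP in typical FNC-1 systems) to predict one of the four stance labels.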


