Harvey Mudd College at SemEval-2019 Task 4: The Clint Buchanan Hyperpartisan News Detector

04/10/2019
by Mehdi Drissi, et al.

We investigate the recently developed Bidirectional Encoder Representations from Transformers (BERT) model for the hyperpartisan news detection task. Using a subset of hand-labeled articles from SemEval as a validation set, we test the performance of different parameters for BERT models. We find that accuracy from two different BERT models using different proportions of the articles is consistently high, with our best-performing model achieving 85% accuracy on the validation set and 77% accuracy on the test set. We further find that the model is highly consistent, labeling independent slices of the same article identically. Finally, we find that randomizing the order of word pieces dramatically reduces validation accuracy (to approximately 60%), while shuffling groups of consecutive word pieces, which preserves local word order, maintains an accuracy of about 80%, suggesting that the model learns mostly from local context.
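The two preprocessing steps the abstract describes, cutting a tokenized article into independent fixed-length slices for BERT and shuffling word pieces either individually or in small groups so that local context is preserved, are straightforward to sketch. The snippet below is an illustrative reconstruction, not the authors' released code: the function names, the 510-piece slice length, and the toy input are assumptions made for this example.

```python
import random


def slice_word_pieces(word_pieces, slice_len=510):
    """Split a tokenized article into independent, non-overlapping slices.

    BERT accepts at most 512 tokens per input; reserving two positions for
    the [CLS] and [SEP] markers leaves 510 word pieces per slice (an assumed
    convention for this sketch).
    """
    return [word_pieces[i:i + slice_len]
            for i in range(0, len(word_pieces), slice_len)]


def shuffle_in_groups(word_pieces, group_size=1, seed=0):
    """Shuffle word pieces while keeping runs of `group_size` consecutive
    pieces intact.

    group_size=1 fully randomizes the order; larger groups preserve local
    context inside each group, which is the contrast probed in the abstract.
    """
    rng = random.Random(seed)
    groups = [word_pieces[i:i + group_size]
              for i in range(0, len(word_pieces), group_size)]
    rng.shuffle(groups)
    return [piece for group in groups for piece in group]


if __name__ == "__main__":
    # Toy word-piece sequence standing in for a WordPiece-tokenized article.
    pieces = ["the", "sen", "##ator", "called", "the", "bill", "a",
              "dis", "##aster", "for", "working", "families"]
    print(shuffle_in_groups(pieces, group_size=1))  # fully scrambled order
    print(shuffle_in_groups(pieces, group_size=4))  # 4-piece local context kept
```

With group_size=1 the sequence is fully scrambled, while group_size=4 keeps each four-piece window intact, mirroring the contrast between the roughly 60% and 80% accuracy figures reported above.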

