Transformer-based approaches to Sentiment Detection

03/13/2023
by   Olumide Ebenezer Ojo, et al.
The use of transfer learning methods is largely responsible for recent breakthroughs in Natural Language Processing (NLP) tasks across multiple domains. To address the problem of sentiment detection, we examined the performance of four well-known state-of-the-art transformer models for text classification: Bidirectional Encoder Representations from Transformers (BERT), Robustly Optimized BERT Pre-training Approach (RoBERTa), a distilled version of BERT (DistilBERT), and XLNet, a large bidirectional neural network architecture. The four models were compared on the task of detecting disasters in text. All of the models performed well, indicating that transformer-based models are suitable for detecting disasters in text. The RoBERTa transformer model performed best on the test dataset, with a score of 82.6. We discovered that the learning algorithms' performance was influenced by the pre-processing techniques, the nature of the words in the vocabulary, unbalanced labeling, and the model parameters.
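All four models in the comparison (BERT, RoBERTa, DistilBERT, and XLNet) build on the same scaled dot-product self-attention mechanism. As background, here is a minimal NumPy sketch of a single attention head; the dimensions, random weights, and function names are illustrative only and are not taken from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X: (seq_len, d_model) token embeddings.
    Returns the attended output and the attention weight matrix.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise token affinities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

# Toy example: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

In the full models, many such heads are stacked in parallel and across layers, and a classification head is fine-tuned on top of the pre-trained encoder for tasks like the disaster-detection comparison described above.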

