
BERT for Sentiment Analysis: Pre-trained and Fine-Tuned Alternatives

by Frederico Souza, et al.

BERT has revolutionized the NLP field by enabling transfer learning with large language models that can capture complex textual patterns, reaching the state of the art for a significant number of NLP applications. For text classification tasks, BERT has already been extensively explored. However, aspects such as how best to handle the different embeddings provided by the BERT output layer and the use of language-specific rather than multilingual models are not well studied in the literature, especially for the Brazilian Portuguese language. The purpose of this article is to conduct an extensive experimental study of different strategies for aggregating the features produced in the BERT output layer, with a focus on the sentiment analysis task. The experiments include BERT models trained on Brazilian Portuguese corpora as well as the multilingual version, covering multiple aggregation strategies and open-source datasets with predefined training, validation, and test partitions to facilitate reproducibility of the results. BERT achieved the highest ROC-AUC values in the majority of cases compared to TF-IDF. Nonetheless, TF-IDF represents a good trade-off between predictive performance and computational cost.
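As a rough illustration of what "aggregating the features produced in the BERT output layer" can mean, the sketch below contrasts three common pooling strategies (CLS token, mean pooling, max pooling) on a simulated last-hidden-state matrix. The shapes and the random data are illustrative assumptions, not the paper's exact setup; in practice the matrix would come from a model's last hidden state.

```python
import numpy as np

# Simulated BERT output for one sequence: 6 tokens, hidden size 8.
# A real last hidden state would have shape (seq_len, hidden_size).
rng = np.random.default_rng(0)
hidden = rng.normal(size=(6, 8))

# Three common aggregation strategies for a fixed-size sentence vector:
cls_vector = hidden[0]             # embedding of the first ([CLS]) token
mean_vector = hidden.mean(axis=0)  # average over all token embeddings
max_vector = hidden.max(axis=0)    # element-wise max over tokens

# Each strategy yields one vector of the hidden size,
# usable as input to a downstream classifier.
print(cls_vector.shape, mean_vector.shape, max_vector.shape)
```

Which strategy works best is exactly the kind of empirical question the article investigates; mean pooling is often a strong default, while the [CLS] embedding is what many fine-tuning setups feed to the classification head.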

