
Fine-tuning Pretrained Multilingual BERT Model for Indonesian Aspect-based Sentiment Analysis

by Annisa Nurul Azhar, et al.

Although previous research on Aspect-based Sentiment Analysis (ABSA) for Indonesian reviews in the hotel domain has been conducted using CNN and XGBoost, the resulting model did not generalize well on test data, and a high number of OOV words contributed to misclassification cases. Nowadays, most state-of-the-art results for a wide array of NLP tasks are achieved by utilizing pretrained language representations. In this paper, we incorporate one of the foremost language representation models, BERT, to perform ABSA on an Indonesian review dataset. By combining multilingual BERT (m-BERT) with a task transformation method, we achieve a significant improvement of 8 compared to the result from our previous study.
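The "task transformation" mentioned in the abstract typically recasts ABSA as sentence-pair classification: each review is paired with an auxiliary sentence per candidate aspect, so a BERT-style sentence-pair classifier can score every (review, aspect) combination. The sketch below is an illustrative, hypothetical reconstruction of that preprocessing step (the aspect set, label set, and question template are assumptions, not taken from the paper):

```python
# Hypothetical sketch: ABSA recast as sentence-pair classification by
# constructing one auxiliary sentence per candidate aspect, in the style
# of auxiliary-sentence task transformation. The aspect list, label set,
# and question template below are illustrative assumptions.

ASPECTS = ["food", "room", "service", "price", "location"]  # assumed hotel-domain aspects
LABELS = ["positive", "negative", "neutral", "none"]        # assumed sentiment classes

def to_sentence_pairs(review, aspects=ASPECTS):
    """Turn one review into (review, auxiliary-sentence) pairs,
    one per candidate aspect, ready for a BERT sentence-pair
    classifier (e.g. m-BERT fine-tuned with segment A = review,
    segment B = auxiliary sentence)."""
    return [(review, f"what do you think of the {aspect}?")
            for aspect in aspects]

# Example: an Indonesian hotel review ("The room was clean but the
# food was not very good.") expands into one input pair per aspect.
pairs = to_sentence_pairs("Kamarnya bersih tapi makanannya kurang enak.")
```

Each pair would then be tokenized jointly (review as segment A, auxiliary sentence as segment B) and classified into one of the sentiment labels, with "none" covering aspects the review does not mention.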
