Fine-tuning Pretrained Multilingual BERT Model for Indonesian Aspect-based Sentiment Analysis

03/05/2021
by Annisa Nurul Azhar, et al.

Although previous research on Aspect-based Sentiment Analysis (ABSA) for Indonesian hotel reviews has been conducted using CNN and XGBoost, the resulting model did not generalize well to the test data, and a high number of out-of-vocabulary (OOV) words contributed to misclassification. Nowadays, most state-of-the-art results for a wide array of NLP tasks are achieved by utilizing pretrained language representations. In this paper, we incorporate one of the foremost language representation models, BERT, to perform ABSA on an Indonesian review dataset. By combining multilingual BERT (m-BERT) with a task transformation method, we achieve a significant improvement of 8% over the result from our previous study.
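
For illustration, the sketch below shows one way such a fine-tuning setup could look with the Hugging Face transformers library, assuming the task transformation frames each (review, aspect) pair as a sentence-pair classification problem. The checkpoint name, label set, learning rate, and example data are illustrative assumptions, not the authors' exact configuration.

# Hypothetical sketch: fine-tuning multilingual BERT (m-BERT) for ABSA
# recast as sentence-pair classification. Assumed setup, not the paper's
# exact pipeline.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

MODEL_NAME = "bert-base-multilingual-cased"   # m-BERT checkpoint
LABELS = ["negative", "neutral", "positive"]  # assumed sentiment classes

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BertForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=len(LABELS)
)

# Task transformation: pair each review with an auxiliary sentence naming
# the aspect, so (review, aspect) -> sentiment becomes a sentence-pair task.
reviews = ["Kamarnya bersih tapi AC tidak dingin."]  # hotel review (Indonesian)
aspects = ["air conditioner"]                        # aspect under evaluation
labels = torch.tensor([LABELS.index("negative")])

encodings = tokenizer(
    reviews, aspects, truncation=True, padding=True, return_tensors="pt"
)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few fine-tuning passes over the toy batch
    outputs = model(**encodings, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

Joining the review and the aspect phrase into one BERT input lets the model attend to both jointly, which is one common way to cast aspect-level sentiment prediction as a standard sequence-classification fine-tuning problem.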


