
Fine-tuning Pretrained Multilingual BERT Model for Indonesian Aspect-based Sentiment Analysis

03/05/2021
by   Annisa Nurul Azhar, et al.

Although previous research on Aspect-based Sentiment Analysis (ABSA) for Indonesian reviews in the hotel domain has been conducted using CNN and XGBoost, the resulting model did not generalize well on test data, and a high number of out-of-vocabulary (OOV) words contributed to misclassification cases. Nowadays, most state-of-the-art results for a wide array of NLP tasks are achieved by utilizing pretrained language representations. In this paper, we incorporate one of the foremost language representation models, BERT, to perform ABSA on an Indonesian review dataset. By combining multilingual BERT (m-BERT) with a task transformation method, we achieve a significant improvement of 8 over the result from our previous study.
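The "task transformation" mentioned above typically recasts ABSA as a sentence-pair classification problem: each review is paired with an auxiliary sentence per aspect, and BERT's [CLS] representation of the pair is classified for that aspect's sentiment. The sketch below illustrates this pairing step only; the aspect names and the auxiliary-question template are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch of the sentence-pair transformation used to adapt
# BERT-style models to ABSA. The aspect list and question template are
# hypothetical placeholders, not taken from the paper.

ASPECTS = ["food", "service", "price", "cleanliness"]  # assumed aspect set


def to_sentence_pairs(review, aspects=ASPECTS):
    """Turn one review into (review, auxiliary sentence) pairs, one per aspect.

    Each pair would then be tokenized as a BERT sentence pair, i.e.
    [CLS] review [SEP] auxiliary [SEP], and a classification head over
    [CLS] predicts the sentiment toward that aspect.
    """
    pairs = []
    for aspect in aspects:
        aux = f"what do you think of the {aspect}?"  # auxiliary question
        pairs.append((review, aux))
    return pairs


# One Indonesian hotel review becomes four sentence-pair examples.
pairs = to_sentence_pairs("Kamar bersih tapi pelayanannya lambat.")
print(len(pairs))
```

With a library such as Hugging Face `transformers`, each pair could then be encoded via `tokenizer(review, aux, truncation=True)` and fed to a multilingual BERT checkpoint with a sequence-classification head; that step is omitted here to keep the sketch self-contained.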


02/12/2020

Utilizing BERT Intermediate Layers for Aspect Based Sentiment Analysis and Natural Language Inference

Aspect based sentiment analysis aims to identify the sentimental tendenc...
11/20/2020

Fine-Tuning BERT for Sentiment Analysis of Vietnamese Reviews

Sentiment analysis is an important task in the field of Natural Language P...
03/29/2020

User Generated Data: Achilles' heel of BERT

Pre-trained language models such as BERT are known to perform exceedingl...
10/15/2020

Context-Guided BERT for Targeted Aspect-Based Sentiment Analysis

Aspect-based sentiment analysis (ABSA) and Targeted ABSA (TABSA) allow f...
01/28/2021

BERTaú: Itaú BERT for digital customer service

In the last few years, three major topics received increased interest: d...
05/13/2021

Distilling BERT for low complexity network training

This paper studies the efficiency of transferring BERT learnings to low ...