Understanding Pre-trained BERT for Aspect-based Sentiment Analysis

10/31/2020
by Hu Xu, et al.

This paper analyzes the pre-trained hidden representations learned from reviews on BERT for tasks in aspect-based sentiment analysis (ABSA). Our work is motivated by the recent progress in BERT-based language models for ABSA. However, it is not clear how the general proxy task of (masked) language modeling, trained on an unlabeled corpus without annotations of aspects or opinions, can provide important features for downstream tasks in ABSA. By leveraging the annotated datasets in ABSA, we investigate both the attentions and the learned representations of BERT pre-trained on reviews. We find that BERT uses very few self-attention heads to encode context words (such as prepositions or pronouns that indicate an aspect) and opinion words for an aspect. Most features in the representation of an aspect are dedicated to the fine-grained semantics of the domain (or product category) and the aspect itself, rather than carrying summarized opinions from its context. We hope this investigation can help future research in improving self-supervised learning, unsupervised learning, and fine-tuning for ABSA. The pre-trained model and code can be found at https://github.com/howardhsu/BERT-for-RRC-ABSA.
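To make the kind of analysis described above concrete, below is a minimal sketch (not the authors' released code) of how one might inspect which context tokens each self-attention head links to an aspect word, using the Hugging Face transformers library. The checkpoint name, example sentence, and aspect token are illustrative assumptions; the paper's review-pretrained model is available from the repository linked above.

```python
# Minimal sketch: inspecting per-head self-attention from an aspect token to its
# context. The checkpoint, sentence, and aspect below are illustrative assumptions,
# not the paper's experimental setup.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

sentence = "The battery life of this laptop is amazing."
aspect = "battery"

inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
aspect_idx = tokens.index(aspect)

# outputs.attentions is a tuple with one tensor per layer,
# each shaped [batch, num_heads, seq_len, seq_len].
for layer, attn in enumerate(outputs.attentions):
    # Attention weights the aspect token assigns to every token in the sentence.
    per_head = attn[0, :, aspect_idx, :]
    # The most-attended token for each head in this layer.
    top = [tokens[i] for i in per_head.argmax(dim=-1).tolist()]
    print(f"layer {layer:2d} top-attended token per head: {top}")
```

A head that consistently places its weight on opinion words (e.g. "amazing") or on the prepositions and pronouns around the aspect would be a candidate for the small set of heads the paper describes.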


Related research

12/02/2020 · Exploiting BERT to improve aspect-based sentiment analysis performance on Persian language
Aspect-based sentiment analysis (ABSA) is a more detailed task in sentim...

03/30/2022 · Incorporating Dynamic Semantics into Pre-Trained Language Model for Aspect-based Sentiment Analysis
Aspect-based sentiment analysis (ABSA) predicts sentiment polarity towar...

07/14/2021 · BERT Fine-Tuning for Sentiment Analysis on Indonesian Mobile Apps Reviews
User reviews have an essential role in the success of the developed mobi...

05/06/2022 · Disentangled Learning of Stance and Aspect Topics for Vaccine Attitude Detection in Social Media
Building models to detect vaccine attitudes on social media is challengi...

10/15/2020 · Context-Guided BERT for Targeted Aspect-Based Sentiment Analysis
Aspect-based sentiment analysis (ABSA) and Targeted ABSA (TABSA) allow f...

09/07/2020 · E-BERT: A Phrase and Product Knowledge Enhanced Language Model for E-commerce
Pre-trained language models such as BERT have achieved great success in ...

04/14/2021 · Disentangling Representations of Text by Masking Transformers
Representations from large pretrained models such as BERT encode a range...
