
Exploiting BERT for End-to-End Aspect-based Sentiment Analysis

10/02/2019
by Xin Li, et al. (Alibaba Group; The Chinese University of Hong Kong)

In this paper, we investigate the modeling power of contextualized embeddings from pre-trained language models, e.g., BERT, on the end-to-end aspect-based sentiment analysis (E2E-ABSA) task. Specifically, we build a series of simple yet insightful neural baselines for E2E-ABSA. The experimental results show that, even with a simple linear classification layer, our BERT-based architecture can outperform state-of-the-art works. In addition, we standardize the comparative study by consistently using a hold-out validation set for model selection, a practice largely ignored by previous works. Our work can therefore serve as a BERT-based benchmark for E2E-ABSA.
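To make the architecture concrete: in E2E-ABSA the model emits one unified tag per token that encodes both the aspect boundary and its sentiment (e.g., B-POS, I-NEG), and the abstract's "simple linear classification layer" maps each token's contextual embedding to those tags. Below is a minimal, dependency-free sketch of that tagging head. The tag set, hidden size, and the randomly generated "embeddings" are illustrative assumptions; in the actual model the vectors would come from a pre-trained BERT encoder and the weights would be learned.

```python
import random

# Unified E2E-ABSA tag set (assumed for illustration): aspect boundary
# plus sentiment polarity fused into a single label per token.
TAGS = ["O", "B-POS", "I-POS", "B-NEG", "I-NEG", "B-NEU", "I-NEU"]
HIDDEN = 8  # toy dimension; BERT-base produces 768-dim token embeddings


def linear_tagger(token_embeddings, weight, bias):
    """Apply a per-token linear classification layer and return tag strings.

    token_embeddings: list of HIDDEN-dim vectors (one per token),
    standing in for BERT's contextualized outputs.
    weight: len(TAGS) x HIDDEN matrix; bias: len(TAGS) vector.
    """
    tags = []
    for emb in token_embeddings:
        # logits = W @ emb + b, computed per tag row
        logits = [sum(w * e for w, e in zip(row, emb)) + b
                  for row, b in zip(weight, bias)]
        # greedy decoding: pick the highest-scoring tag for this token
        tags.append(TAGS[max(range(len(TAGS)), key=lambda k: logits[k])])
    return tags


random.seed(0)
# Random stand-ins for learned parameters and a 5-token sentence's embeddings.
weight = [[random.gauss(0, 1) for _ in range(HIDDEN)] for _ in TAGS]
bias = [0.0] * len(TAGS)
sentence = [[random.gauss(0, 1) for _ in range(HIDDEN)] for _ in range(5)]

pred = linear_tagger(sentence, weight, bias)
print(pred)  # one unified tag per token
```

The point of the sketch is the shape of the computation, not the numbers: the paper's finding is that this per-token linear map over BERT embeddings already suffices to beat more elaborate task-specific architectures.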

