DomBERT: Domain-oriented Language Model for Aspect-based Sentiment Analysis

04/28/2020
by Hu Xu, et al.

This paper focuses on learning domain-oriented language models driven by end tasks, aiming to combine the strengths of general-purpose language models (such as ELMo and BERT) with domain-specific language understanding. We propose DomBERT, an extension of BERT that learns from both an in-domain corpus and corpora from relevant domains, which helps train domain language models under low-resource conditions. Experiments on a suite of aspect-based sentiment analysis tasks demonstrate promising results.
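The core idea of drawing pretraining data from both an in-domain corpus and corpora from relevant domains can be sketched as a relevance-weighted sampler. The following is a minimal illustration only: the domain embeddings, domain names, and softmax weighting are illustrative assumptions, not the paper's actual method.

```python
import math
import random

def cosine(u, v):
    """Cosine similarity between two equal-length, non-zero vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def relevance_weights(target_emb, domain_embs):
    """Softmax over cosine similarities to the target domain.

    Returns per-domain sampling weights that sum to 1, so more
    similar domains contribute more pretraining examples.
    """
    sims = {d: cosine(target_emb, e) for d, e in domain_embs.items()}
    z = sum(math.exp(s) for s in sims.values())
    return {d: math.exp(s) / z for d, s in sims.items()}

def sample_batch(corpora, weights, batch_size, rng):
    """Draw a mixed batch: pick a domain by weight, then a sentence from it."""
    domains = list(weights)
    probs = [weights[d] for d in domains]
    batch = []
    for _ in range(batch_size):
        d = rng.choices(domains, weights=probs, k=1)[0]
        batch.append(rng.choice(corpora[d]))
    return batch
```

In this sketch, a target domain close to "laptops" would receive most of its batch from the laptops corpus while still occasionally mixing in sentences from less related domains.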


Related research:

08/27/2019 · FinBERT: Financial Sentiment Analysis with Pre-trained Language Models
Financial sentiment analysis is a challenging task due to the specialize...

08/30/2019 · Adapt or Get Left Behind: Domain Adaptation through BERT Language Model Finetuning for Aspect-Target Sentiment Classification
Aspect-Target Sentiment Classification (ATSC) is a subtask of Aspect-Bas...

07/20/2022 · Doge Tickets: Uncovering Domain-general Language Models by Playing Lottery Tickets
Over-parameterized models, typically pre-trained language models (LMs), ...

10/22/2022 · Understanding Domain Learning in Language Models Through Subpopulation Analysis
We investigate how different domains are encoded in modern neural networ...

03/29/2021 · Retraining DistilBERT for a Voice Shopping Assistant by Using Universal Dependencies
In this work, we retrained the distilled BERT language model for Walmart...

06/28/2021 · Current Landscape of the Russian Sentiment Corpora
Currently, there are more than a dozen Russian-language corpora for sent...

06/08/2023 · Learning A Foundation Language Model for Geoscience Knowledge Understanding and Utilization
Large language models (LLMs) have achieved great success in general domai...
