Domain-Adaptive Text Classification with Structured Knowledge from Unlabeled Data

06/20/2022
by   Tian Li, et al.

Domain-adaptive text classification is a challenging problem for large-scale pretrained language models because they often require expensive additional labeled data to adapt to new domains. Existing works usually fail to leverage the implicit relationships among words across domains. In this paper, we propose a novel method, called Domain Adaptation with Structured Knowledge (DASK), to enhance domain adaptation by exploiting word-level semantic relationships. DASK first builds a knowledge graph to capture the relationships between pivot terms (domain-independent words) and non-pivot terms in the target domain. Then, during training, DASK injects pivot-related knowledge graph information into source-domain texts. For the downstream task, these knowledge-injected texts are fed into a BERT variant capable of processing knowledge-injected textual data. Thanks to the knowledge injection, our model learns domain-invariant features for non-pivots according to their relationships with pivots. DASK ensures that pivots behave consistently across domains by dynamically inferring them via the polarity scores of candidate pivots during training with pseudo-labels. We validate DASK on a wide range of cross-domain sentiment classification tasks and observe up to a 2.9-point performance improvement over baselines across 20 different domain pairs. Code will be made available at https://github.com/hikaru-nara/DASK.
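The pivot-selection step described above (candidate pivots are frequent in both domains and have consistent polarity under pseudo-labels) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `select_pivots`, the tokenization by whitespace, and the thresholds `min_count` and `polarity_threshold` are all assumptions made for the example.

```python
from collections import Counter, defaultdict

def select_pivots(source_docs, target_docs, pseudo_labels,
                  min_count=2, polarity_threshold=0.7):
    """Hypothetical sketch of pivot selection: keep words that appear in
    enough documents of BOTH domains and whose pseudo-labeled target
    documents mostly agree on one sentiment label."""
    # Count, for each word, how many documents in each domain contain it.
    src_counts = Counter(w for d in source_docs for w in set(d.split()))
    tgt_counts = Counter(w for d in target_docs for w in set(d.split()))

    # Collect the pseudo-labels of target documents containing each word.
    occurrences = defaultdict(list)
    for doc, label in zip(target_docs, pseudo_labels):
        for w in set(doc.split()):
            occurrences[w].append(label)

    pivots = []
    for w in src_counts:
        if src_counts[w] >= min_count and tgt_counts[w] >= min_count:
            labels = occurrences[w]
            pos_fraction = sum(labels) / len(labels)
            # Polarity score: agreement with the majority pseudo-label.
            polarity = max(pos_fraction, 1.0 - pos_fraction)
            if polarity >= polarity_threshold:
                pivots.append(w)
    return pivots
```

For instance, with toy reviews such as `["great fun", "bad acting"]` (source, movies) and `["great battery", "bad camera"]` (target, electronics), words like "great" and "bad" survive both the frequency and polarity filters, matching the intuition that sentiment words are domain-independent pivots.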


Related Research

05/26/2022 · Unsupervised Reinforcement Adaptation for Class-Imbalanced Text Classification
Class imbalance naturally exists when train and test models in different...

04/18/2023 · A Two-Stage Framework with Self-Supervised Distillation For Cross-Domain Text Classification
Cross-domain text classification aims to adapt models to a target domain...

09/24/2020 · Feature Adaptation of Pre-Trained Language Models across Languages and Domains for Text Classification
Adapting pre-trained language models (PrLMs) (e.g., BERT) to new domains...

04/17/2021 · Learning to Share by Masking the Non-shared for Multi-domain Sentiment Classification
Multi-domain sentiment classification deals with the scenario where labe...

05/12/2021 · BertGCN: Transductive Text Classification by Combining GCN and BERT
In this work, we propose BertGCN, a model that combines large scale pret...

11/02/2018 · Transductive Learning with String Kernels for Cross-Domain Text Classification
For many text classification tasks, there is a major problem posed by th...

10/19/2018 · Revisiting Distributional Correspondence Indexing: A Python Reimplementation and New Experiments
This paper introduces PyDCI, a new implementation of Distributional Corr...
