Cross-Domain Sentiment Classification With Contrastive Learning and Mutual Information Maximization

10/30/2020
by Tian Li, et al.

Contrastive learning (CL) has proven to be a powerful representation learning method. In this work we propose CLIM: Contrastive Learning with mutual Information Maximization, to explore the potential of CL for cross-domain sentiment classification. To the best of our knowledge, CLIM is the first to adopt contrastive learning for natural language processing (NLP) tasks across domains. Due to the scarcity of labels on the target domain, we introduce mutual information maximization (MIM) in addition to CL to exploit the features that best support the final prediction. Furthermore, MIM maintains a relatively balanced distribution over the model's predictions and enlarges the margin between classes on the target domain. The larger margin increases our model's robustness and enables the same classifier to remain optimal across domains. Consequently, we achieve new state-of-the-art results on the Amazon-review dataset as well as the airlines dataset, demonstrating the efficacy of our proposed method, CLIM.
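The MIM term described in the abstract is commonly implemented as the difference between the entropy of the batch-averaged prediction (high when the predicted class distribution is balanced) and the mean per-example entropy (low when each prediction is confident, i.e. far from the decision boundary, which enlarges the margin). A minimal sketch of this standard decomposition, which may differ from the exact loss used in the paper:

```python
import math

def mim_objective(probs):
    """Sketch of a mutual-information-maximization (MIM) term,
    I(x; y) = H(mean_x p(y|x)) - E_x[H(p(y|x))].

    probs: list of per-example class-probability lists (each row sums to 1).
    Maximizing the result encourages a balanced marginal prediction
    (first term) and confident per-example predictions (second term).
    """
    eps = 1e-12
    n = len(probs)
    c = len(probs[0])
    # Marginal (batch-averaged) class distribution.
    marginal = [sum(p[k] for p in probs) / n for k in range(c)]
    # Entropy of the marginal: high when predictions cover classes evenly.
    h_marginal = -sum(m * math.log(m + eps) for m in marginal)
    # Mean conditional entropy: low when each prediction is confident.
    h_cond = sum(-sum(pk * math.log(pk + eps) for pk in p) for p in probs) / n
    return h_marginal - h_cond
```

For example, confident and class-balanced predictions such as `[[0.99, 0.01], [0.01, 0.99]]` score close to the maximum of log 2, whereas uniform predictions score 0, since the two entropy terms cancel.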


