Masked Reconstruction Contrastive Learning with Information Bottleneck Principle

11/15/2022
by Ziwen Liu, et al.

Contrastive learning (CL) has shown great power in self-supervised learning owing to its ability to capture intrinsic correlations in large-scale data. However, because of their discriminative task setting, current CL models are biased toward learning only to discriminate between positive and negative pairs. This bias causes them to ignore whether the learned representation is sufficient for other downstream tasks, a problem we call discriminative information overfitting. In this paper, we tackle this problem from the perspective of the Information Bottleneck (IB) principle, further pushing forward the frontier of CL. Specifically, we present a new perspective in which CL is an instantiation of the IB principle, comprising information compression and information expression. We theoretically analyze the optimal information state and show that minimal sufficient augmentation and an information-generalizable representation are required to achieve maximal compression and generalizability to downstream tasks. Accordingly, we propose the Masked Reconstruction Contrastive Learning (MRCL) model. In practice, MRCL uses masking as a stronger augmentation, further eliminating redundant and noisy information, and employs a reconstruction task to regularize the discriminative task, thereby alleviating discriminative information overfitting. Comprehensive experiments demonstrate the superiority of the proposed model on multiple tasks, including image classification, semantic segmentation, and object detection.
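Concretely, the abstract describes an objective that pairs a contrastive loss on masked views with a reconstruction loss that regularizes it. The sketch below is a minimal illustration of that idea, not the paper's implementation: the `encoder`, `decoder`, `proj_head` modules, the per-pixel masking scheme, and the `mask_ratio` and `lambda_rec` hyperparameters are all illustrative assumptions.

```python
# Hypothetical sketch of an MRCL-style objective: InfoNCE between two masked
# views (discriminative task) plus reconstruction of the unmasked inputs
# (regularizer). All module and parameter names are assumptions for
# illustration, not the paper's API.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """Standard InfoNCE between two batches of L2-normalized embeddings."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature            # (B, B) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)       # positives on the diagonal

def mrcl_loss(encoder, decoder, proj_head, x1, x2,
              mask_ratio=0.5, lambda_rec=1.0):
    """Contrastive loss on masked views + masked-reconstruction regularizer."""
    # Random per-pixel masks act as the stronger augmentation.
    m1 = (torch.rand_like(x1) > mask_ratio).float()
    m2 = (torch.rand_like(x2) > mask_ratio).float()
    h1, h2 = encoder(x1 * m1), encoder(x2 * m2)

    # Discriminative task: contrast the two masked views.
    loss_con = info_nce(proj_head(h1), proj_head(h2))

    # Reconstruction task: recover the original (unmasked) inputs.
    loss_rec = F.mse_loss(decoder(h1), x1) + F.mse_loss(decoder(h2), x2)
    return loss_con + lambda_rec * loss_rec
```

Here `lambda_rec` trades off the two terms; in this reading, the reconstruction term discourages the encoder from discarding everything except pair-discriminative features.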

