MetAug: Contrastive Learning via Meta Feature Augmentation

03/10/2022
by Jiangmeng Li et al.

What matters for contrastive learning? We argue that contrastive learning relies heavily on informative, or "hard", positive and negative features. Early works obtain more informative features by applying complex data augmentations together with large batch sizes or memory banks, while recent works design elaborate sampling approaches to mine informative features. The key challenge in exploring such features is that the multi-view source data is generated by random data augmentations, so useful information cannot always be added to the augmented data; consequently, the informativeness of features learned from such data is limited. In response, we propose to augment the features directly in latent space, thereby learning discriminative representations without a large amount of input data. We employ a meta-learning technique to build an augmentation generator that updates its network parameters according to the performance of the encoder. However, insufficient input data may lead the encoder to learn collapsed features, which in turn causes the augmentation generator to malfunction. We therefore add a margin-injected regularization to the objective function to prevent the encoder from learning a degenerate mapping. To contrast all features in a single gradient back-propagation step, we adopt the proposed optimization-driven unified contrastive loss in place of the conventional contrastive loss. Empirically, our method achieves state-of-the-art results on several benchmark datasets.
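To make the core idea concrete, here is a minimal, hypothetical sketch in NumPy: an encoder feature is perturbed by a small learned generator directly in latent space to produce an extra "hard" positive, and an InfoNCE-style loss then contrasts the anchor against all positives and negatives in one step. The function names, the one-layer generator, and the exact loss form are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(x, axis=-1):
    # project features onto the unit hypersphere, as is standard in contrastive learning
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def augment_in_latent(z, W, b):
    # hypothetical one-layer augmentation generator: a learnable residual
    # perturbation applied in latent space (stand-in for the meta-learned generator)
    return l2_normalize(np.tanh(z @ W + b) + z)

def unified_contrastive_loss(anchor, positives, negatives, tau=0.1):
    # contrast the anchor against ALL positive and negative features in one step
    pos = np.exp(positives @ anchor / tau)  # similarities to positives
    neg = np.exp(negatives @ anchor / tau)  # similarities to negatives
    return -np.log(pos.sum() / (pos.sum() + neg.sum()))

d = 8
z = l2_normalize(rng.normal(size=d))                   # encoder feature of one view
z_pos = l2_normalize(z + 0.05 * rng.normal(size=d))    # feature of another view
W = 0.1 * rng.normal(size=(d, d))
b = np.zeros(d)
z_aug = augment_in_latent(z_pos[None, :], W, b)        # generated "hard" positive

positives = np.vstack([z_pos[None, :], z_aug])
negatives = l2_normalize(rng.normal(size=(16, d)))     # features of other samples

loss = unified_contrastive_loss(z, positives, negatives)
print(round(float(loss), 4))
```

In the actual method, the generator's parameters (here `W`, `b`) would be meta-updated based on the encoder's performance, and the margin-injected regularizer would be added to this loss to keep the encoder from collapsing.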

