Conditional Contrastive Learning with Kernel

02/11/2022
by Yao-Hung Hubert Tsai, et al.

Conditional contrastive learning frameworks consider a conditional sampling procedure that constructs positive or negative data pairs conditioned on specific variables. Fair contrastive learning, for example, constructs negative pairs from the same gender (conditioning on sensitive information), which in turn removes undesirable gender information from the learned representations; weakly supervised contrastive learning constructs positive pairs with similar annotative attributes (conditioning on auxiliary information), which in turn incorporates those attributes into the representations. Although conditional contrastive learning enables many applications, the conditional sampling procedure can be challenging when we cannot obtain sufficient data pairs for some values of the conditioning variable. This paper presents Conditional Contrastive Learning with Kernel (CCL-K), which converts existing conditional contrastive objectives into alternative forms that mitigate this insufficient-data problem. Instead of sampling data according to the value of the conditioning variable, CCL-K uses the Kernel Conditional Embedding Operator, which samples from all available data and weights each sample by the kernel similarity between the corresponding values of the conditioning variable. We conduct experiments on weakly supervised, fair, and hard-negative contrastive learning, showing that CCL-K outperforms state-of-the-art baselines.
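To make the kernel-weighting idea concrete, here is a minimal PyTorch sketch of a kernel-weighted conditional contrastive objective in the spirit of CCL-K. This is an illustrative assumption, not the paper's exact objective: the function names (ccl_k_loss, rbf_kernel) and hyperparameters (temperature, gamma) are hypothetical. Rather than sampling pairs that share the exact conditioning value, each anchor treats every other sample as a soft positive, weighted by the RBF-kernel similarity of their conditioning-variable values.

```python
import torch
import torch.nn.functional as F


def rbf_kernel(c1, c2, gamma=1.0):
    # Gaussian (RBF) kernel between conditioning-variable values.
    return torch.exp(-gamma * torch.cdist(c1, c2).pow(2))


def ccl_k_loss(features, cond, temperature=0.1, gamma=1.0):
    """Kernel-weighted conditional contrastive loss (illustrative sketch).

    features: (N, d) representations of N samples.
    cond:     (N, c) float tensor of conditioning-variable values
              (e.g., annotative attributes).
    """
    f = F.normalize(features, dim=1)
    sim = (f @ f.t()) / temperature    # (N, N) representation similarities
    K = rbf_kernel(cond, cond, gamma)  # (N, N) conditioning similarities

    n = f.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=f.device)
    K = K.masked_fill(eye, 0.0)                         # drop self-pairs
    w = K / K.sum(dim=1, keepdim=True).clamp_min(1e-8)  # per-anchor weights

    log_p = F.log_softmax(sim.masked_fill(eye, float("-inf")), dim=1)
    log_p = log_p.masked_fill(eye, 0.0)  # avoid 0 * (-inf) = nan below
    return -(w * log_p).sum(dim=1).mean()


# Usage with random data (shapes only; real inputs come from an encoder):
feats = torch.randn(8, 128)
attrs = torch.randn(8, 4)
loss = ccl_k_loss(feats, attrs)
```

For discrete conditioning variables such as gender labels, one could swap the RBF kernel for a delta kernel that is 1 only when values match, which recovers exact conditional sampling as a special case.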


Related Research

06/03/2022 | Rethinking Positive Sampling for Contrastive Learning with Kernel
Data augmentation is a crucial component in unsupervised contrastive lea...

03/31/2023 | Weakly-Supervised Text-driven Contrastive Learning for Facial Behavior Understanding
Contrastive learning has shown promising potential for learning robust r...

05/23/2022 | Conditional Supervised Contrastive Learning for Fair Text Classification
Contrastive representation learning has gained much attention due to its...

03/15/2022 | Improving Event Representation via Simultaneous Weakly Supervised Contrastive Learning and Clustering
Representations of events described in text are important for various ta...

06/08/2021 | Provable Guarantees for Self-Supervised Deep Learning with Spectral Contrastive Loss
Recent works in self-supervised learning have advanced the state-of-the-...

06/17/2021 | Prototypical Graph Contrastive Learning
Graph-level representations are critical in various real-world applicati...

03/30/2022 | Weakly-supervised Temporal Path Representation Learning with Contrastive Curriculum Learning – Extended Version
In step with the digitalization of transportation, we are witnessing a g...
