Learning Invariant Representations using Inverse Contrastive Loss

02/16/2021
by Aditya Kumar Akash, et al.

Learning invariant representations is a critical first step in a number of machine learning tasks. A common approach corresponds to the so-called information bottleneck principle, in which an application-dependent function of mutual information is carefully chosen and optimized. Unfortunately, in practice, these functions are not suitable for optimization purposes since the resulting losses are agnostic to the metric structure of the model's parameters. We introduce a class of losses for learning representations that are invariant to some extraneous variable of interest by inverting the class of contrastive losses, i.e., the inverse contrastive loss (ICL). We show that if the extraneous variable is binary, then optimizing ICL is equivalent to optimizing a regularized maximum mean discrepancy (MMD) divergence. More generally, we also show that if we are provided a metric on the sample space, our formulation of ICL can be decomposed into a sum of convex functions of the given distance metric. Our experimental results indicate that models obtained by optimizing ICL achieve significantly better invariance to the extraneous variable for a fixed desired level of accuracy. In a variety of experimental settings, we show the applicability of ICL for learning invariant representations for both continuous and discrete extraneous variables.
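To make the idea of "inverting" a contrastive loss concrete, below is a minimal PyTorch sketch of one plausible form of such a loss for a (possibly binary) extraneous variable c. It reverses the usual pairwise contrastive loss: pairs that differ in c are pulled together and pairs that share c are spread apart up to a margin, which encourages the two groups to mix and thus the representation to become invariant to c. The function name, the margin, and the squared-distance form are illustrative assumptions, not the exact formulation from the paper.

```python
import torch

def inverse_contrastive_loss(z: torch.Tensor, c: torch.Tensor, margin: float = 1.0) -> torch.Tensor:
    """Sketch of an inverse contrastive loss.

    z: (n, d) batch of representations; c: (n,) extraneous-variable labels.
    NOTE: this is an assumed form, not the paper's exact loss.
    """
    d = torch.cdist(z, z, p=2)                               # (n, n) pairwise Euclidean distances
    same_c = (c.unsqueeze(0) == c.unsqueeze(1)).float()      # 1 where a pair shares the extraneous variable
    off_diag = 1.0 - torch.eye(len(c), device=z.device)      # mask out self-pairs

    # Inverted roles relative to the standard contrastive loss:
    # pairs that DIFFER in c are pulled together, pairs that SHARE c are
    # pushed apart up to a margin, so the representation mixes the groups.
    pull_cross = (1.0 - same_c) * d.pow(2)
    push_same = same_c * torch.clamp(margin - d, min=0.0).pow(2)

    return ((pull_cross + push_same) * off_diag).sum() / off_diag.sum()
```

In practice a term like this would be added to the task loss, e.g. total = task_loss + lam * inverse_contrastive_loss(z, c), where lam is a hypothetical trade-off weight controlling how much invariance is traded against accuracy.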

Related research

07/08/2021 - Staying in Shape: Learning Invariant Shape Representations using Contrastive Learning
Creating representations of shapes that are invariant to isometric or a...

01/12/2022 - Robust Contrastive Learning against Noisy Views
Contrastive learning relies on an assumption that positive pairs contain...

10/28/2021 - InfoGCL: Information-Aware Graph Contrastive Learning
Various graph contrastive learning models have been proposed to improve...

10/15/2022 - Augmentation-Free Graph Contrastive Learning of Invariant-Discriminative Representations
The pretasks are mainly built on mutual information estimation, which re...

06/25/2021 - Decomposed Mutual Information Estimation for Contrastive Representation Learning
Recent contrastive representation learning methods rely on estimating mu...

05/04/2023 - Contrastive losses as generalized models of global epistasis
Fitness functions map large combinatorial spaces of biological sequences...

02/07/2020 - Inverse Learning of Symmetry Transformations
Symmetry transformations induce invariances and are a crucial building b...
