Equivariance and Invariance Inductive Bias for Learning from Insufficient Data

07/25/2022
by   Tan Wang, et al.

We are interested in learning robust models from insufficient data, without relying on any externally pre-trained checkpoints. First, compared to sufficient data, we show why insufficient data renders the model more easily biased toward the limited training environments, which usually differ from the testing ones. For example, if all training swan samples are "white", the model may wrongly use the "white" environment to represent the intrinsic class "swan". Then, we justify that an equivariance inductive bias can retain the class feature while an invariance inductive bias can remove the environmental feature, leaving a class feature that generalizes to any environmental change at testing time. To impose these two biases during learning, for equivariance, we demonstrate that any off-the-shelf contrastive-based self-supervised feature learning method can be deployed; for invariance, we propose a class-wise invariant risk minimization (IRM) that efficiently tackles the challenge of missing environment annotations in conventional IRM. State-of-the-art experimental results on real-world benchmarks (VIPriors, ImageNet100 and NICO) validate the great potential of equivariance and invariance in data-efficient learning. The code is available at https://github.com/Wangt-CN/EqInv.
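The class-wise IRM idea lends itself to a short illustration. Below is a minimal PyTorch sketch of how an IRMv1-style gradient penalty (Arjovsky et al., 2019) could be applied per class; the `class_wise_irm_penalty` helper, the pseudo-environment labels `env_ids` (e.g., obtained by clustering features, since real environment annotations are missing), and the minimum-group-size guard are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn.functional as F

def irm_penalty(logits, labels):
    """IRMv1 gradient penalty: squared gradient of the classification risk
    with respect to a dummy classifier scale fixed at 1.0."""
    scale = torch.ones(1, requires_grad=True, device=logits.device)
    loss = F.cross_entropy(logits * scale, labels)
    grad = torch.autograd.grad(loss, [scale], create_graph=True)[0]
    return (grad ** 2).sum()

def class_wise_irm_penalty(logits, labels, env_ids):
    """Hypothetical class-wise variant: compute the IRM penalty within each
    (class, pseudo-environment) group, average over environments inside a
    class, then average over classes. `env_ids` are assumed pseudo-environment
    assignments (e.g., cluster indices), not ground-truth annotations."""
    class_penalties = []
    for c in labels.unique():
        cls_mask = labels == c
        env_penalties = []
        for e in env_ids[cls_mask].unique():
            m = cls_mask & (env_ids == e)
            if m.sum() < 2:  # skip degenerate single-sample groups (assumption)
                continue
            env_penalties.append(irm_penalty(logits[m], labels[m]))
        if env_penalties:
            class_penalties.append(torch.stack(env_penalties).mean())
    if not class_penalties:
        return logits.new_zeros(())
    return torch.stack(class_penalties).mean()
```

In training, this penalty would be added to the standard cross-entropy loss with a weighting coefficient, e.g. `loss = F.cross_entropy(logits, labels) + lam * class_wise_irm_penalty(logits, labels, env_ids)`, where `lam` is a hyperparameter.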
