
Class Interference Regularization

09/04/2020
by Bharti Munjal, et al.

Contrastive losses yield state-of-the-art performance for person re-identification, face verification and few-shot learning. They have recently outperformed the cross-entropy loss on classification at the ImageNet scale and surpassed all prior self-supervision results by a large margin (SimCLR). Simple and effective regularization techniques such as label smoothing and self-distillation no longer apply, because they act on the multinomial label distributions used by cross-entropy losses, not on the tuple-based comparative terms that characterize contrastive losses. Here we propose a novel, simple and effective regularization technique, Class Interference Regularization (CIR), which applies to cross-entropy losses but is especially effective on contrastive losses. CIR perturbs the output features by randomly moving them towards the average embeddings of the negative classes. To the best of our knowledge, CIR is the first regularization technique to act on the output features. In experimental evaluation, the combination of CIR and a plain Siamese net with a triplet loss yields the best few-shot learning performance on the challenging tieredImageNet. CIR also improves the state-of-the-art technique in person re-identification on the Market-1501 dataset, based on a triplet loss, and the state-of-the-art technique in person search on the CUHK-SYSU dataset, based on a cross-entropy loss. Finally, on the classification task, CIR performs on par with the popular label smoothing, as demonstrated on CIFAR-10 and CIFAR-100.
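The abstract describes CIR as randomly moving output features towards the average embeddings of the negative classes. Below is a minimal PyTorch sketch of that idea; the hyperparameter names (alpha, p) and the use of batch-level class centroids are assumptions for illustration, not the paper's exact formulation.

import torch

def class_interference_regularization(features, labels, alpha=0.1, p=0.5):
    # Hypothetical sketch of CIR: with probability p, nudge each output
    # feature toward the mean embedding of a randomly chosen negative class.
    #   features: (B, D) output embeddings of the network
    #   labels:   (B,)   integer class labels
    #   alpha, p: assumed hyperparameters (interpolation strength, perturbation probability)
    perturbed = features.clone()
    classes = labels.unique()
    # Per-class mean embeddings, computed from the current batch (assumption).
    centroids = {int(c): features[labels == c].mean(dim=0) for c in classes}

    for i in range(features.size(0)):
        # Negative classes: every class in the batch other than the sample's own.
        negatives = [int(c) for c in classes if int(c) != int(labels[i])]
        if negatives and torch.rand(1).item() < p:
            neg = negatives[torch.randint(len(negatives), (1,)).item()]
            # Move the feature toward the negative-class centroid.
            perturbed[i] = (1 - alpha) * features[i] + alpha * centroids[neg]
    return perturbed

In use, the perturbed features would replace the original ones in the triplet or cross-entropy loss during training only; at test time the features are left untouched.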
