Asymmetric Patch Sampling for Contrastive Learning

06/05/2023
by Chengchao Shen, et al.

Asymmetric appearance between the two views of a positive pair effectively reduces the risk of representation degradation in contrastive learning. However, positive pairs constructed by existing methods still share substantial appearance similarity, which inhibits further representation improvement. In this paper, we propose a novel asymmetric patch sampling strategy for contrastive learning, which further increases the appearance asymmetry to learn better representations. Specifically, dual patch sampling strategies are applied to a given image to obtain an asymmetric positive pair. First, sparse patch sampling is conducted to obtain the first view, which reduces the spatial redundancy of the image and allows a more asymmetric view. Second, selective patch sampling is proposed to construct the other view with a large appearance discrepancy relative to the first one. Because the appearance similarity between the two views of a positive pair is negligible, the trained model is encouraged to capture similarity in semantics rather than in low-level features. Experimental results demonstrate that the proposed method significantly outperforms existing self-supervised methods on both ImageNet-1K and the CIFAR datasets, e.g., a 2.5% fine-tuning accuracy improvement on CIFAR100. Furthermore, our method achieves state-of-the-art performance on the downstream tasks of object detection and instance segmentation on COCO. Additionally, compared to other self-supervised methods, our method is more efficient in both memory and computation during training. The source code is available at https://github.com/visresearch/aps.
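The sketch below illustrates the general idea of dual patch sampling on a ViT-style patch grid: a sparse random subset of patches forms the first view, and the second view is drawn from the remaining patches so that the two views share little low-level appearance. The ratios, function names, and the disjoint-selection heuristic are illustrative assumptions, not the exact implementation from the repository above.

```python
import torch

def asymmetric_patch_sampling(num_patches: int,
                              sparse_ratio: float = 0.25,
                              selective_ratio: float = 0.75):
    """Illustrative sketch of dual patch sampling for two asymmetric views.

    Assumes the image is already split into `num_patches` patches (ViT-style).
    The ratios and the disjoint-selection rule are hypothetical choices.
    """
    perm = torch.randperm(num_patches)

    # View 1: sparse random sampling, which reduces spatial redundancy.
    n_sparse = int(num_patches * sparse_ratio)
    view1_idx = perm[:n_sparse]

    # View 2: select patches only from those not used by view 1, so the
    # two views have a large appearance discrepancy (no shared patches).
    remaining = perm[n_sparse:]
    n_select = min(int(num_patches * selective_ratio), len(remaining))
    view2_idx = remaining[torch.randperm(len(remaining))[:n_select]]

    return view1_idx, view2_idx

if __name__ == "__main__":
    # 14x14 = 196 patches for a 224x224 image with 16x16 patches.
    v1, v2 = asymmetric_patch_sampling(num_patches=196)
    overlap = set(v1.tolist()) & set(v2.tolist())
    print(len(v1), len(v2), len(overlap))  # the two views are disjoint
```

The sampled indices would then be used to gather patch embeddings for each view before feeding them to the encoder; the contrastive objective itself is unchanged by this sketch.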


