SelfReg: Self-supervised Contrastive Regularization for Domain Generalization

04/20/2021
by   Daehee Kim, et al.

In general, an experimental environment for deep learning assumes that the training and test datasets are sampled from the same distribution. In real-world situations, however, a difference in distribution between the two datasets — a domain shift — may occur, and it becomes a major factor impeding the generalization performance of the model. The research field addressing this problem is called domain generalization; it alleviates the domain shift problem by extracting domain-invariant features, either explicitly or implicitly. Recent studies have proposed contrastive-learning-based domain generalization approaches and achieved high performance. These approaches require sampling of negative data pairs, yet the performance of contrastive learning fundamentally depends on the quality and quantity of those negative pairs. To address this issue, we propose a new contrastive-learning-based regularization method for domain generalization, self-supervised contrastive regularization (SelfReg). The proposed approach uses only positive data pairs, thus resolving the various problems caused by negative pair sampling. Moreover, we propose a class-specific domain perturbation layer (CDPL), which makes it possible to effectively apply mixup augmentation even when only positive data pairs are used. The experimental results show that the techniques incorporated into SelfReg contribute to performance in a complementary manner. On the recent benchmark DomainBed, the proposed method shows performance comparable to conventional state-of-the-art alternatives. Code is available at https://github.com/dnap512/SelfReg.
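The core idea — regularizing with positive pairs only — can be illustrated with a minimal sketch. The function below is an assumption-laden simplification, not the authors' implementation: for each sample it picks another in-batch sample with the same class label (a positive pair) and penalizes the squared distance between their feature vectors, with no negative pairs involved.

```python
import numpy as np

def positive_pair_feature_loss(features, labels, rng=None):
    """Sketch of a positive-pair-only contrastive regularizer.

    For each sample, sample another example of the same class (a
    positive pair) and penalize the mean squared distance between
    their feature vectors. Samples with no same-class partner in
    the batch are skipped. No negative pairs are required.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    features = np.asarray(features, dtype=float)
    labels = np.asarray(labels)
    total, count = 0.0, 0
    for i in range(len(labels)):
        # Candidate positives: same class label, different index.
        pos = np.flatnonzero((labels == labels[i]) & (np.arange(len(labels)) != i))
        if pos.size == 0:
            continue
        j = rng.choice(pos)
        total += np.mean((features[i] - features[j]) ** 2)
        count += 1
    return total / max(count, 1)
```

In the actual method, losses of this flavor are applied at both the feature level and the logit level, and CDPL makes same-class mixup interpolations well-behaved; this sketch only conveys why negative sampling becomes unnecessary.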


