Improving Domain-Invariance in Self-Supervised Learning via Batch Styles Standardization

03/10/2023
by Marin Scalbert, et al.

The recent rise of Self-Supervised Learning (SSL) as one of the preferred strategies for learning with limited labeled data and abundant unlabeled data has led to the widespread use of these models. They are usually pretrained, fine-tuned, and evaluated on the same data distribution, i.e., within an in-distribution setting. However, they tend to perform poorly in out-of-distribution evaluation scenarios, a challenge that Unsupervised Domain Generalization (UDG) seeks to address. This paper introduces a novel method to standardize the styles of images in a batch. Batch styles standardization, relying on Fourier-based augmentations, promotes domain invariance in SSL by preventing spurious correlations from leaking into the features. Combining batch styles standardization with the well-known contrastive-based method SimCLR leads to a novel UDG method named CLaSSy (Contrastive Learning with Standardized Styles). CLaSSy offers significant advantages over prior methods: it does not rely on domain labels and scales to a large number of domains. Experimental results on various UDG datasets demonstrate the superior performance of CLaSSy compared to existing UDG methods. Finally, the versatility of the proposed batch styles standardization is demonstrated by extending the contrastive-based and non-contrastive-based SSL methods SwAV and MSN, respectively, while considering different backbone architectures (convolutional-based and transformer-based).
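The full text details the exact augmentation; as a rough, hypothetical sketch of the underlying Fourier idea, the snippet below standardizes the styles of a batch by giving every image the low-frequency amplitude spectrum of a single randomly chosen reference image while keeping each image's own phase (content). The function name, the `beta` band-width parameter, and the single-reference choice are illustrative assumptions, not the paper's implementation.

```python
import torch

def batch_styles_standardization(images: torch.Tensor, beta: float = 0.1) -> torch.Tensor:
    """Hypothetical sketch: standardize the 'style' of every image in a batch.

    The low-frequency amplitude of each image's Fourier spectrum (often
    associated with style/texture) is replaced by that of one randomly
    chosen reference image, while each image keeps its own phase.

    images: (B, C, H, W) float tensor in [0, 1].
    beta:   fraction of the spectrum (around the centre) treated as style
            (illustrative parameter, not from the paper).
    """
    B, C, H, W = images.shape
    fft = torch.fft.fft2(images, dim=(-2, -1))
    amplitude, phase = fft.abs(), fft.angle()

    # Centre the zero-frequency component so low frequencies are contiguous.
    amplitude = torch.fft.fftshift(amplitude, dim=(-2, -1))

    # Broadcast the low-frequency amplitude of one random reference image
    # to the whole batch (assumed style-sharing strategy).
    ref = amplitude[torch.randint(B, (1,))]
    h, w = int(beta * H) // 2, int(beta * W) // 2
    cy, cx = H // 2, W // 2
    amplitude[:, :, cy - h:cy + h, cx - w:cx + w] = ref[:, :, cy - h:cy + h, cx - w:cx + w]

    # Recombine the shared amplitude with each image's original phase.
    amplitude = torch.fft.ifftshift(amplitude, dim=(-2, -1))
    standardized = torch.fft.ifft2(amplitude * torch.exp(1j * phase), dim=(-2, -1)).real
    return standardized.clamp(0, 1)
```

In a SimCLR-style pipeline, such a transform would be applied per batch before the usual augmentations, so that positive and negative pairs share the same style and the contrastive loss cannot exploit style-related spurious correlations.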


