Towards Domain-Agnostic Contrastive Learning

11/09/2020
by Vikas Verma, et al.

Despite recent success, most contrastive self-supervised learning methods are domain-specific, relying heavily on data augmentation techniques that require knowledge about a particular domain, such as image cropping and rotation. To overcome this limitation, we propose a novel domain-agnostic approach to contrastive learning, named DACL, that is applicable to domains where invariances, and thus data augmentation techniques, are not readily available. Key to our approach is the use of Mixup noise to create similar and dissimilar examples by mixing data samples differently, at either the input or the hidden-state level. To demonstrate the effectiveness of DACL, we conduct experiments across various domains such as tabular data, images, and graphs. Our results show that DACL not only outperforms other domain-agnostic noising methods, such as Gaussian noise, but also combines well with domain-specific methods, such as SimCLR, to improve self-supervised visual representation learning. Finally, we theoretically analyze our method and show advantages over the Gaussian-noise-based contrastive learning approach.
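
The abstract describes the core recipe only in words. As a rough, hypothetical sketch (not the authors' implementation): two views of each sample are created by mixing it with randomly chosen other samples from the batch, using a mixing weight close to 1 so the anchor dominates, and the two views are then pulled together with a standard SimCLR-style NT-Xent contrastive loss. The function names, the mixing range, and the toy encoder below are illustrative assumptions.

```python
# Hypothetical sketch of mixup-noise contrastive learning (not the paper's code).
import torch
import torch.nn.functional as F

def mixup_views(x, lam_low=0.5, lam_high=1.0):
    """Make two views of each sample by mixing it with a random other
    sample from the batch; lam close to 1 keeps the anchor dominant."""
    n = x.size(0)
    views = []
    for _ in range(2):
        lam = torch.empty(n, 1).uniform_(lam_low, lam_high)  # per-sample mixing weight
        perm = torch.randperm(n)                             # random mixing partners
        views.append(lam * x + (1.0 - lam) * x[perm])
    return views

def nt_xent_loss(z1, z2, temperature=0.5):
    """Standard SimCLR-style NT-Xent loss: the two views of the same
    sample are positives; all other samples in the batch are negatives."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2n, d), unit norm
    sim = z @ z.t() / temperature                        # scaled cosine similarities
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool), float('-inf'))
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Toy usage on a batch of tabular features.
encoder = torch.nn.Sequential(torch.nn.Linear(32, 128), torch.nn.ReLU(),
                              torch.nn.Linear(128, 64))
x = torch.randn(8, 32)
v1, v2 = mixup_views(x)
loss = nt_xent_loss(encoder(v1), encoder(v2))
loss.backward()
```

Because the mixing weight stays above 0.5, each mixed view remains dominated by its anchor, which is what makes the pair "similar" without any domain-specific augmentation.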
