On the Memorization Properties of Contrastive Learning

07/21/2021
by Ildus Sadrtdinov, et al.

Memorization studies of deep neural networks (DNNs) help us understand what patterns DNNs learn and how they learn them, and they motivate improvements to DNN training approaches. In this work, we investigate the memorization properties of SimCLR, a widely used contrastive self-supervised learning approach, and compare them to those of supervised learning and random-labels training. We find that both training objects and augmentations may have different complexities, in the sense of how SimCLR learns them. Moreover, we show that SimCLR is similar to random-labels training in terms of the distribution of training-object complexity.
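For context on the objective under study: the sketch below is a minimal PyTorch implementation of the NT-Xent contrastive loss that SimCLR optimizes. It is an illustration under stated assumptions, not the authors' code; the function name, batch layout, and the temperature value are illustrative choices.

import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    # z1, z2: projected embeddings of two augmented views of the same
    # batch of N training objects, each of shape [N, D].
    n = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # [2N, D] unit vectors
    sim = z @ z.t() / temperature                       # scaled cosine similarities
    sim.fill_diagonal_(float('-inf'))                   # mask self-similarity
    # The positive for row i is the other augmented view of the same object.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

Each row of the similarity matrix has exactly one positive (the other view of the same object) and 2N - 2 negatives, so the loss is defined per object and per augmentation. This structure is what makes it meaningful to ask how complex individual training objects and augmentations are for SimCLR.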

