Does Double Descent Occur in Self-Supervised Learning?

07/15/2023
by Alisia Lupidi, et al.

Most investigations into double descent have focused on supervised models, while the few works that study self-supervised settings find a surprising absence of the phenomenon. These results suggest that double descent may not exist in self-supervised models. We show this empirically using standard and linear autoencoders, two previously unstudied settings. We find that the test loss either follows a classical U-shape or decreases monotonically, rather than exhibiting a double-descent curve. We hope that further work on this question will help elucidate the theoretical underpinnings of the phenomenon.

Related research

03/24/2023
Double Descent Demystified: Identifying, Interpreting, and Ablating the Sources of a Deep Learning Puzzle
Double descent is a surprising phenomenon in machine learning, in which ...

11/18/2022
Understanding the double descent curve in Machine Learning
The theory of bias-variance used to serve as a guide for model selection...

06/17/2022
Sparse Double Descent: Where Network Pruning Aggravates Overfitting
People usually believe that network pruning not only reduces the computa...

10/02/2022
What shapes the loss landscape of self-supervised learning?
Prevention of complete and dimensional collapse of representations has r...

07/26/2023
Sparse Double Descent in Vision Transformers: real or phantom threat?
Vision transformers (ViT) have been of broad interest in recent theoreti...

10/19/2020
Do Deeper Convolutional Networks Perform Better?
Over-parameterization is a recent topic of much interest in the machine ...

05/25/2023
Dropout Drops Double Descent
In this paper, we find and analyze that we can easily drop the double de...
