Understanding and Improving the Role of Projection Head in Self-Supervised Learning

12/22/2022
by Kartik Gupta, et al.

Self-supervised learning (SSL) aims to produce useful feature representations without access to human-labeled data annotations. The problem has gained popularity owing to the success of recent contrastive SSL methods such as SimCLR. Most current contrastive learning approaches append a parametrized projection head to the end of a backbone network to optimize the InfoNCE objective, and then discard the learned projection head after training. This raises a fundamental question: why is a learnable projection head required if it is discarded after training? In this work, we first perform a systematic study of the behavior of SSL training, focusing on the role of the projection head layers. By formulating the projection head as a parametric component of the InfoNCE objective rather than as a part of the network, we present an alternative optimization scheme for training contrastive-learning-based SSL frameworks. Our experimental study on multiple image classification datasets demonstrates the effectiveness of the proposed approach over alternatives in the SSL literature.

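The abstract refers to the standard contrastive setup in which a small projection head is appended to a backbone and the two are trained jointly with the InfoNCE objective. As a rough illustration of that conventional pipeline (not the paper's proposed alternative optimization scheme), the following is a minimal PyTorch-style sketch; the class and function names, layer sizes, and temperature are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProjectionHead(nn.Module):
    """Two-layer MLP projection head appended to the backbone and discarded after pretraining."""
    def __init__(self, in_dim=2048, hidden_dim=2048, out_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, h):
        return self.net(h)

def info_nce_loss(z1, z2, temperature=0.5):
    """InfoNCE (NT-Xent) loss over two batches of projected views, each of shape (N, D)."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                     # (2N, D)
    sim = z @ z.t() / temperature                      # pairwise cosine similarities
    mask = torch.eye(z.size(0), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))         # exclude self-similarity
    n = z1.size(0)
    # the positive for view i is the other augmentation of the same image
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

# Usage with features from any backbone (e.g., a ResNet with its classifier removed),
# computed on two augmented views of the same batch of images.
head = ProjectionHead(in_dim=2048)
h1, h2 = torch.randn(32, 2048), torch.randn(32, 2048)
loss = info_nce_loss(head(h1), head(h2))
```
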
Related research:

- 07/18/2023: Towards the Sparseness of Projection Head in Self-Supervised Learning ("In recent years, self-supervised learning (SSL) has emerged as a promisi...")
- 01/28/2023: Deciphering the Projection Head: Representation Evaluation Self-supervised Learning ("Self-supervised learning (SSL) aims to learn intrinsic features without ...")
- 10/22/2020: Contrastive Self-Supervised Learning for Wireless Power Control ("We propose a new approach for power control in wireless networks using s...")
- 09/21/2023: DimCL: Dimensional Contrastive Learning For Improving Self-Supervised Learning ("Self-supervised learning (SSL) has gained remarkable success, for which ...")
- 05/12/2022: The Mechanism of Prediction Head in Non-contrastive Self-supervised Learning ("Recently the surprising discovery of the Bootstrap Your Own Latent (BYOL...")
- 11/26/2022: Supervised Contrastive Prototype Learning: Augmentation Free Robust Neural Network ("Transformations in the input space of Deep Neural Networks (DNN) lead to...")
- 04/23/2021: DeepfakeUCL: Deepfake Detection via Unsupervised Contrastive Learning ("Face deepfake detection has seen impressive results recently. Nearly all...")
