Accelerating Self-Supervised Learning via Efficient Training Strategies

12/11/2022
by   Mustafa Taha Koçyiğit, et al.

Recently, the focus of the computer vision community has shifted from expensive supervised learning toward self-supervised learning of visual representations. While the performance gap between supervised and self-supervised learning has been narrowing, training self-supervised deep networks still takes an order of magnitude longer than their supervised counterparts, which hinders progress, imposes a carbon cost, and limits societal benefits to institutions with substantial resources. Motivated by these issues, this paper investigates reducing the training time of recent self-supervised methods through several model-agnostic strategies that have not previously been applied to this problem. In particular, we study three strategies: an extendable cyclic learning rate schedule, a matching progressive schedule for augmentation magnitude and image resolution, and a hard positive mining strategy based on augmentation difficulty. We show that combining all three methods yields up to a 2.7-fold speed-up in the training time of several self-supervised methods while retaining performance comparable to the standard self-supervised learning setting.
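To make the first two strategies concrete, here is a minimal sketch of what an extendable cyclic learning rate and a progressive image-resolution schedule could look like. This is an illustrative assumption, not the paper's actual implementation: the function names, cycle shape (cosine with restarts), and hyper-parameter values (`base_lr`, start/end resolutions) are all hypothetical.

```python
import math

def cyclic_lr(step, cycle_len, base_lr=0.3, min_lr=0.0):
    """Cosine learning-rate cycle that restarts every `cycle_len` steps.

    Because each cycle is self-contained, training can be extended by
    simply appending more cycles, without re-planning the whole schedule.
    (Illustrative sketch; the paper's exact schedule may differ.)
    """
    t = step % cycle_len  # position within the current cycle
    cosine = 0.5 * (1.0 + math.cos(math.pi * t / cycle_len))
    return min_lr + (base_lr - min_lr) * cosine

def progressive_resolution(step, total_steps, start=96, end=224):
    """Linearly ramp the training image resolution from `start` to `end`.

    Early steps use small, cheap images; later steps use full resolution.
    A matching ramp would be applied to augmentation magnitude.
    """
    frac = min(step / total_steps, 1.0)
    return int(round(start + frac * (end - start)))
```

For example, `cyclic_lr(0, 100)` returns the peak rate `0.3`, `cyclic_lr(50, 100)` returns the midpoint `0.15`, and the rate resets to the peak at step 100, so a run can be stopped at any cycle boundary or extended with further cycles.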


