Evolutionary Augmentation Policy Optimization for Self-supervised Learning

03/02/2023
by Noah Barrett, et al.

Self-supervised learning (SSL) is a machine learning paradigm for pretraining deep neural networks (DNNs) without requiring manually labeled data. The central idea of this learning technique is an auxiliary stage, known as a pretext task, in which labeled data are created automatically through data augmentation and exploited for pretraining the DNN. However, the effect of each pretext task is not well studied or compared in the literature. In this paper, we study the contribution of augmentation operators to the performance of self-supervised learning algorithms in constrained settings. We propose an evolutionary search method for optimizing the data augmentation pipeline of pretext tasks and measure the impact of augmentation operators in several state-of-the-art (SOTA) SSL algorithms. By encoding different combinations of augmentation operators as chromosomes, we seek optimal augmentation policies through an evolutionary optimization mechanism. We further introduce methods for analyzing and explaining the performance of the optimized SSL algorithms. Our results indicate that the proposed method can find solutions that outperform the classification accuracy of baseline SSL algorithms, which confirms the influence of augmentation policy choice on the overall performance of SSL algorithms. We also compare the optimal SSL solutions found by our evolutionary search mechanism and show the effect of pretext-task batch size on two visual datasets.
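The evolutionary search described in the abstract can be illustrated with a minimal sketch. The genetic algorithm below encodes an augmentation policy as a fixed-length chromosome of operator indices and evolves a population via truncation selection, single-point crossover, and per-gene mutation. The operator pool, policy length, hyperparameters, and especially the fitness function are illustrative assumptions, not the paper's actual setup; in the real method, fitness would be the downstream classification accuracy of a model pretrained with the candidate policy.

```python
import random

# Illustrative pool of augmentation operators (assumed names, not the
# paper's exact operator set).
OPERATORS = [
    "random_crop", "horizontal_flip", "color_jitter", "grayscale",
    "gaussian_blur", "rotation", "solarize", "cutout",
]

POLICY_LEN = 4       # operators applied in sequence per policy (assumption)
POP_SIZE = 20
GENERATIONS = 10
MUTATION_RATE = 0.2


def random_chromosome():
    """A chromosome encodes an ordered augmentation policy as operator indices."""
    return [random.randrange(len(OPERATORS)) for _ in range(POLICY_LEN)]


def fitness(chromosome):
    """Stand-in objective. In the actual method this would pretrain an SSL
    model with the decoded policy and return the downstream classification
    accuracy; here a toy diversity score keeps the sketch runnable."""
    return len(set(chromosome)) / POLICY_LEN


def crossover(parent_a, parent_b):
    """Single-point crossover between two parent policies."""
    point = random.randrange(1, POLICY_LEN)
    return parent_a[:point] + parent_b[point:]


def mutate(chromosome):
    """Resample each gene with probability MUTATION_RATE."""
    return [random.randrange(len(OPERATORS)) if random.random() < MUTATION_RATE
            else gene for gene in chromosome]


def evolve():
    """Truncation-selection genetic algorithm over augmentation policies."""
    population = [random_chromosome() for _ in range(POP_SIZE)]
    for generation in range(GENERATIONS):
        ranked = sorted(population, key=fitness, reverse=True)
        elite = ranked[: POP_SIZE // 2]
        children = [mutate(crossover(*random.sample(elite, 2)))
                    for _ in range(POP_SIZE - len(elite))]
        population = elite + children
        best = [OPERATORS[g] for g in ranked[0]]
        print(f"generation {generation}: best policy so far = {best}")
    return max(population, key=fitness)


best_policy = evolve()
print("final policy:", [OPERATORS[g] for g in best_policy])
```

In a realistic setting the fitness evaluation dominates the cost, since each candidate policy requires an SSL pretraining run; proxies such as reduced-epoch training are a common way to keep such a search tractable.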


