A Survey on Self-supervised Pre-training for Sequential Transfer Learning in Neural Networks

07/01/2020
by   Huanru Henry Mao, et al.

Deep neural networks are typically trained under a supervised learning framework in which a model learns a single task using labeled data. Instead of relying solely on labeled data, practitioners can harness unlabeled or related data, which are often more accessible and ubiquitous, to improve model performance. Self-supervised pre-training for transfer learning is becoming an increasingly popular technique for improving state-of-the-art results using unlabeled data. It involves first pre-training a model on a large amount of unlabeled data and then adapting the model to target tasks of interest. In this review, we survey self-supervised learning methods and their applications within the sequential transfer learning framework. We provide an overview of the taxonomy of self-supervised learning and transfer learning, and highlight prominent methods for designing pre-training tasks across different domains. Finally, we discuss recent trends and suggest areas for future investigation.
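To make the two-stage workflow described in the abstract concrete, below is a minimal PyTorch sketch of sequential transfer learning: self-supervised pre-training on unlabeled sequences, followed by adaptation on a small labeled task. It is not taken from the paper; the masked-token pretext task, the toy Transformer encoder, and all names and sizes (VOCAB, DIM, MASK_ID, etc.) are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical toy dimensions; any encoder architecture could stand in here.
VOCAB, DIM, N_CLASSES, MASK_ID = 1000, 64, 2, 0

# A shared encoder whose weights carry over between the two stages.
embed = nn.Embedding(VOCAB, DIM)
layer = nn.TransformerEncoderLayer(d_model=DIM, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)

def encode(tokens):
    return encoder(embed(tokens))

# Stage 1: self-supervised pre-training on unlabeled data.
# Masked-token prediction is used here as one of many possible pretext tasks.
lm_head = nn.Linear(DIM, VOCAB)
opt = torch.optim.Adam(
    list(embed.parameters()) + list(encoder.parameters()) + list(lm_head.parameters()),
    lr=1e-4,
)
unlabeled = torch.randint(1, VOCAB, (8, 16))   # stand-in for a large unlabeled corpus
mask = torch.rand(unlabeled.shape) < 0.15      # corrupt ~15% of positions
corrupted = unlabeled.masked_fill(mask, MASK_ID)
logits = lm_head(encode(corrupted))
loss = nn.functional.cross_entropy(logits[mask], unlabeled[mask])
loss.backward()
opt.step()
opt.zero_grad()

# Stage 2: adaptation (fine-tuning) on a small labeled target task,
# reusing the pre-trained encoder and attaching a fresh task head.
clf_head = nn.Linear(DIM, N_CLASSES)
ft_opt = torch.optim.Adam(
    list(embed.parameters()) + list(encoder.parameters()) + list(clf_head.parameters()),
    lr=1e-5,
)
labeled_x = torch.randint(1, VOCAB, (4, 16))   # stand-in for scarce labeled data
labeled_y = torch.randint(0, N_CLASSES, (4,))
pooled = encode(labeled_x).mean(dim=1)         # simple mean pooling over tokens
ft_loss = nn.functional.cross_entropy(clf_head(pooled), labeled_y)
ft_loss.backward()
ft_opt.step()
ft_opt.zero_grad()
```

The point the sketch illustrates is weight reuse: stage 2 starts from the encoder parameters learned in stage 1 rather than from a random initialization, which is what lets the abundant unlabeled data benefit the labeled target task.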


