Leveraging Random Label Memorization for Unsupervised Pre-Training

We present a novel approach to leveraging large unlabeled datasets by pre-training state-of-the-art deep neural networks on randomly-labeled data. Specifically, we train the networks to memorize arbitrary labels for all the samples in a dataset, then use these pre-trained networks as a starting point for regular supervised learning. Our assumption is that the "memorization infrastructure" the network builds during random-label training is beneficial for conventional supervised learning as well. We test the effectiveness of this pre-training on several video action recognition datasets (HMDB51, UCF101, Kinetics) by comparing the results of the same network with and without random-label pre-training. Our approach yields improvements of at least 1.5% in classification accuracy, which calls for further research in this direction.
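To make the two-stage procedure concrete, below is a minimal sketch of random-label pre-training followed by supervised fine-tuning. It assumes a toy feed-forward model on synthetic features rather than the video action recognition networks used in the paper; the model, hyperparameters, and the train() helper are illustrative assumptions, not the authors' implementation.

# Minimal sketch of random-label pre-training, then supervised fine-tuning.
# Model, data, and hyperparameters are illustrative assumptions only.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

NUM_CLASSES = 51  # e.g. an HMDB51-sized label space (assumption)

# Toy stand-in data: 256 samples of 128-dimensional features.
features = torch.randn(256, 128)
true_labels = torch.randint(0, NUM_CLASSES, (256,))

# Stage 1 labels: one fixed, arbitrary label per sample, drawn once,
# so the network can memorize the sample-to-label assignment.
random_labels = torch.randint(0, NUM_CLASSES, (256,))

model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, NUM_CLASSES))
criterion = nn.CrossEntropyLoss()

def train(labels, epochs, lr):
    # Plain SGD training loop over (feature, label) pairs.
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loader = DataLoader(TensorDataset(features, labels), batch_size=32, shuffle=True)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            criterion(model(x), y).backward()
            opt.step()

# Stage 1: pre-train by memorizing the arbitrary labels.
train(random_labels, epochs=20, lr=0.1)

# Stage 2: reuse the memorization-trained weights as the starting point
# for conventional supervised learning on the real labels.
train(true_labels, epochs=20, lr=0.01)

The key design point is that the random labels are sampled once and held fixed across epochs, so the network must genuinely memorize them; the resulting weights then serve as initialization for the supervised phase.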
