A Deep Dive into Deep Cluster

07/24/2022
by Ahmad Mustapha, et al.

Deep learning has demonstrated significant improvements over traditional machine learning approaches in domains such as image and speech recognition. Practitioners transfer this success on benchmark datasets to real-world applications through pretrained models. However, pretraining visual models with supervised learning requires a large amount of expensive data annotation. To overcome this limitation, DeepCluster, a simple and scalable method for unsupervised pretraining of visual representations, has been proposed. Yet the inner workings of the algorithm are not well understood. In this paper, we analyze DeepCluster's internals and exhaustively evaluate the impact of various hyperparameters over a wide range of values on three different datasets. Accordingly, we propose an explanation of why the algorithm works in practice. We also show that DeepCluster's convergence and performance depend strongly on the interplay between the quality of the randomly initialized convolutional filters and the selected number of clusters. Furthermore, we demonstrate that continuous clustering is not critical for convergence: stopping the clustering phase early reduces training time and allows the algorithm to scale to large datasets. Finally, we derive plausible hyperparameter selection criteria in a semi-supervised setting.
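To make the abstract concrete, below is a minimal, self-contained sketch of the DeepCluster loop: alternate between (1) clustering the current features into k pseudo-classes with k-means and (2) training the network to predict those pseudo-labels. It uses PyTorch and scikit-learn; the toy backbone, the random data, and the names k and cluster_epochs are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)

# Toy unlabeled data: 256 RGB "images" of size 32x32 (stand-in for a real dataset).
images = torch.randn(256, 3, 32, 32)
loader = DataLoader(TensorDataset(images), batch_size=32, shuffle=False)

backbone = nn.Sequential(  # tiny stand-in for the convnet used in the paper
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
feat_dim, k = 32, 10  # k = number of clusters, a key hyperparameter in the paper
head = nn.Linear(feat_dim, k)
opt = torch.optim.SGD(list(backbone.parameters()) + list(head.parameters()), lr=0.05)

cluster_epochs = 3  # stop re-clustering early after this many epochs (assumption)
pseudo_labels = None
for epoch in range(10):
    # Phase 1: (re)cluster features into k pseudo-classes. Once clustering is
    # stopped early, the last assignments are simply reused.
    if epoch < cluster_epochs or pseudo_labels is None:
        backbone.eval()
        with torch.no_grad():
            feats = torch.cat([backbone(x) for (x,) in loader])
        pseudo_labels = torch.as_tensor(
            KMeans(n_clusters=k, n_init=10).fit_predict(feats.numpy())).long()
        head.reset_parameters()  # new assignments -> reinitialize the classifier
    # Phase 2: train backbone + head to predict the pseudo-labels.
    backbone.train()
    for i, (x,) in enumerate(loader):  # shuffle=False keeps label slices aligned
        y = pseudo_labels[i * 32:(i + 1) * 32]
        loss = nn.functional.cross_entropy(head(backbone(x)), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

Setting cluster_epochs below the total number of epochs corresponds to the early-stopping variant discussed above: the expensive feature-extraction and clustering passes stop, and training continues on the last pseudo-labels.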

