ContraCluster: Learning to Classify without Labels by Contrastive Self-Supervision and Prototype-Based Semi-Supervision

04/19/2023
by   Seongho Joe, et al.

The recent advances in representation learning inspire us to take on the challenging problem of unsupervised image classification in a principled way. We propose ContraCluster, an unsupervised image classification method that combines clustering with the power of contrastive self-supervised learning. ContraCluster consists of three stages: (1) contrastive self-supervised pre-training (CPT), (2) contrastive prototype sampling (CPS), and (3) prototype-based semi-supervised fine-tuning (PB-SFT). CPS selects highly accurate, categorically prototypical images in an embedding space learned by contrastive learning. We use the sampled prototypes as noisy labeled data for semi-supervised fine-tuning (PB-SFT), leveraging a small set of prototypes together with large amounts of unlabeled data to further enhance accuracy. We demonstrate empirically that ContraCluster achieves new state-of-the-art results on standard benchmark datasets including CIFAR-10, STL-10, and ImageNet-10. For example, ContraCluster achieves about 90.8% accuracy on CIFAR-10, which outperforms DAC (52.2%) by a large margin. Without any labels, ContraCluster achieves a 90.8% accuracy that is comparable to the 95.8% of a supervised counterpart.
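To make the pipeline concrete, here is a minimal sketch of the prototype-sampling idea behind CPS: cluster the L2-normalized contrastive embeddings, then keep the few points nearest each cluster centroid as pseudo-labeled "prototypes" for the fine-tuning stage. The function name, the farthest-point initialization, and the nearest-to-centroid selection rule are illustrative assumptions; the paper's exact sampling criterion may differ.

```python
import numpy as np

def sample_prototypes(embeddings, n_clusters, n_prototypes, n_iters=50, seed=0):
    """Illustrative contrastive prototype sampling (CPS) sketch:
    k-means over L2-normalized embeddings, then keep the n_prototypes
    points closest to each centroid as pseudo-labeled prototypes.
    NOTE: a simplified stand-in, not the paper's exact procedure."""
    rng = np.random.default_rng(seed)
    # Normalize so Euclidean distance tracks cosine similarity,
    # matching the geometry induced by contrastive pre-training.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)

    # Farthest-point initialization: spread initial centroids apart.
    centroids = [z[rng.integers(len(z))]]
    for _ in range(1, n_clusters):
        d = np.min(np.linalg.norm(z[:, None] - np.array(centroids)[None],
                                  axis=2), axis=1)
        centroids.append(z[d.argmax()])
    centroids = np.array(centroids)

    # Plain Lloyd iterations.
    for _ in range(n_iters):
        dists = np.linalg.norm(z[:, None] - centroids[None], axis=2)
        assign = dists.argmin(axis=1)
        for k in range(n_clusters):
            members = z[assign == k]
            if len(members):
                centroids[k] = members.mean(axis=0)

    # Keep the points nearest each centroid as that cluster's prototypes.
    dists = np.linalg.norm(z[:, None] - centroids[None], axis=2)
    assign = dists.argmin(axis=1)
    proto_idx, pseudo_labels = [], []
    for k in range(n_clusters):
        idx = np.where(assign == k)[0]
        nearest = idx[np.argsort(dists[idx, k])[:n_prototypes]]
        proto_idx.extend(nearest.tolist())
        pseudo_labels.extend([k] * len(nearest))
    return np.array(proto_idx), np.array(pseudo_labels)
```

The returned (index, pseudo-label) pairs would then serve as the small "noisy labeled" set for the PB-SFT stage, with the rest of the dataset treated as unlabeled.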

