Domain-Agnostic Clustering with Self-Distillation

11/23/2021
by Mohammed Adnan, et al.

Recent advancements in self-supervised learning have reduced the gap between supervised and unsupervised representation learning. However, most self-supervised and deep clustering techniques rely heavily on data augmentation, rendering them ineffective for learning tasks where too little domain knowledge exists to design suitable augmentations. We propose a new self-distillation-based algorithm for domain-agnostic clustering. Our method builds upon existing deep clustering frameworks and requires no separate student model. The proposed method outperforms existing domain-agnostic (augmentation-free) algorithms on CIFAR-10. We empirically demonstrate that knowledge distillation can improve unsupervised representation learning by extracting richer 'dark knowledge' from the model than predicted labels alone provide. Preliminary experiments also suggest that self-distillation improves the convergence of DeepCluster-v2.
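The 'dark knowledge' the abstract refers to is the information carried by a model's full output distribution rather than only its argmax label. A minimal sketch of the standard knowledge-distillation objective illustrates this: temperature-scaled softmax exposes the relative magnitudes of the non-winning logits, and a KL divergence aligns the student's softened distribution with the teacher's. The function names and the choice of temperature below are illustrative assumptions, not the paper's exact loss.

```python
import numpy as np

def softened_probs(logits, temperature=4.0):
    """Temperature-scaled softmax. Higher temperature flattens the
    distribution, exposing 'dark knowledge' in the non-argmax logits."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    the classic knowledge-distillation objective (Hinton et al.)."""
    p = softened_probs(teacher_logits, temperature)
    q = softened_probs(student_logits, temperature)
    # small epsilon guards against log(0)
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))
```

In a *self*-distillation setting such as the one proposed here, teacher and student share the same network (no separate student model), e.g. the teacher logits come from an earlier snapshot or a momentum-averaged copy of the weights.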


Related research

04/13/2023 · Multi-Mode Online Knowledge Distillation for Self-Supervised Visual Representation Learning
Self-supervised learning (SSL) has made remarkable progress in visual re...

07/06/2023 · On-Device Constrained Self-Supervised Speech Representation Learning for Keyword Spotting via Knowledge Distillation
Large self-supervised models are effective feature extractors, but their...

05/22/2023 · EnSiam: Self-Supervised Learning With Ensemble Representations
Recently, contrastive self-supervised learning, where the proximity of r...

11/09/2020 · Towards Domain-Agnostic Contrastive Learning
Despite recent success, most contrastive self-supervised learning method...

08/20/2022 · Looking For A Match: Self-supervised Clustering For Automatic Doubt Matching In e-learning Platforms
Recently, e-learning platforms have grown as a place where students can ...

05/17/2023 · DinoSR: Self-Distillation and Online Clustering for Self-supervised Speech Representation Learning
In this paper, we introduce self-distillation and online clustering for ...

02/26/2020 · A Comprehensive Approach to Unsupervised Embedding Learning based on AND Algorithm
Unsupervised embedding learning aims to extract good representation from...
