Exploring Non-Contrastive Representation Learning for Deep Clustering

11/23/2021
by Zhizhong Huang, et al.

Existing deep clustering methods rely on contrastive learning for representation learning, which requires negative examples to form an embedding space in which all instances are well-separated. However, the negative examples inevitably give rise to the class collision issue, compromising the representation learning for clustering. In this paper, we explore non-contrastive representation learning for deep clustering, termed NCC, which is based on BYOL, a representative method without negative examples. First, we propose to align one augmented view of an instance with the neighbors of another view in the embedding space, called the positive sampling strategy, which avoids the class collision issue caused by negative examples and hence improves within-cluster compactness. Second, we propose to encourage alignment between two augmented views of one prototype and uniformity among all prototypes, named the prototypical contrastive loss (ProtoCL), which maximizes the inter-cluster distance. Moreover, we formulate NCC in an Expectation-Maximization (EM) framework, in which the E-step utilizes spherical k-means to estimate the pseudo-labels of instances and the distribution of prototypes from a target network, and the M-step leverages the proposed losses to optimize an online network. As a result, NCC forms an embedding space where all clusters are well-separated and within-cluster examples are compact. Experimental results on several clustering benchmark datasets, including ImageNet-1K, demonstrate that NCC outperforms the state-of-the-art methods by a significant margin.
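The abstract describes three mechanisms: neighbor-based positive sampling, the prototypical contrastive loss, and a spherical k-means E-step. The snippet below is a minimal PyTorch sketch of how such components could look; all function names, signatures, and hyper-parameters (e.g. `k`, `tau`, the `target_bank` memory) are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of the NCC components described in the abstract.
# Every name and hyper-parameter here is an illustrative assumption,
# not the authors' implementation.
import torch
import torch.nn.functional as F

def positive_sampling_loss(online_v1, target_v2, target_bank, k=5):
    """Align the online embedding of view 1 with the k nearest
    neighbors (in the target space) of the other view, so no
    negative examples are needed (BYOL-style alignment)."""
    z1 = F.normalize(online_v1, dim=-1)             # (B, D)
    z2 = F.normalize(target_v2, dim=-1)             # (B, D)
    bank = F.normalize(target_bank, dim=-1)         # (N, D) memory of target embeddings
    idx = (z2 @ bank.t()).topk(k, dim=-1).indices   # neighbors of view 2: (B, k)
    neighbors = bank[idx]                           # (B, k, D)
    cos = (z1.unsqueeze(1) * neighbors).sum(-1)     # cosine similarities: (B, k)
    return (2.0 - 2.0 * cos).mean()                 # BYOL-style alignment loss

def protocl_loss(proto_v1, proto_v2, tau=0.5):
    """Prototypical contrastive loss: align the two augmented views of
    each prototype (diagonal) and enforce uniformity among prototypes
    (off-diagonal), maximizing inter-cluster distance."""
    p1 = F.normalize(proto_v1, dim=-1)              # (K, D)
    p2 = F.normalize(proto_v2, dim=-1)              # (K, D)
    logits = p1 @ p2.t() / tau                      # (K, K)
    labels = torch.arange(p1.size(0), device=p1.device)
    return F.cross_entropy(logits, labels)

def spherical_kmeans_assign(features, prototypes):
    """E-step sketch: pseudo-label each L2-normalized feature with its
    nearest prototype under cosine similarity."""
    f = F.normalize(features, dim=-1)               # (N, D)
    p = F.normalize(prototypes, dim=-1)             # (K, D)
    return (f @ p.t()).argmax(dim=-1)               # (N,)
```

In the M-step, gradients from both losses would update the online network, while the target network is typically maintained as an exponential moving average of the online one, as in BYOL.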

research · 09/30/2021
Deep Embedded K-Means Clustering
Recently, deep clustering methods have gained momentum because of the hi...

research · 12/30/2021
Contrastive Fine-grained Class Clustering via Generative Adversarial Networks
Unsupervised fine-grained class clustering is practical yet challenging ...

research · 05/11/2020
Prototypical Contrastive Learning of Unsupervised Representations
This paper presents Prototypical Contrastive Learning (PCL), an unsuperv...

research · 06/28/2023
Semantic Positive Pairs for Enhancing Contrastive Instance Discrimination
Self-supervised learning algorithms based on instance discrimination eff...

research · 03/03/2021
Deep Clustering by Semantic Contrastive Learning
Whilst contrastive learning has achieved remarkable success in self-supe...

research · 07/13/2021
Speech Representation Learning Combining Conformer CPC with Deep Cluster for the ZeroSpeech Challenge 2021
We present a system for the Zero Resource Speech Challenge 2021, which c...

research · 07/18/2020
MIX'EM: Unsupervised Image Classification using a Mixture of Embeddings
We present MIX'EM, a novel solution for unsupervised image classificatio...
