Learning Self-Expression Metrics for Scalable and Inductive Subspace Clustering

09/27/2020
by Julian Busch, et al.

Subspace clustering has established itself as a state-of-the-art approach to clustering high-dimensional data. In particular, methods relying on the self-expressiveness property have recently proved especially successful. However, they suffer from two major shortcomings: First, a quadratic-size coefficient matrix is learned directly, preventing these methods from scaling beyond small datasets. Second, the trained models are transductive and thus cannot be used to cluster out-of-sample data unseen during training. Instead of learning self-expression coefficients directly, we propose a novel metric learning approach that learns a subspace affinity function using a siamese neural network architecture. Consequently, our model has a constant number of parameters and a constant-size memory footprint, allowing it to scale to considerably larger datasets. In addition, we formally show that our model is still able to exactly recover subspace clusters under an independence assumption. The siamese architecture, combined with a novel geometric classifier, further makes our model inductive, allowing it to cluster out-of-sample data. Non-linear clusters can be detected by simply adding an auto-encoder module to the architecture. The whole model can then be trained end-to-end in a self-supervised manner. This work in progress reports promising preliminary results on the MNIST dataset. In the spirit of reproducible research, we make all code publicly available. In future work, we plan to investigate several extensions of our model and to expand the experimental evaluation.
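The core idea, scoring pairwise subspace affinities with a shared-weight (siamese) encoder rather than learning a quadratic-size coefficient matrix, can be illustrated with a minimal sketch. The PyTorch-style code below is an assumption: the layer widths, the normalized-dot-product affinity head, and the name SiameseAffinity are hypothetical and not taken from the paper. It is only meant to show why the parameter count stays constant in the dataset size and why unseen samples can be scored at test time.

```python
# Minimal sketch of a siamese subspace-affinity model.
# All architectural details here (layer sizes, affinity head) are
# illustrative assumptions, not the authors' exact design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseAffinity(nn.Module):
    """Embeds two samples with shared weights and predicts a subspace affinity."""
    def __init__(self, in_dim=784, emb_dim=32):
        super().__init__()
        # Shared encoder: the parameter count depends only on the layer
        # sizes, not on the number of training samples.
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, emb_dim),
        )

    def forward(self, x_i, x_j):
        z_i = F.normalize(self.encoder(x_i), dim=-1)
        z_j = F.normalize(self.encoder(x_j), dim=-1)
        # Affinity in [0, 1]; larger values indicate the pair is more
        # likely to lie in the same subspace.
        return (z_i * z_j).sum(dim=-1).abs()

# Usage: affinities for a mini-batch of sample pairs. Out-of-sample points
# go through the same forward pass, which is what makes the model inductive.
model = SiameseAffinity()
x_i, x_j = torch.randn(8, 784), torch.randn(8, 784)
aff = model(x_i, x_j)  # shape: (8,)
```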


Related research

- 06/20/2023, Deep Double Self-Expressive Subspace Clustering: Deep subspace clustering based on auto-encoder has received wide attenti...
- 08/26/2019, Deep Closed-Form Subspace Clustering: We propose Deep Closed-Form Subspace Clustering (DCFSC), a new embarrass...
- 07/11/2018, Learning Neural Models for End-to-End Clustering: We propose a novel end-to-end neural network architecture that, once tra...
- 12/06/2020, Maximum Entropy Subspace Clustering Network: Deep subspace clustering network (DSC-Net) and its numerous variants hav...
- 10/08/2021, Learning a Self-Expressive Network for Subspace Clustering: State-of-the-art subspace clustering methods are based on self-expressiv...
- 06/10/2022, Self-Supervised Deep Subspace Clustering with Entropy-norm: Auto-Encoder based deep subspace clustering (DSC) is widely used in comp...
