RECAL: Reuse of Established CNN Classifier Apropos Unsupervised Learning Paradigm

06/15/2019
by Jayasree Saha, et al.

Recently, clustering with deep network frameworks has attracted the attention of several researchers in the computer vision community. Deep frameworks gain extensive attention due to their efficiency and scalability on large-scale, high-dimensional data. In this paper, we transform a supervised CNN classifier architecture into an unsupervised clustering model, called RECAL, which jointly learns a discriminative embedding subspace and cluster labels. RECAL is composed of feature extraction layers, which are convolutional, followed by unsupervised classifier layers, which are fully connected. A multinomial logistic regression function (softmax) is stacked on top of the classifier layers. We train this network using a stochastic gradient descent (SGD) optimizer. However, the successful implementation of our model revolves around the design of the loss function. Our loss function uses the heuristic that a true partitioning entails lower entropy, provided the class distribution is not heavily skewed. This is a trade-off between the "skewed distribution" and "low entropy" situations. To handle it, we propose classification entropy and class entropy as the two components of our loss function. In this approach, the size of the mini-batch should be kept high. Experimental results indicate the consistent and competitive behavior of our model for clustering well-known digit, multi-view object and face datasets. Moreover, we use this model to generate unsupervised patch segmentation for multi-spectral LISS-IV images, and we observe that it is able to distinguish built-up area, wetland, vegetation and waterbody in the underlying scene.
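The abstract does not give the exact form of the loss, but a common way to realize this trade-off is to minimize the per-sample (classification) entropy while maximizing the entropy of the mini-batch-averaged class distribution. The sketch below is a minimal PyTorch illustration of that idea, assuming hypothetical names (`recal_style_loss`, `logits`, `eps`) not taken from the paper; the actual RECAL loss may weight or combine the two terms differently. It also shows why a large mini-batch helps: the marginal class distribution is estimated from the batch, so the estimate is only reliable when the batch is large.

```python
import torch
import torch.nn.functional as F

def recal_style_loss(logits, eps=1e-8):
    """Hypothetical sketch of an entropy-based clustering loss.

    logits: (batch_size, num_clusters) output of the unsupervised
    classifier layers before softmax. Combines:
      * classification entropy -- mean per-sample entropy of the softmax
        output (low when each sample is assigned confidently);
      * class entropy -- entropy of the mini-batch-averaged class
        distribution (high when clusters are used evenly, i.e. the
        class distribution is not heavily skewed).
    """
    probs = F.softmax(logits, dim=1)  # (B, K) soft cluster assignments

    # Classification entropy: encourage confident (low-entropy) assignments.
    classification_entropy = -(probs * torch.log(probs + eps)).sum(dim=1).mean()

    # Class entropy: entropy of the marginal class distribution over the
    # mini-batch; maximizing it discourages a skewed, degenerate partition.
    marginal = probs.mean(dim=0)  # (K,)
    class_entropy = -(marginal * torch.log(marginal + eps)).sum()

    # Trade-off: minimize classification entropy, maximize class entropy.
    return classification_entropy - class_entropy
```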


research
06/06/2017

Deep Convolutional Decision Jungle for Image Classification

We propose a novel method called deep convolutional decision jungle (CDJ...
research
11/23/2022

Learning Compact Features via In-Training Representation Alignment

Deep neural networks (DNNs) for supervised learning can be viewed as a p...
research
02/23/2020

Improve SGD Training via Aligning Mini-batches

Deep neural networks (DNNs) for supervised learning can be viewed as a p...
research
10/20/2019

Differentiable Deep Clustering with Cluster Size Constraints

Clustering is a fundamental unsupervised learning approach. Many cluster...
research
06/15/2018

Supervised Fuzzy Partitioning

Centroid-based methods including k-means and fuzzy c-means (FCM) are kno...
research
04/15/2020

Towards a theory of machine learning

We define a neural network as a septuple consisting of (1) a state vecto...
