k-meansNet: When k-means Meets Differentiable Programming

08/22/2018
by   Xi Peng, et al.

In this paper, we study how clustering can benefit from differentiable programming, whose basic idea is to treat the neural network as a language rather than as a machine learning method. To this end, we recast vanilla k-means as a novel feedforward neural network in an elegant way. Our contribution is two-fold. On the one hand, the proposed k-meansNet is a neural-network implementation of vanilla k-means that enjoys four highly desirable properties: robustness to initialization, fast inference, the ability to handle incoming data, and provable convergence. On the other hand, this work may offer novel insights into differentiable programming. More specifically, most existing differentiable programming works unroll an optimizer as a recurrent neural network; that is, the neural network is employed to solve an existing optimization problem. In contrast, we reformulate the objective function of k-means as a feedforward neural network; that is, we employ the neural network to describe a problem. In this way, we push the boundary of differentiable programming from treating the neural network as an alternative optimizer to treating it as the problem formulation itself. Extensive experiments show that our method achieves promising performance compared with 12 clustering methods on several challenging datasets.
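To make the idea concrete, here is a minimal sketch (not the authors' exact k-meansNet) of how a k-means objective can be expressed as a differentiable feedforward pass: the cluster centers play the role of the layer's weights, a softmax over negative squared distances produces soft assignments, and the weighted distortion is minimized by gradient descent instead of Lloyd's alternating updates. The function names, the temperature `beta`, and the learning rate are illustrative assumptions.

```python
import numpy as np

def forward(X, C, beta=10.0):
    # Forward pass of the "network": squared Euclidean distances between
    # each point and each center, ||x||^2 - 2 x.c + ||c||^2.
    d2 = (X ** 2).sum(1, keepdims=True) - 2 * X @ C.T + (C ** 2).sum(1)
    # Softmax over -beta * d2 yields soft cluster assignments; as beta
    # grows, this approaches the hard argmin assignment of vanilla k-means.
    logits = -beta * d2
    logits -= logits.max(1, keepdims=True)   # numerical stability
    A = np.exp(logits)
    A /= A.sum(1, keepdims=True)
    return A, d2

def loss_and_grad(X, C, beta=10.0):
    A, d2 = forward(X, C, beta)
    loss = (A * d2).sum() / len(X)           # soft-assignment distortion
    # Gradient w.r.t. the centers, holding the soft assignments fixed
    # (an EM-style approximation of the full gradient):
    grad = 2 * (A.sum(0)[:, None] * C - A.T @ X) / len(X)
    return loss, grad

rng = np.random.default_rng(0)
# Toy data: two well-separated Gaussian blobs around (0,0) and (4,4).
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(4, 0.3, (50, 2))])
C = rng.normal(2, 1.0, (2, 2))               # random initial centers

loss0, _ = loss_and_grad(X, C)
for _ in range(200):
    loss, grad = loss_and_grad(X, C)
    C -= 0.5 * grad                          # plain gradient descent step
loss_final, _ = loss_and_grad(X, C)
```

Because every step is a differentiable forward computation rather than a hard argmin, the same formulation can be dropped into any autodiff framework and trained end-to-end, and new incoming points are clustered with a single forward pass.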

