Stochastic Backward Euler: An Implicit Gradient Descent Algorithm for k-means Clustering

10/21/2017
by Penghang Yin, et al.

In this paper, we propose an implicit gradient descent algorithm for the classic k-means problem. The implicit gradient step, or backward Euler step, is solved via stochastic fixed-point iteration, in which we randomly sample a mini-batch gradient in every iteration. It is the average of the fixed-point trajectory that is carried over to the next gradient step. We draw connections between the proposed stochastic backward Euler and the recent entropy stochastic gradient descent (Entropy-SGD) for improving the training of deep neural networks. Numerical experiments on various synthetic and real datasets show that the proposed algorithm finds the global minimum (or its neighborhood) with high probability, when given the correct number of clusters. The method provides better clustering results than standard k-means algorithms in the sense that it attains a lower objective value (cluster energy) and is much more robust to initialization.
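The update described above can be sketched in a few lines of NumPy. This is a minimal illustration under assumed parameter names (`step`, `outer`, `inner`, `batch_size` are hypothetical), not the authors' reference implementation: each outer step approximately solves the implicit (backward Euler) update C = C_prev - step * grad f(C) by iterating the stochastic fixed-point map with mini-batch gradients, and the average of the fixed-point trajectory is carried over as the next set of centroids.

```python
import numpy as np

def kmeans_grad(centroids, batch):
    """Mini-batch gradient of the k-means objective
    f(C) = (1/|B|) * sum_i (1/2) * min_j ||x_i - c_j||^2
    with respect to the centroids C, over the batch B."""
    # Assign each batch point to its nearest centroid.
    d2 = ((batch[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    labels = d2.argmin(axis=1)
    grad = np.zeros_like(centroids)
    for j in range(centroids.shape[0]):
        pts = batch[labels == j]
        if len(pts):
            # d f / d c_j = (n_j / |B|) * (c_j - mean of assigned points)
            grad[j] = (centroids[j] - pts.mean(axis=0)) * len(pts) / len(batch)
    return grad

def stochastic_backward_euler(X, k, step=1.0, outer=50, inner=20,
                              batch_size=64, init=None, seed=0):
    """Sketch of stochastic backward Euler for k-means.

    Each outer step approximates the implicit update
        C_new = C - step * grad f(C_new)
    by the stochastic fixed-point iteration
        Y <- C - step * grad f_batch(Y),
    carrying over the average of the Y trajectory."""
    rng = np.random.default_rng(seed)
    C = (X[rng.choice(len(X), k, replace=False)].copy()
         if init is None else init.astype(float).copy())
    for _ in range(outer):
        Y = C.copy()
        Y_avg = np.zeros_like(C)
        for _ in range(inner):
            batch = X[rng.choice(len(X), batch_size, replace=False)]
            Y = C - step * kmeans_grad(Y, batch)  # stochastic fixed-point map
            Y_avg += Y / inner                    # running trajectory average
        C = Y_avg                                 # carry the average over
    return C
```

Note that the inner loop never forms the full gradient; only mini-batch gradients are sampled, which is what makes the implicit step tractable on large datasets.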

