The algorithm of noisy k-means

08/15/2013
by Camille Brunet, et al.

In this note, we introduce a new algorithm for finite-dimensional clustering with errors in variables. The design of this algorithm is based on recent theoretical advances in statistical learning with errors in variables (see Loustau (2013a,b)). As in the previously mentioned papers, the algorithm mixes tools from the inverse problem literature and the machine learning community. Roughly speaking, it is based on a two-step procedure: (1) a deconvolution step to deal with noisy inputs, and (2) Newton-type iterations in the spirit of the popular k-means algorithm.
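The two-step procedure can be illustrated with a minimal one-dimensional sketch. This is not the authors' implementation: it assumes Gaussian noise with known variance, uses a sinc (Fourier-cutoff) deconvolution kernel to estimate the density of the unobserved inputs on a grid, and then runs Lloyd-style k-means iterations weighted by that deconvolved density. The function names, bandwidth choice, and grid discretization are all illustrative assumptions.

```python
import numpy as np

def deconvolution_density(Y, grid, h, sigma):
    """Step (1): deconvolution kernel density estimate on `grid` for data
    Y = X + eps, eps ~ N(0, sigma^2) with sigma assumed known.
    Uses a sinc kernel, i.e. a flat Fourier transform on [-1, 1]."""
    t = np.linspace(-1.0, 1.0, 201)           # integration nodes in Fourier domain
    dt = t[1] - t[0]
    # 1 / phi_eps(t / h) = exp(sigma^2 t^2 / (2 h^2)) for Gaussian noise
    inv_phi = np.exp(0.5 * (sigma * t / h) ** 2)
    u = (grid[:, None] - Y[None, :]) / h      # pairwise (grid, sample) arguments
    # L(u) = (1 / 2pi) * integral_{-1}^{1} cos(t u) / phi_eps(t / h) dt
    L = (np.cos(u[:, :, None] * t) * inv_phi).sum(axis=2) * dt / (2 * np.pi)
    f = L.sum(axis=1) / (len(Y) * h)
    return np.clip(f, 0.0, None)              # clip negative overshoot of the estimate

def noisy_kmeans(Y, k, h, sigma, n_iter=50, grid_size=200):
    """Two-step noisy k-means sketch: deconvolve the noisy sample into a
    density on a grid, then run k-means iterations weighted by that density."""
    grid = np.linspace(Y.min() - 1.0, Y.max() + 1.0, grid_size)
    w = deconvolution_density(Y, grid, h, sigma)
    # deterministic quantile initialisation of the k centers (an assumption)
    centers = np.quantile(Y, (np.arange(k) + 0.5) / k)
    for _ in range(n_iter):
        # Step (2): assign grid points to nearest center, then update each
        # center as the density-weighted mean of its cell
        assign = np.argmin(np.abs(grid[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            mask = assign == j
            if w[mask].sum() > 0:
                centers[j] = np.average(grid[mask], weights=w[mask])
    return np.sort(centers)
```

For instance, drawing a two-component sample around -2 and 2, corrupting it with Gaussian noise, and calling `noisy_kmeans(Y, k=2, h=0.4, sigma=0.3)` recovers centers close to the two true cluster locations. Working on a density grid rather than directly on the noisy points is what lets the deconvolution step correct the clustering criterion.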

