New bounds for k-means and information k-means

01/14/2021
by   Gautier Appert, et al.

In this paper, we derive a new dimension-free non-asymptotic upper bound for the quadratic k-means excess risk related to the quantization of an i.i.d. sample in a separable Hilbert space. We improve on the bound of order 𝒪( k / √(n)) of Biau, Devroye and Lugosi, recovering the rate √(k/n) already proved by Fefferman, Mitter, and Narayanan and by Klochkov, Kroshnin and Zhivotovskiy, although with worse logarithmic factors and constants. More precisely, in the bounded case ℙ( ‖ X ‖≤ B) = 1, we bound the mean excess risk of an empirical minimizer by the explicit quantity 16 B^2 log(n/k) √(k log(k) / n). This is essentially optimal up to logarithmic factors, since a lower bound of order 𝒪( √(k^1 - 4/d/n)) is known in dimension d. Our proof technique is based on a linearization of the k-means criterion through a kernel trick and on PAC-Bayesian inequalities. To obtain a 1 / √(n) rate, we introduce a new PAC-Bayesian chaining method that replaces the concept of a δ-net with a perturbation of the parameter by an infinite-dimensional Gaussian process. Along the way, we embed the usual k-means criterion into a broader family built upon the Kullback divergence and its underlying properties. This yields a new algorithm that we name information k-means, well suited to the clustering of bags of words. Based on considerations from information theory, we also introduce a new bounded k-means criterion that uses a scale parameter but satisfies a generalization bound requiring no boundedness or even integrability conditions on the sample. We describe the counterpart of Lloyd's algorithm and prove generalization bounds for these new k-means criteria.
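The abstract mentions a Lloyd-type algorithm for the Kullback-divergence-based information k-means criterion. The following is a minimal illustrative sketch, not the paper's actual algorithm: it assumes data points are discrete distributions (e.g. normalized bag-of-words vectors), assigns each point to the center minimizing KL(p ‖ c), and updates each center to the cluster mean, which is the exact minimizer of the summed KL divergences in that direction. The function names and the precise form of the criterion are assumptions for illustration only.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler divergence KL(p || q) between discrete distributions."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def information_kmeans(X, k, n_iter=50, seed=0):
    """Lloyd-style alternating minimization for a KL-based k-means criterion.

    X: (n, d) array whose rows are discrete distributions (rows sum to 1),
       e.g. normalized bag-of-words counts.
    Returns centers (k, d) and hard assignments (n,).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialize centers at k distinct sample points.
    centers = X[rng.choice(n, size=k, replace=False)].copy()
    labels = np.zeros(n, dtype=int)
    for _ in range(n_iter):
        # Assignment step: send each point to the center with smallest KL(p || c).
        for i in range(n):
            labels[i] = int(np.argmin([kl(X[i], c) for c in centers]))
        # Update step: for sum_i KL(p_i || c), the minimizer over c is the
        # cluster average (it minimizes the cross-entropy term).
        for j in range(k):
            members = X[labels == j]
            if len(members) > 0:
                centers[j] = members.mean(axis=0)
    return centers, labels
```

The update step relies on the fact that minimizing Σᵢ KL(pᵢ ‖ c) over a distribution c reduces to minimizing the cross-entropy −Σ_w (Σᵢ pᵢ(w)) log c(w), whose minimizer under the simplex constraint is the empirical average of the pᵢ.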


