
Parallel Total Variation Distance Estimation with Neural Networks for Merging Over-Clusterings

by Christian Reiser, et al.
Universität Passau

We consider the situation where a dataset has been over-partitioned into k clusters and seek a domain-independent way to merge those initial clusters. We identify the total variation distance (TVD) as suitable for this goal. By exploiting the relation between the TVD and the Bayes accuracy, we show how neural networks can be used to estimate the TVDs between all pairs of clusters in parallel. Crucially, the required memory is reduced by decreasing the number of output neurons from k^2 to k. On realistically obtained over-clusterings of ImageNet subsets we demonstrate that our TVD estimates lead to better merge decisions than those based on state-of-the-art unsupervised representations. The generality of the approach is further verified by evaluating it on a point cloud dataset.
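The relation the abstract relies on is the classical identity TVD(P, Q) = 2·Acc_Bayes − 1 for a balanced binary classification problem between samples of P and Q. The following NumPy sketch is not the paper's neural-network estimator; it only illustrates that identity on two 1-D Gaussians, where both the Bayes rule and the true TVD are known in closed form (all means, variances, and sample sizes here are illustrative choices, not values from the paper).

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Two "clusters" modeled as unit-variance 1-D Gaussians.
mu_a, mu_b, n = 0.0, 2.0, 200_000
xa = rng.normal(mu_a, 1.0, n)
xb = rng.normal(mu_b, 1.0, n)

# Bayes-optimal rule for equal priors and equal variances:
# assign each sample to the nearer mean (threshold at the midpoint).
threshold = (mu_a + mu_b) / 2
acc = 0.5 * ((xa < threshold).mean() + (xb >= threshold).mean())

# The identity exploited in the paper: TVD = 2 * BayesAccuracy - 1.
tvd_est = 2 * acc - 1

# Closed form for two unit-variance Gaussians: 2 * Phi(|mu_a - mu_b| / 2) - 1.
phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))
tvd_true = 2 * phi(abs(mu_a - mu_b) / 2) - 1

print(round(tvd_est, 3), round(tvd_true, 3))
```

In the paper's setting the Bayes rule is not available in closed form; a neural network trained to discriminate cluster memberships plays its role, and a k-output head (rather than one output per pair, k^2 in total) suffices because the pairwise decision between clusters i and j can be read off by comparing the corresponding two outputs.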



