Supervised Learning of Labeled Pointcloud Differences via Cover-Tree Entropy Reduction

02/26/2017
by Abraham Smith, et al.

We introduce a new algorithm, called CDER, for supervised machine learning that merges the multi-scale geometric properties of Cover Trees with the information-theoretic properties of entropy. CDER applies to a training set of labeled pointclouds embedded in a common Euclidean space. If typical pointclouds corresponding to distinct labels tend to differ at any scale in any sub-region, CDER can identify these differences in (typically) linear time, creating a set of distributional coordinates which act as a feature extraction mechanism for supervised learning. We describe theoretical properties and implementation details of CDER, and illustrate its benefits on several synthetic examples.
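The paper's implementation of CDER is not reproduced here, but the core idea — score sub-regions by the entropy of the label distribution of the points they contain, and use the most label-pure regions as distributional coordinates for a downstream classifier — can be illustrated with a rough, hypothetical stand-in. The sketch below substitutes a fixed grid for the cover tree and a one-dimensional threshold rule for a real learner; all function names and parameters are invented for illustration, not the authors' API.

```python
import math
import random
from collections import Counter, defaultdict

random.seed(0)

# Hypothetical stand-in for the CDER workflow, not the authors' code:
# label-0 pointclouds are uniform on the unit square, while label-1
# pointclouds add a dense blob near (0.8, 0.8).  Grid cells where the
# label distribution of incoming points has low entropy are
# discriminative; the fraction of a cloud's points landing in the most
# discriminative cell serves as a "distributional coordinate".

def make_cloud(label, n=200):
    pts = [(random.random(), random.random()) for _ in range(n)]
    if label == 1:  # extra points concentrated in one sub-region
        pts += [(0.8 + random.gauss(0, 0.03), 0.8 + random.gauss(0, 0.03))
                for _ in range(60)]
    return pts

def cell_of(p, k):
    # index of the k-by-k grid cell containing point p
    return (min(int(p[0] * k), k - 1), min(int(p[1] * k), k - 1))

def label_entropy(counts):
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values() if c)

train = [(make_cloud(l), l) for l in [0, 1] * 10]

# Tally, per cell at one fixed scale, how many training points of each
# label fall inside it (a cover tree would do this at many scales).
k = 4
per_cell = defaultdict(Counter)
for cloud, label in train:
    for p in cloud:
        per_cell[cell_of(p, k)][label] += 1

scores = {c: label_entropy(cnt) for c, cnt in per_cell.items()}
best = min(scores, key=scores.get)  # most label-pure cell

def feature(cloud):
    # fraction of the cloud's points in the discriminative cell
    return sum(cell_of(p, k) == best for p in cloud) / len(cloud)

# Trivial classifier: threshold midway between the class means.
mean0 = sum(feature(c) for c, l in train if l == 0) / 10
mean1 = sum(feature(c) for c, l in train if l == 1) / 10
thresh = (mean0 + mean1) / 2

def predict(cloud):
    return int(feature(cloud) > thresh)
```

A real cover tree would refine this scoring adaptively across scales and sub-regions rather than over one fixed grid, which is what lets CDER find label differences "at any scale in any sub-region" in (typically) linear time.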
