Large Margin Nearest Neighbor Classification using Curved Mahalanobis Distances

09/22/2016
by   Frank Nielsen, et al.

We consider the supervised classification problem of machine learning in Cayley-Klein projective geometries: we show how to learn a curved Mahalanobis distance corresponding to either hyperbolic or elliptic geometry using the Large Margin Nearest Neighbor (LMNN) framework. We report our experimental results, and further consider the case of learning a mixed curved Mahalanobis distance. In addition, we show that Cayley-Klein Voronoi diagrams are affine and can be built from equivalent (clipped) power diagrams, and that Cayley-Klein balls have Mahalanobis shapes with displaced centers.
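The curved Mahalanobis distances referred to above arise by measuring distances inside (hyperbolic case) a quadric defined by a positive-definite matrix M, generalizing the Beltrami-Klein model. The sketch below is an illustrative assumption based on the standard Cayley-Klein construction, not code from the paper: the function name and the restriction to the hyperbolic case with the unit ellipsoid x^T M x < 1 are choices made here.

```python
import numpy as np

def curved_mahalanobis_dist(x, y, M):
    """Hyperbolic Cayley-Klein distance induced by a positive-definite
    matrix M. Points x, y must lie inside the ellipsoid x^T M x < 1.
    With M = I this reduces to the Beltrami-Klein hyperbolic distance."""
    # Bilinear form S(p, q) = 1 - p^T M q in homogeneous coordinates.
    sxx = 1.0 - x @ M @ x
    syy = 1.0 - y @ M @ y
    sxy = 1.0 - x @ M @ y
    # Clamp to 1 to guard against floating-point dips below arccosh's domain.
    return np.arccosh(max(1.0, sxy / np.sqrt(sxx * syy)))
```

For M = I and a point at Euclidean radius r from the origin, this recovers the familiar Klein-model distance artanh(r); the elliptic case would use arccos instead of arccosh on the corresponding quotient.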

Related research:

- Intrinsic Metrics: Nearest Neighbor and Edge Squared Distances (09/22/2017)
- Margin-Independent Online Multiclass Learning via Convex Geometry (11/15/2021)
- Large Margin Nearest Neighbor Embedding for Knowledge Representation (04/07/2015)
- Large-Margin kNN Classification Using a Deep Encoder Network (06/09/2009)
- Superhuman Accuracy on the SNEMI3D Connectomics Challenge (05/31/2017)
