Learning Low-Dimensional Metrics

09/18/2017
by   Lalit Jain, et al.

This paper investigates the theoretical foundations of metric learning, focusing on four key questions that are not fully addressed in prior work: 1) we consider learning general low-dimensional (low-rank) metrics as well as sparse metrics; 2) we develop upper and lower (minimax) bounds on the generalization error; 3) we quantify the sample complexity of metric learning in terms of the dimension of the feature space and the dimension/rank of the underlying metric; 4) we also bound the accuracy of the learned metric relative to the underlying true generative metric. All the results involve novel mathematical approaches to the metric learning problem, and also shed new light on the special case of ordinal embedding (aka non-metric multidimensional scaling).
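To make the low-rank setting concrete, the sketch below (an illustrative example, not code from the paper) shows what a rank-d Mahalanobis metric on a p-dimensional feature space looks like: the metric matrix M = L Lᵀ is positive semidefinite with rank d ≪ p, and the induced distance equals an ordinary Euclidean distance after the linear map z ↦ Lᵀz into ℝ^d. All names here (`L`, `mahalanobis_sq`, the chosen p and d) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

p, d = 10, 2                    # ambient feature dimension p, metric rank d << p
L = rng.standard_normal((p, d))
M = L @ L.T                     # rank-d positive semidefinite metric matrix

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y)."""
    diff = x - y
    return float(diff @ M @ diff)

x, y = rng.standard_normal(p), rng.standard_normal(p)

# Equivalent view: the low-rank metric is Euclidean distance after
# projecting through L^T into the d-dimensional space.
embedded = float(np.sum((L.T @ x - L.T @ y) ** 2))
assert np.isclose(mahalanobis_sq(x, y, M), embedded)
assert np.linalg.matrix_rank(M) == d
```

The equivalence checked by the assertions is why learning a low-rank metric is the same as learning a low-dimensional embedding of the features, which is what connects this setting to ordinal embedding.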

