Riemannian Metric Learning for Symmetric Positive Definite Matrices

by Raviteja Vemulapalli, et al.

Over the past few years, symmetric positive definite (SPD) matrices have received considerable attention from the computer vision community. Although various distance measures have been proposed for comparing SPD matrices, the two most widely used are the affine-invariant distance and the log-Euclidean distance, because both are true geodesic distances induced by Riemannian geometry. In this work, we focus on the log-Euclidean Riemannian geometry and propose a data-driven approach for learning Riemannian metrics/geodesic distances for SPD matrices. We show that the geodesic distance learned with the proposed approach outperforms various existing distance measures on face matching and clustering tasks.
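As a point of reference for the geometry the abstract builds on (not the paper's learned metric), the log-Euclidean distance between two SPD matrices X and Y is the Frobenius norm of the difference of their matrix logarithms, d(X, Y) = ||log(X) - log(Y)||_F. A minimal NumPy sketch, computing the matrix logarithm via eigendecomposition (all names here are illustrative, not from the paper):

```python
import numpy as np

def spd_log(M):
    # Matrix logarithm of an SPD matrix via eigendecomposition:
    # M = V diag(w) V^T  =>  log(M) = V diag(log w) V^T.
    w, V = np.linalg.eigh(M)
    return (V * np.log(w)) @ V.T

def log_euclidean_distance(X, Y):
    # Geodesic distance under the log-Euclidean metric:
    # Frobenius norm of the difference of matrix logarithms.
    return np.linalg.norm(spd_log(X) - spd_log(Y))

# Example: two random SPD matrices built as A A^T + eps I.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
X = A @ A.T + 1e-3 * np.eye(3)
Y = B @ B.T + 1e-3 * np.eye(3)
d = log_euclidean_distance(X, Y)
```

One property worth noting: since log(X^-1) = -log(X), this distance is invariant under matrix inversion, i.e. d(X^-1, Y^-1) = d(X, Y), one of the invariances that motivates using it over the plain Euclidean distance between SPD matrices.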




