
Inductive Geometric Matrix Midranges

by Graham W. Van Goffrier, et al.
University of Cambridge

Covariance data as represented by symmetric positive definite (SPD) matrices are ubiquitous throughout technical study as efficient descriptors of interdependent systems. Euclidean analysis of SPD matrices, while computationally fast, can lead to skewed and even unphysical interpretations of data. Riemannian methods preserve the geometric structure of SPD data at the cost of expensive eigenvalue computations. In this paper, we propose a geometric method for unsupervised clustering of SPD data based on the Thompson metric. This technique relies upon a novel "inductive midrange" centroid computation for SPD data, whose properties are examined and numerically confirmed. We demonstrate the incorporation of the Thompson metric and inductive midrange into X-means and K-means++ clustering algorithms.
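For readers unfamiliar with the Thompson metric, a minimal sketch of its computation may help. For SPD matrices A and B, the Thompson distance is max_i |log λ_i(A⁻¹B)|, where the λ_i are the (positive) eigenvalues of A⁻¹B; these can be obtained stably as generalized eigenvalues of the pair (B, A). The function below is an illustrative implementation under these standard definitions, not the paper's own code:

```python
import numpy as np
from scipy.linalg import eigh

def thompson_distance(A, B):
    """Thompson metric between SPD matrices A and B.

    d_T(A, B) = max_i |log lambda_i(A^{-1} B)|, computed via the
    generalized symmetric eigenproblem B v = lambda A v, whose
    eigenvalues equal those of A^{-1} B and are real and positive
    when A and B are SPD.
    """
    lam = eigh(B, A, eigvals_only=True)
    return float(np.max(np.abs(np.log(lam))))
```

For example, for A = I and B = 4I the distance is log 4, and the metric is symmetric since the eigenvalues of B⁻¹A are the reciprocals of those of A⁻¹B. Unlike the affine-invariant Riemannian distance, which requires the full eigenvalue spectrum in a 2-norm, the Thompson metric needs only the extreme eigenvalues, which underlies the computational savings the paper targets.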



