Riemannian statistics meets random matrix theory: towards learning from high-dimensional covariance matrices

03/01/2022
by Salem Said, et al.

Riemannian Gaussian distributions were initially introduced as basic building blocks for learning models that aim to capture the intrinsic structure of statistical populations of positive-definite matrices (here called covariance matrices). While the potential applications of such models have attracted significant attention, a major obstacle still stands in the way of these applications: there seems to exist no practical method of computing the normalising factors associated with Riemannian Gaussian distributions on spaces of high-dimensional covariance matrices. The present paper shows that this missing method arises from an unexpected new connection with random matrix theory. Its main contribution is to prove that Riemannian Gaussian distributions of real, complex, or quaternion covariance matrices are equivalent to orthogonal, unitary, or symplectic log-normal matrix ensembles. This equivalence yields a highly efficient approximation of the normalising factors, in the form of a rather simple analytic expression. The error due to this approximation decreases like the inverse square of the dimension. Numerical experiments demonstrate how this new approximation removes the difficulties that have so far impeded applications to real-world datasets of high-dimensional covariance matrices. The paper then turns to Riemannian Gaussian distributions of block-Toeplitz covariance matrices. These are equivalent to yet another kind of random matrix ensemble, here called "acosh-normal" ensembles. Orthogonal and unitary "acosh-normal" ensembles correspond to the cases of block-Toeplitz with Toeplitz blocks, and block-Toeplitz (with general blocks) covariance matrices, respectively.
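As a point of reference (the abstract does not spell out the formulas; the following is a minimal sketch, assuming the affine-invariant geometry used in earlier work on Riemannian Gaussian distributions), the density of such a distribution over N x N covariance matrices is typically written as

\[
p(X \mid \bar{X}, \sigma) \;=\; \frac{1}{Z(\sigma)}\,
\exp\!\left(-\,\frac{d^{2}(X,\bar{X})}{2\sigma^{2}}\right),
\qquad
d(X,Y) \;=\; \bigl\|\log\!\bigl(X^{-1/2}\,Y\,X^{-1/2}\bigr)\bigr\|_{F},
\]

where \bar{X} is the centre of mass, \sigma is the dispersion parameter, and Z(\sigma) is the normalising factor discussed above. By invariance, Z(\sigma) depends only on \sigma and on the dimension N, but its exact evaluation reduces to an N-fold integral over eigenvalues, which is what becomes impractical for large N; the approximation announced in the abstract replaces this integral with a closed-form expression whose relative error decreases like 1/N^{2}.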


