Low Complexity Gaussian Latent Factor Models and a Blessing of Dimensionality

06/11/2017
by Greg Ver Steeg et al.

Learning the structure of graphical models from data is a fundamental problem that typically carries a curse of dimensionality. We consider a special class of Gaussian latent factor models in which each observed variable depends on at most one of a set of latent variables. We derive information-theoretic lower bounds on the sample complexity for structure recovery that suggest a blessing of dimensionality. Yet with a fixed number of samples, structure recovery for this class using existing methods deteriorates with increasing dimension. We design a new approach to learning Gaussian latent factor models that has low computational complexity and empirically benefits from dimensionality. Our approach relies on an information-theoretic constraint to find parsimonious solutions without adding regularizers or sparsity hyperparameters. Besides improved structure recovery, we show that our method outperforms state-of-the-art approaches to covariance estimation on both synthetic data and under-sampled, high-dimensional stock market data.
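To make the model class concrete, the following is a minimal sketch (in Python/NumPy; the dimensions, loadings, and noise level are illustrative assumptions, not values from the paper) of data drawn from a Gaussian latent factor model in which each observed variable loads on at most one latent factor:

import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only: a few latent factors, many observed variables,
# and deliberately fewer samples than observed dimensions.
n_hidden = 5
n_observed = 50
n_samples = 30

# Each observed variable is assigned to at most one latent parent, so the
# loading matrix W has a single nonzero entry per row.
parents = rng.integers(0, n_hidden, size=n_observed)
W = np.zeros((n_observed, n_hidden))
W[np.arange(n_observed), parents] = rng.uniform(0.5, 1.0, size=n_observed)

# Generate X = Z W^T + noise with standard Gaussian factors and noise.
noise_std = 0.3
Z = rng.standard_normal((n_samples, n_hidden))
X = Z @ W.T + noise_std * rng.standard_normal((n_samples, n_observed))

# The implied population covariance, W W^T + noise_std^2 * I, is block
# structured: variables sharing a latent parent are correlated, and all
# other pairs are independent.
true_cov = W @ W.T + noise_std ** 2 * np.eye(n_observed)
sample_cov = np.cov(X, rowvar=False)
print(np.linalg.norm(sample_cov - true_cov, "fro"))

In this class, structure recovery amounts to recovering the assignment of observed variables to latent parents, and covariance estimation benefits from exploiting that block structure, since the raw sample covariance is rank-deficient whenever there are fewer samples than observed dimensions, as in the under-sampled regime above.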

