High-Dimensional Covariance Decomposition into Sparse Markov and Independence Domains

06/27/2012
by   Majid Janzamin, et al.

In this paper, we present a novel framework that combines sparse models in different domains. We posit that the observed data are generated from a linear combination of a sparse Gaussian Markov model (with a sparse precision matrix) and a sparse Gaussian independence model (with a sparse covariance matrix). We provide efficient methods for decomposing the data into these two components, and we characterize a set of sufficient conditions for identifiability and model consistency. Our decomposition method is based on a simple modification of the popular ℓ_1-penalized maximum-likelihood estimator (ℓ_1-MLE). We establish that our estimator is consistent in both domains, i.e., it successfully recovers the supports of both the Markov and the independence models, when the number of samples n scales as n = Ω(d^2 log p), where p is the number of variables and d is the maximum node degree in the Markov model. Our conditions for recovery are comparable to those of ℓ_1-MLE for consistent estimation of a sparse Markov model, and thus we guarantee successful high-dimensional estimation of a richer class of models under comparable conditions. Our experiments validate these results and also demonstrate that our models achieve better inference accuracy under simple algorithms such as loopy belief propagation.
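The generative model described above can be illustrated with a small numerical sketch. The snippet below (an assumption-laden toy example, not the paper's estimator) builds a sparse precision matrix J_M for the Markov component on a chain graph, a sparse covariance S_I for the independence component, and samples data whose covariance is the sum inv(J_M) + S_I; all dimensions, edge weights, and the chain structure are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 8  # number of variables (illustrative)

# Markov component: sparse precision matrix J_M on a chain graph,
# so the maximum node degree is d = 2.
J_M = np.eye(p)
for i in range(p - 1):
    J_M[i, i + 1] = J_M[i + 1, i] = 0.3

# Independence component: sparse covariance S_I with one off-diagonal entry.
S_I = 0.5 * np.eye(p)
S_I[0, 4] = S_I[4, 0] = 0.2

# Observed covariance is the sum of the Markov covariance inv(J_M)
# and the sparse covariance S_I.
Sigma = np.linalg.inv(J_M) + S_I

# Sanity check: the combined covariance must be positive definite.
assert np.all(np.linalg.eigvalsh(Sigma) > 0)

# Draw n i.i.d. samples from the combined Gaussian model.
n = 1000
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
print(X.shape)
```

The decomposition problem the paper studies is the reverse direction: given only samples like X, recover the supports of both J_M and S_I, which a modified ℓ_1-MLE achieves under the stated sample-complexity and identifiability conditions.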

