Asymptotic normality and optimalities in estimation of large Gaussian graphical models

by   Zhao Ren, et al.

The Gaussian graphical model, a popular paradigm for studying relationships among variables in a wide range of applications, has attracted great attention in recent years. This paper considers a fundamental question: when is it possible to estimate low-dimensional parameters at the parametric square-root rate in a large Gaussian graphical model? A novel regression approach is proposed that yields an asymptotically efficient estimate of each entry of the precision matrix under a sparsity condition relative to the sample size. When the precision matrix is not sufficiently sparse, or equivalently the sample size is not sufficiently large, a lower bound is established showing that the parametric rate can no longer be achieved for estimating each entry. This lower bound, which answers the delicate sample-size question, rests on a novel construction of a subset of sparse precision matrices in an application of Le Cam's lemma. Moreover, the proposed estimator is shown to attain the optimal convergence rate when the parametric rate cannot be achieved, under a minimal sample-size requirement. The proposed estimator is applied to test for the presence of an edge in the Gaussian graphical model, to recover the support of the entire model, to obtain adaptive rate-optimal estimation of the entire precision matrix as measured by the matrix ℓ_q operator norm, and to make inference about latent variables in the graphical model. All of this is achieved under a sparsity condition on the precision matrix and a side condition on the range of its spectrum, which significantly relaxes the commonly imposed uniform signal-strength condition on the precision matrix, the irrepresentability condition on the Hessian tensor operator of the covariance matrix, and the ℓ_1 constraint on the precision matrix. Numerical results confirm our theoretical findings.
The ROC curve of the proposed algorithm, Asymptotic Normal Thresholding (ANT), for support recovery significantly outperforms that of the popular GLasso algorithm.
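The core of the regression approach can be sketched in a few lines: to estimate the (i, j) entry of the precision matrix, regress the pair of columns (X_i, X_j) on the remaining columns and invert the 2x2 covariance of the residuals. The following numpy sketch is illustrative only; it substitutes ordinary least squares for the scaled lasso used in the paper, which is adequate only when the dimension is small relative to the sample size, and the function name `precision_entry` is our own.

```python
import numpy as np

def precision_entry(X, i, j):
    """Illustrative pairwise-regression estimate of Omega[i, j].

    Regress columns (i, j) of X on the remaining columns; the inverse
    of the 2x2 residual covariance estimates the (i, j) block of the
    precision matrix Omega. (The paper uses a scaled lasso for the
    regression step; plain OLS is used here for simplicity.)
    """
    n, p = X.shape
    A = [i, j]
    rest = [k for k in range(p) if k not in A]
    Xa, Xr = X[:, A], X[:, rest]
    beta, *_ = np.linalg.lstsq(Xr, Xa, rcond=None)   # OLS coefficients
    eps = Xa - Xr @ beta                             # residuals
    theta = np.linalg.inv(eps.T @ eps / n)           # estimates Omega[A][:, A]
    return theta[0, 1]

# Simulate from a known sparse precision matrix to check the estimator.
rng = np.random.default_rng(0)
p = 5
Omega = np.eye(p)
Omega[0, 1] = Omega[1, 0] = 0.4            # one edge: (0, 1)
Sigma = np.linalg.inv(Omega)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=5000)
est = precision_entry(X, 0, 1)             # should be close to 0.4
```

The asymptotic normality result means `est` is approximately Gaussian around the true ω_{01} with variance (ω_{00}ω_{11} + ω_{01}²)/n, which is what makes entrywise confidence intervals and the ANT thresholding test possible.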

Related papers
High-dimensional Precision Matrix Estimation with a Known Graphical Structure

A precision matrix is the inverse of a covariance matrix. In this paper,...

Learning Networks from Gaussian Graphical Models and Gaussian Free Fields

We investigate the problem of estimating the structure of a weighted net...

A U-statistic Approach to Hypothesis Testing for Structure Discovery in Undirected Graphical Models

Structure discovery in graphical models is the determination of the topo...

Block-diagonal covariance selection for high-dimensional Gaussian graphical models

Gaussian graphical models are widely utilized to infer and visualize net...

High-Dimensional Bernstein-von Mises Theorems for Covariance and Precision Matrices

This paper aims to examine the characteristics of the posterior distribu...

An efficient GPU-Parallel Coordinate Descent Algorithm for Sparse Precision Matrix Estimation via Scaled Lasso

The sparse precision matrix plays an essential role in the Gaussian grap...

Breaking The Dimension Dependence in Sparse Distribution Estimation under Communication Constraints

We consider the problem of estimating a d-dimensional s-sparse discrete ...