Asymptotic normality and optimalities in estimation of large Gaussian graphical models

09/24/2013
by Zhao Ren et al.

The Gaussian graphical model, a popular paradigm for studying relationships among variables in a wide range of applications, has attracted great attention in recent years. This paper considers a fundamental question: When is it possible to estimate low-dimensional parameters at the parametric square-root rate in a large Gaussian graphical model? A novel regression approach is proposed to obtain asymptotically efficient estimation of each entry of a precision matrix under a sparseness condition relative to the sample size. When the precision matrix is not sufficiently sparse, or equivalently the sample size is not sufficiently large, a lower bound is established to show that it is no longer possible to achieve the parametric rate in the estimation of each entry. This lower bound result, which provides an answer to the delicate sample size question, is established with a novel construction of a subset of sparse precision matrices in an application of Le Cam's lemma. Moreover, the proposed estimator is proven to have the optimal convergence rate when the parametric rate cannot be achieved, under a minimal sample requirement. The proposed estimator is applied to test the presence of an edge in the Gaussian graphical model or to recover the support of the entire model, to obtain adaptive rate-optimal estimation of the entire precision matrix as measured by the matrix ℓ_q operator norm, and to make inference on latent variables in the graphical model. All of this is achieved under a sparsity condition on the precision matrix and a side condition on the range of its spectrum. This significantly relaxes the commonly imposed uniform signal strength condition on the precision matrix, the irrepresentability condition on the Hessian tensor operator of the covariance matrix, or the ℓ_1 constraint on the precision matrix. Numerical results confirm our theoretical findings. The ROC curve of the proposed algorithm, Asymptotic Normal Thresholding (ANT), for support recovery significantly outperforms that of the popular GLasso algorithm.
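To make the regression approach concrete, here is a minimal sketch of the pairwise-regression idea behind the entrywise estimator, not the authors' exact procedure: to estimate ω_ij, regress the i-th and j-th variables on all remaining variables, form the 2×2 sample covariance of the two residual vectors, and invert it; the off-diagonal entry of that inverse estimates ω_ij. The paper's regressions use the scaled lasso, whereas the sketch below substitutes scikit-learn's LassoCV, and the helper name precision_entry is purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LassoCV

def precision_entry(X, i, j):
    """Estimate Omega[i, j] by regressing columns i and j of X on the
    remaining columns and inverting the 2x2 residual covariance.
    LassoCV stands in for the scaled lasso used in the paper."""
    n, p = X.shape
    rest = [k for k in range(p) if k not in (i, j)]
    resid = np.empty((n, 2))
    for col, target in enumerate((i, j)):
        fit = LassoCV(cv=5).fit(X[:, rest], X[:, target])
        resid[:, col] = X[:, target] - fit.predict(X[:, rest])
    # The inverse of the 2x2 residual covariance estimates the (i, j) block of Omega.
    block = np.linalg.inv(resid.T @ resid / n)
    return block[0, 1], block

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))   # n = 200 samples, p = 50 variables
    omega_ij, block = precision_entry(X, 0, 1)
    # Plug-in standard error based on the limiting variance (omega_ii*omega_jj + omega_ij^2)/n.
    se = np.sqrt((block[0, 0] * block[1, 1] + omega_ij ** 2) / X.shape[0])
    print(omega_ij, se, abs(omega_ij) > 2 * se)  # crude edge test in the spirit of ANT
```

Under the paper's sparsity and spectral conditions, the entrywise estimator is asymptotically normal with limiting variance (ω_ii ω_jj + ω_ij^2)/n, so comparing |ω̂_ij| to a multiple of the plug-in standard error yields both the edge test and, applied to every entry, a thresholded support estimate in the spirit of ANT.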


Related research

06/28/2021  High-dimensional Precision Matrix Estimation with a Known Graphical Structure
A precision matrix is the inverse of a covariance matrix. In this paper,...

08/04/2023  Learning Networks from Gaussian Graphical Models and Gaussian Free Fields
We investigate the problem of estimating the structure of a weighted net...

04/06/2016  A U-statistic Approach to Hypothesis Testing for Structure Discovery in Undirected Graphical Models
Structure discovery in graphical models is the determination of the topo...

11/12/2015  Block-diagonal covariance selection for high-dimensional Gaussian graphical models
Gaussian graphical models are widely utilized to infer and visualize net...

09/15/2023  High-Dimensional Bernstein Von-Mises Theorems for Covariance and Precision Matrices
This paper aims to examine the characteristics of the posterior distribu...

03/28/2022  An efficient GPU-Parallel Coordinate Descent Algorithm for Sparse Precision Matrix Estimation via Scaled Lasso
The sparse precision matrix plays an essential role in the Gaussian grap...

06/16/2021  Breaking The Dimension Dependence in Sparse Distribution Estimation under Communication Constraints
We consider the problem of estimating a d-dimensional s-sparse discrete ...
