On model misspecification and KL separation for Gaussian graphical models

01/10/2015
by Varun Jog et al.

We establish bounds on the KL divergence between two multivariate Gaussian distributions in terms of the Hamming distance between the edge sets of the corresponding graphical models. We show that the KL divergence is bounded below by a constant when the graphs differ by at least one edge; this is essentially the tightest possible bound, since there exist classes of graphs for which the edge discrepancy increases while the KL divergence remains bounded above by a constant. As a natural corollary to our KL lower bound, we also establish a sample size requirement for correct model selection via maximum likelihood estimation. Our results make rigorous the notion that accurately estimating the edge structure of a Gaussian graphical model is essential for closely approximating the true distribution.
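For concreteness, here is a minimal Python sketch (not taken from the paper; the matrices and values are hypothetical, chosen only for positive definiteness) of the quantity the bounds concern: the KL divergence between two zero-mean Gaussians whose graphical models differ in exactly one edge. The function gaussian_kl implements the standard closed-form KL expression for Gaussians.

# Illustrative sketch: KL divergence across a single-edge discrepancy
# between two Gaussian graphical models. Values are hypothetical.
import numpy as np

def gaussian_kl(Sigma1, Sigma2):
    """KL( N(0, Sigma1) || N(0, Sigma2) ) for zero-mean Gaussians:
    0.5 * ( tr(Sigma2^{-1} Sigma1) - d + log det(Sigma2) - log det(Sigma1) )."""
    d = Sigma1.shape[0]
    trace_term = np.trace(np.linalg.inv(Sigma2) @ Sigma1)
    _, logdet1 = np.linalg.slogdet(Sigma1)
    _, logdet2 = np.linalg.slogdet(Sigma2)
    return 0.5 * (trace_term - d + logdet2 - logdet1)

# Precision matrix of a chain graph 1 - 2 - 3; the model's edges are the
# nonzero off-diagonal entries of the precision matrix.
Theta1 = np.array([[1.0, 0.4, 0.0],
                   [0.4, 1.0, 0.4],
                   [0.0, 0.4, 1.0]])

# Same model with the (2, 3) edge deleted: the edge sets are at
# Hamming distance 1.
Theta2 = Theta1.copy()
Theta2[1, 2] = Theta2[2, 1] = 0.0

# Covariances are the inverses of the precision matrices.
kl = gaussian_kl(np.linalg.inv(Theta1), np.linalg.inv(Theta2))
print(f"KL divergence across a single-edge discrepancy: {kl:.4f}")

The paper's lower bound says this divergence cannot be made arbitrarily small once the edge sets disagree; the sketch merely evaluates the divergence for one such pair.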


Related research:

- On the Properties of Kullback-Leibler Divergence Between Gaussians (02/10/2021): Kullback-Leibler (KL) divergence is one of the most important divergence...
- Gaussian Graphical Model exploration and selection in high dimension low sample size setting (03/11/2020): Gaussian Graphical Models (GGM) are often used to describe the condition...
- The Quality of the Covariance Selection Through Detection Problem and AUC Bounds (05/18/2016): We consider the problem of quantifying the quality of a model selection...
- Causal KL: Evaluating Causal Discovery (11/11/2021): The two most commonly used criteria for assessing causal model discovery...
- How to use KL-divergence to construct conjugate priors, with well-defined non-informative limits, for the multivariate Gaussian (09/15/2021): The Wishart distribution is the standard conjugate prior for the precisi...
- The All-or-Nothing Phenomenon in Sparse Linear Regression (03/12/2019): We study the problem of recovering a hidden binary k-sparse p-dimensiona...
- Samplers and extractors for unbounded functions (04/17/2019): Blasiok (SODA'18) recently introduced the notion of a subgaussian sample...
