Information-theoretic limits of Bayesian network structure learning

01/27/2016
by   Asish Ghoshal, et al.

In this paper, we study the information-theoretic limits of learning the structure of Bayesian networks (BNs), over discrete as well as continuous random variables, from a finite number of samples. We show that the minimum number of samples required by any procedure to recover the correct structure grows as Ω(m) and Ω(k m + (k^2/m)) for non-sparse and sparse BNs respectively, where m is the number of variables and k is the maximum number of parents per node. We provide a simple recipe, based on an extension of Fano's inequality, for obtaining information-theoretic limits of structure recovery for any exponential-family BN. We instantiate our result for specific conditional distributions in the exponential family to characterize the fundamental limits of learning various commonly used BNs, such as conditional-probability-table-based networks, Gaussian BNs, noisy-OR networks, and logistic regression networks. En route to obtaining our main results, we derive tight bounds on the number of sparse and non-sparse essential DAGs. Finally, as a byproduct, we recover the information-theoretic limits of sparse variable selection for logistic regression.
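The Fano-style recipe mentioned in the abstract can be sketched numerically. The snippet below is an illustrative, hedged example (the helper name, the constant KL bound, and the crude log-count proxy are assumptions for demonstration, not the paper's actual derivation): a Fano-type argument lower-bounds the sample complexity by n ≥ (log M − log 2) / β, where M is the number of candidate structures and β bounds the per-sample KL divergence between any two models in the family.

```python
import math

def fano_lower_bound(log_num_structures: float, max_kl: float) -> float:
    """Fano-style sample-complexity lower bound.

    Any procedure that recovers the true structure with error
    probability below 1/2 needs at least
        n >= (log M - log 2) / max_kl
    samples, where M is the number of candidate structures and
    max_kl bounds the per-sample pairwise KL divergence.
    """
    return (log_num_structures - math.log(2)) / max_kl

# Illustrative scaling only: for sparse DAGs on m nodes with at most
# k parents per node, log M grows on the order of k*m (up to log
# factors), so with a constant KL bound the sample complexity is
# Omega(k*m), matching the flavor of the bounds stated above.
m, k = 100, 5
log_M = k * m * math.log(m / k)  # crude order-of-magnitude proxy, not a tight count
print(fano_lower_bound(log_M, max_kl=1.0))
```

This kind of two-ingredient calculation (count the family, bound the pairwise divergences) is what the abstract's "simple recipe" refers to at a high level.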


