On the Sample Complexity of Learning Sum-Product Networks

12/05/2019
by   Ishaq Aden-Ali, et al.

Sum-Product Networks (SPNs) can be regarded as a form of deep graphical model that compactly represents deeply factored and mixed distributions. An SPN is a rooted directed acyclic graph (DAG) consisting of a set of leaves (corresponding to base distributions), a set of sum nodes (each representing a mixture of its children's distributions), and a set of product nodes (each representing the product of its children's distributions). In this work, we initiate the study of the sample complexity of PAC-learning the set of distributions that correspond to SPNs. We show that the sample complexity of learning tree-structured SPNs with the usual types of leaves (i.e., Gaussian or discrete) grows at most linearly (up to logarithmic factors) with the number of parameters of the SPN. More specifically, we show that the class of distributions corresponding to tree-structured Gaussian SPNs with k mixing weights and e (d-dimensional Gaussian) leaves can be learned within total variation error ϵ using at most O((ed^2 + k)/ϵ^2) samples. A similar result holds for tree-structured SPNs with discrete leaves. We obtain these upper bounds via the recently proposed notion of distribution compression schemes. More specifically, we show that if a (base) class of distributions F admits an "efficient" compression scheme, then the class of tree-structured SPNs with leaves from F also admits an efficient compression scheme.
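The SPN structure described above (Gaussian leaves, product nodes multiplying their children's densities, sum nodes mixing them with nonnegative weights summing to one) can be sketched with a minimal density evaluator. The class names and layout below are illustrative assumptions, not the paper's notation:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a univariate Gaussian N(mu, sigma^2) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

class Leaf:
    """A base distribution: a univariate Gaussian over one input variable."""
    def __init__(self, mu, sigma, var):
        self.mu, self.sigma, self.var = mu, sigma, var
    def density(self, x):
        return gaussian_pdf(x[self.var], self.mu, self.sigma)

class Product:
    """A product node: its density is the product of its children's densities."""
    def __init__(self, children):
        self.children = children
    def density(self, x):
        p = 1.0
        for c in self.children:
            p *= c.density(x)
        return p

class Sum:
    """A sum node: a mixture of its children with the given mixing weights."""
    def __init__(self, weights, children):
        assert abs(sum(weights) - 1.0) < 1e-9  # weights must form a distribution
        self.weights, self.children = weights, children
    def density(self, x):
        return sum(w * c.density(x) for w, c in zip(self.weights, self.children))

# A tiny tree-structured SPN over two variables (x0, x1):
# the root mixes two product distributions, each a product of two Gaussian leaves.
spn = Sum([0.3, 0.7], [
    Product([Leaf(0.0, 1.0, 0), Leaf(0.0, 1.0, 1)]),
    Product([Leaf(2.0, 1.0, 0), Leaf(-1.0, 1.0, 1)]),
])
print(spn.density([0.5, 0.5]))
```

In the paper's counting, this example has k = 2 mixing weights and e = 4 Gaussian leaves (here with d = 1); the sample-complexity bound scales with exactly these quantities.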


