On the Expressive Efficiency of Sum Product Networks

11/27/2014
by James Martens, et al.

Sum Product Networks (SPNs) are a recently developed class of deep generative models which compute their associated unnormalized density functions using a special type of arithmetic circuit. When certain sufficient conditions, called the decomposability and completeness conditions (or "D&C" conditions), are imposed on the structure of these circuits, marginal densities and other useful quantities, which are typically intractable for other deep generative models, can be computed by what amounts to a single evaluation of the network (a property known as "validity"). However, the effect that the D&C conditions have on the capabilities of D&C SPNs is not well understood.

In this work we analyze the D&C conditions, expose the various connections that D&C SPNs have with multilinear arithmetic circuits, and consider the question of how well they can capture various distributions as a function of their size and depth. Among our contributions is a result which establishes the existence of a relatively simple distribution with fully tractable marginal densities which cannot be efficiently captured by D&C SPNs of any depth, but which can be efficiently captured by various other deep generative models. We also show that with each additional layer of depth permitted, the set of distributions which can be efficiently captured by D&C SPNs grows in size. This kind of "depth hierarchy" property has been widely conjectured to hold for various deep models, but has never been proven for any of them. Our other contributions include a new characterization of the D&C conditions as necessary and sufficient ones for a slightly strengthened notion of validity, and various state-machine characterizations of the types of computations that can be performed efficiently by D&C SPNs.
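To make the "single evaluation" property concrete, here is a minimal illustrative sketch (not code from the paper): a tiny decomposable and complete SPN over two binary variables, where leaves are indicator inputs. Marginalizing a variable amounts to setting both of its indicators to 1 and evaluating the network once.

```python
def spn(x1_true, x1_false, x2_true, x2_false):
    """A toy D&C SPN over binary variables X1, X2 (weights are illustrative).

    Each product node multiplies children with disjoint variable scopes
    (decomposability); the children of the sum node share the same scope
    (completeness), so the circuit computes a valid distribution.
    """
    p1 = (0.8 * x1_true + 0.2 * x1_false) * (0.3 * x2_true + 0.7 * x2_false)
    p2 = (0.1 * x1_true + 0.9 * x1_false) * (0.6 * x2_true + 0.4 * x2_false)
    return 0.5 * p1 + 0.5 * p2

# Joint probability P(X1=1, X2=0): set indicators to match the assignment.
joint = spn(1, 0, 0, 1)

# Marginal P(X1=1): set BOTH indicators for X2 to 1, summing X2 out in a
# single network evaluation -- this is the "validity" property at work.
marginal = spn(1, 0, 1, 1)

# Sanity check: the marginal equals the sum over the two X2 assignments.
assert abs(marginal - (spn(1, 0, 1, 0) + spn(1, 0, 0, 1))) < 1e-12
```

In general, computing a marginal of a deep generative model requires summing over exponentially many assignments; for valid SPNs the indicator trick above collapses that sum into one forward pass, which is the tractability the D&C conditions buy.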


