Quantitative Universal Approximation Bounds for Deep Belief Networks

08/18/2022
by Julian Sieber et al.

We show that deep belief networks with binary hidden units can approximate any multivariate probability density under very mild integrability requirements on the parental density of the visible nodes. The approximation is measured in the L^q-norm for q∈[1,∞] (q=∞ corresponding to the supremum norm) and in Kullback-Leibler divergence. Furthermore, we establish sharp quantitative bounds on the approximation error in terms of the number of hidden units.
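
For reference, the approximation errors named in the abstract are the standard L^q distance and Kullback-Leibler divergence between densities. The sketch below states both, writing p for the target density and p_N for the density realized by a network with N binary hidden units; this notation and the domain R^d are assumed here for illustration and are not taken from the paper.

% Standard error measures referenced in the abstract; p, p_N and the
% domain \mathbb{R}^d are assumed notation, not the paper's own.
\[
  \| p - p_N \|_{L^q(\mathbb{R}^d)}
    = \Big( \int_{\mathbb{R}^d} |p(x) - p_N(x)|^q \, dx \Big)^{1/q},
  \qquad q \in [1,\infty),
\]
\[
  \| p - p_N \|_{L^\infty(\mathbb{R}^d)}
    = \operatorname*{ess\,sup}_{x \in \mathbb{R}^d} |p(x) - p_N(x)|,
\]
\[
  \mathrm{KL}(p \,\|\, p_N)
    = \int_{\mathbb{R}^d} p(x) \log \frac{p(x)}{p_N(x)} \, dx .
\]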


Related research

Universal Approximation Depth and Errors of Narrow Belief Networks with Discrete Units (03/29/2013)
We generalize recent theoretical work on the minimal number of layers of...

Universal Approximation of Markov Kernels by Shallow Stochastic Feedforward Networks (03/24/2015)
We establish upper bounds for the minimal number of hidden units for whi...

Deep Narrow Boltzmann Machines are Universal Approximators (11/14/2014)
We show that deep narrow Boltzmann machines are universal approximators ...

On Sharpness of Error Bounds for Multivariate Neural Network Approximation (04/05/2020)
Sharpness of error bounds for best non-linear multivariate approximation...

Kernels and Submodels of Deep Belief Networks (11/05/2012)
We study the mixtures of factorizing probability distributions represent...

Spectral Barron space and deep neural network approximation (09/02/2023)
We prove the sharp embedding between the spectral Barron space and the B...

Faithful Approximations of Belief Functions (01/23/2013)
A conceptual foundation for approximation of belief functions is propose...
