Restricted Boltzmann Machines as Models of Interacting Variables

by Nicola Bulso et al.

We study the types of distributions that Restricted Boltzmann Machines (RBMs) with different activation functions can express, by investigating how the activation function of the hidden nodes shapes the marginal distribution imposed on the observed binary nodes. We report an exact expression for these marginals in the form of a model of interacting binary variables, where the explicit form of the interactions depends on the hidden node activation function. We study the properties of these interactions in detail and evaluate how the accuracy with which the RBM approximates distributions over binary variables depends on the hidden node activation function and on the number of hidden nodes. When the inferred RBM parameters are weak, the interaction terms follow an intuitive pattern that substantially reduces the differences across activation functions. We show that the weak parameter approximation holds well for different RBMs trained on the MNIST dataset. Interestingly, in these cases, the mapping reveals that the inferred models are essentially low order interaction models.
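As a minimal illustration of the mapping the abstract describes (not the paper's general construction, which covers arbitrary hidden activation functions): for the standard case of binary hidden units, summing them out yields a closed-form marginal over the visible binary units, p(v) ∝ exp(a·v + Σ_j softplus(b_j + w_j·v)), i.e. an interacting-variable model whose couplings arise from expanding the softplus terms. The sketch below, with made-up parameters, checks this closed form against brute-force summation over hidden configurations.

```python
import numpy as np

rng = np.random.default_rng(0)
n_vis, n_hid = 4, 3
a = rng.normal(size=n_vis)           # visible biases
b = rng.normal(size=n_hid)           # hidden biases
W = rng.normal(size=(n_vis, n_hid))  # visible-hidden couplings

def softplus(x):
    return np.log1p(np.exp(x))

def unnorm_marginal(v):
    """Unnormalized p(v) after summing out binary hidden units in closed form."""
    return np.exp(a @ v + softplus(b + v @ W).sum())

def unnorm_joint_sum(v):
    """Same quantity by explicit summation over all 2^n_hid hidden states."""
    total = 0.0
    for bits in range(2 ** n_hid):
        h = np.array([(bits >> j) & 1 for j in range(n_hid)], dtype=float)
        total += np.exp(a @ v + b @ h + v @ W @ h)
    return total

v = np.array([1.0, 0.0, 1.0, 1.0])
print(np.isclose(unnorm_marginal(v), unnorm_joint_sum(v)))  # True
```

Expanding each softplus term in powers of the weights is what produces the explicit interaction terms among the visible variables; for other hidden activation functions the softplus is replaced by the corresponding log-partition of a single hidden unit.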


