Boltzmann machines and energy-based models

08/20/2017
by Takayuki Osogami, et al.

We review Boltzmann machines and energy-based models. A Boltzmann machine defines a probability distribution over binary-valued patterns. One can learn the parameters of a Boltzmann machine via gradient-based approaches that increase the log-likelihood of the data. The gradient and Laplacian of a Boltzmann machine admit beautiful mathematical representations, although computing them is in general intractable. This intractability motivates approximate methods, including the Gibbs sampler and contrastive divergence, as well as tractable alternatives, namely energy-based models.
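To make this concrete (in standard Boltzmann-machine notation, not necessarily the paper's own): a Boltzmann machine over binary patterns x ∈ {0,1}^n defines

    p_\theta(x) = \frac{e^{-E_\theta(x)}}{Z_\theta},
    \qquad
    E_\theta(x) = -b^\top x - \tfrac{1}{2}\, x^\top W x,

and the log-likelihood gradient takes the familiar "positive phase minus negative phase" form

    \frac{\partial}{\partial W_{ij}} \log p_\theta(x)
    = \mathbb{E}_{\mathrm{data}}[x_i x_j] - \mathbb{E}_{p_\theta}[x_i x_j].

The second expectation runs over the model distribution and involves the partition function Z_\theta, which is what makes exact computation intractable and motivates the Gibbs sampler and contrastive divergence. As a minimal sketch of the latter, here is one CD-1 update for a restricted Boltzmann machine in NumPy; this is our illustration of the standard algorithm, not code from the paper, and names such as cd1_step and lr are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.01):
    """One contrastive-divergence (CD-1) update for a binary RBM.

    v0: minibatch of visible vectors, shape (batch, n_visible)
    W:  weights, shape (n_visible, n_hidden); b, c: visible/hidden biases.
    """
    # Positive phase: hidden activation probabilities given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)

    # One Gibbs step: reconstruct visibles, then hidden probabilities again.
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)

    # Approximate gradient: data statistics minus (one-step) model statistics.
    batch = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / batch
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

Running cd1_step over minibatches implements the approximate maximum-likelihood training the abstract describes; using more Gibbs steps (CD-k) trades extra computation for a less biased gradient estimate.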

Related research

- Transductive Boltzmann Machines (05/21/2018): We present transductive Boltzmann machines (TBMs), which firstly achieve...
- Boltzmann Encoded Adversarial Machines (04/23/2018): Restricted Boltzmann Machines (RBMs) are a class of generative neural ne...
- Attention in a family of Boltzmann machines emerging from modern Hopfield networks (12/09/2022): Hopfield networks and Boltzmann machines (BMs) are fundamental energy-ba...
- TMI: Thermodynamic inference of data manifolds (11/21/2019): The Gibbs-Boltzmann distribution offers a physically interpretable way t...
- Explaining the effects of non-convergent sampling in the training of Energy-Based Models (01/23/2023): In this paper, we quantify the impact of using non-convergent Markov cha...
- A Langevin-like Sampler for Discrete Distributions (06/20/2022): We propose discrete Langevin proposal (DLP), a simple and scalable gradi...
- Oops I Took A Gradient: Scalable Sampling for Discrete Distributions (02/08/2021): We propose a general and scalable approximate sampling strategy for prob...
