Boltzmann Machine Learning with the Latent Maximum Entropy Principle

10/19/2012
by Shaojun Wang, et al.

We present a new statistical learning paradigm for Boltzmann machines based on a new inference principle we have proposed: the latent maximum entropy principle (LME). LME differs both from Jaynes' maximum entropy principle and from standard maximum likelihood estimation. We demonstrate the LME principle by deriving new algorithms for Boltzmann machine parameter estimation, and show how a robust and fast new variant of the EM algorithm can be developed. Our experiments show that estimation based on LME generally yields better results than maximum likelihood estimation, particularly when inferring hidden units from small amounts of data.
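The LME algorithms derived in the paper are not reproduced here, but as context for the maximum-likelihood baseline the abstract compares against, the following is a minimal sketch of standard maximum-likelihood learning for a small, fully visible Boltzmann machine. Model expectations are computed exactly by enumerating all 2^n states (feasible only for tiny n); all function names are illustrative, not from the paper.

```python
import itertools
import numpy as np

def energy(v, W, b):
    # E(v) = -0.5 v^T W v - b^T v, with W symmetric and zero-diagonal.
    return -0.5 * v @ W @ v - b @ v

def model_expectations(W, b, n):
    # Exact "negative phase" statistics via enumeration of all 2^n states.
    states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
    logits = np.array([-energy(s, W, b) for s in states])
    p = np.exp(logits - logits.max())
    p /= p.sum()
    Ev = p @ states                          # <v_i> under the model
    Evv = states.T @ (states * p[:, None])   # <v_i v_j> under the model
    return Ev, Evv

def fit(data, steps=2000, lr=0.1):
    # Gradient ascent on the log-likelihood: data moments minus model moments.
    n = data.shape[1]
    W = np.zeros((n, n))
    b = np.zeros(n)
    Dv = data.mean(axis=0)                   # "positive phase" statistics
    Dvv = data.T @ data / len(data)
    for _ in range(steps):
        Ev, Evv = model_expectations(W, b, n)
        gW = Dvv - Evv
        np.fill_diagonal(gW, 0.0)            # no self-connections
        W += lr * (gW + gW.T) / 2            # keep W symmetric
        b += lr * (Dv - Ev)
    return W, b
```

Because the fully visible model is an exponential family in the pairwise statistics, this objective is concave and the fitted moments match the data moments at the optimum. The hidden-unit case the paper addresses loses this concavity, which is where EM-style procedures (and LME) come in.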

