RSM
Implementation of the Replicated Softmax model
view repo
We introduce a Deep Boltzmann Machine model suitable for modeling and extracting latent semantic representations from a large unstructured collection of documents. We overcome the apparent difficulty of training a DBM with judicious parameter tying. This parameter tying enables an efficient pretraining algorithm and a state initialization scheme that aids inference. The model can be trained just as efficiently as a standard Restricted Boltzmann Machine. Our experiments show that the model assigns better log probability to unseen data than the Replicated Softmax model. Features extracted from our model outperform LDA, Replicated Softmax, and DocNADE models on document retrieval and document classification tasks.
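The abstract compares against the Replicated Softmax model, an RBM over word-count vectors in which the hidden-unit biases are scaled by the document length. As a rough illustration of that baseline (not the Deep Boltzmann Machine proposed here), the sketch below trains a Replicated Softmax RBM with one step of contrastive divergence (CD-1); all class and variable names are hypothetical, and hyperparameters are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class ReplicatedSoftmaxRBM:
    """Minimal Replicated Softmax RBM: visible = word-count vector over a
    vocabulary, hidden = binary units. Hypothetical sketch, not the DBM
    from the paper."""

    def __init__(self, n_vocab, n_hidden, lr=0.01):
        self.W = 0.01 * rng.standard_normal((n_vocab, n_hidden))
        self.b_v = np.zeros(n_vocab)   # visible (word) biases
        self.b_h = np.zeros(n_hidden)  # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        # Document length D scales the hidden bias -- the "replicated"
        # softmax shares weights across all D word positions.
        D = v.sum()
        return sigmoid(v @ self.W + D * self.b_h)

    def cd1_step(self, v):
        # Positive phase: hidden probabilities given the observed counts.
        h_pos = self.hidden_probs(v)
        h_sample = (rng.random(h_pos.shape) < h_pos).astype(float)
        # Negative phase: reconstruct the document as D multinomial draws
        # from the softmax over the vocabulary.
        D = int(v.sum())
        p_v = softmax(h_sample @ self.W.T + self.b_v)
        v_neg = rng.multinomial(D, p_v).astype(float)
        h_neg = self.hidden_probs(v_neg)
        # CD-1 gradient estimates: positive minus negative statistics.
        self.W += self.lr * (np.outer(v, h_pos) - np.outer(v_neg, h_neg))
        self.b_v += self.lr * (v - v_neg)
        self.b_h += self.lr * D * (h_pos - h_neg)
```

After training on count vectors, `hidden_probs` gives the latent document representation that the paper's retrieval and classification experiments would use as features.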