Deep Belief Network Training Improvement Using Elite Samples Minimizing Free Energy

11/14/2014
by Mohammad Ali Keyvanrad, et al.

Nowadays it is very popular to use deep architectures in machine learning. Deep Belief Networks (DBNs) are deep architectures that use a stack of Restricted Boltzmann Machines (RBMs) to create a powerful generative model from training data. In this paper we present an improvement to a common method used in training RBMs. The new method uses free energy as a criterion to obtain elite samples from the generative model. We argue that these samples allow a more accurate computation of the gradient of the log probability of the training data. According to the results, an error rate of 0.99% was achieved on the MNIST test set. This result shows that the proposed method outperforms the method presented in the first paper introducing DBNs (1.25% error rate) and general classification methods such as SVM (1.4% error rate). In another test using the ISOLET dataset, the letter classification error dropped to 3.59% on this dataset. The implemented method is available online at "http://ceit.aut.ac.ir/~keyvanrad/DeeBNet Toolbox.html".
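The elite-sample idea described above can be sketched in code. The free-energy formula below is the standard one for a binary RBM, F(v) = -v·b_v - Σ_j log(1 + exp(b_h_j + (vW)_j)); the selection function, its name, and the toy parameters are illustrative assumptions, not the authors' exact procedure from the paper.

```python
import numpy as np

def free_energy(v, W, b_visible, b_hidden):
    """Standard binary-RBM free energy:
    F(v) = -v.b_visible - sum_j log(1 + exp(b_hidden_j + (vW)_j))."""
    linear_term = v @ b_visible
    # logaddexp(0, x) computes log(1 + exp(x)) in a numerically stable way
    hidden_term = np.logaddexp(0.0, v @ W + b_hidden).sum(axis=1)
    return -linear_term - hidden_term

def select_elite(samples, W, b_visible, b_hidden, k):
    """Keep the k model samples with the lowest free energy, i.e. the ones
    the RBM assigns the highest (unnormalized) probability."""
    F = free_energy(samples, W, b_visible, b_hidden)
    elite_idx = np.argsort(F)[:k]
    return samples[elite_idx]

# Toy demo with a randomly initialized RBM (6 visible, 4 hidden units)
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(6, 4))
b_visible = np.zeros(6)
b_hidden = np.zeros(4)
samples = rng.integers(0, 2, size=(100, 6)).astype(float)

elite = select_elite(samples, W, b_visible, b_hidden, k=10)
print(elite.shape)  # (10, 6)
```

In a training loop, such elite samples would replace ordinary Gibbs-chain samples in the negative phase of the gradient estimate; here only the selection step is shown.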
