About Learning in Recurrent Bistable Gradient Networks
Recurrent Bistable Gradient Networks are attractor-based neural networks in which each single neuron exhibits bistable dynamics. When the neurons are coupled by linear interactions determined by the interconnection weights, the resulting networks no longer suffer from spurious states or severely limited capacity. Vladimir Chinarov and Michael Menzinger, who introduced these networks, trained them using Hebb's learning rule. We show that this way of computing the weights leads to unwanted behaviour and limits the network's capabilities. Furthermore, we demonstrate that using the first order of Hinton's Contrastive Divergence algorithm yields a quite promising recurrent neural network. These findings are tested by learning images from the MNIST database of handwritten digits.
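To make the two learning rules concrete, the following Python sketch contrasts them for ±1-valued patterns. The function names, the learning rate lr, and the single deterministic sign update standing in for the network's bistable relaxation are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def hebbian_weights(patterns):
    # Classic Hebb rule: superimpose the outer products of the
    # stored +/-1 patterns and zero the diagonal.
    # patterns: array of shape (P, N), one pattern per row.
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def cd1_step(W, patterns, lr=0.01):
    # One sweep of first-order Contrastive Divergence (CD-1):
    # move the weights toward the data correlations and away
    # from the correlations of a one-step reconstruction.
    # The sign update below is an assumed stand-in for the
    # network's bistable relaxation dynamics.
    recon = np.sign(patterns @ W)
    recon[recon == 0] = 1
    pos = patterns.T @ patterns   # statistics of the data
    neg = recon.T @ recon         # statistics of the reconstruction
    W = W + lr * (pos - neg) / patterns.shape[0]
    np.fill_diagonal(W, 0.0)
    return W
```

In this simplified picture, CD-1 differs from the Hebb rule only by the subtracted reconstruction term, which suppresses correlations the network regenerates on its own and thereby counteracts unwanted attractors.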