Improving Sparse Associative Memories by Escaping from Bogus Fixed Points

08/27/2013
by Zhe Yao, et al.

The Gripon-Berrou neural network (GBNN) is a recently proposed recurrent neural network that employs an LDPC-like sparse encoding, which makes it extremely resilient to noise and errors. A natural use of the GBNN is as an associative memory. There are two activation rules for the neuron dynamics, namely sum-of-sum and sum-of-max, and the latter outperforms the former in retrieval rate by a wide margin. Prior discussions and experiments suggested that, although sum-of-sum may lead the network to oscillate, sum-of-max always converges to an ensemble of neuron cliques corresponding to previously stored patterns. However, this is not entirely correct: sum-of-max often converges to bogus fixed points, in which the ensemble of stored cliques comprises only a small subset of the converged state. By taking advantage of this overlooked fact, we can greatly improve the retrieval rate. We discuss this issue and propose a number of heuristics to push sum-of-max beyond these bogus fixed points. To tackle the problem directly and completely, we also develop a novel post-processing algorithm customized to the structure of the GBNN. Experimental results show that the new algorithm achieves a large performance boost in both retrieval rate and run time, compared with standard sum-of-max and all the other heuristics.
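
To make the two activation rules concrete, below is a minimal NumPy sketch of GBNN retrieval, assuming a network of C clusters with L neurons each and a binary connection matrix W built from stored cliques. The names, sizes, memory term, and iteration loop are illustrative assumptions for this sketch, not the paper's reference implementation.

```python
import numpy as np

C, L = 4, 16                               # assumed: 4 clusters of 16 neurons
rng = np.random.default_rng(0)
W = np.zeros((C * L, C * L), dtype=bool)   # binary connection matrix

def store(pattern):
    """Store a pattern (one neuron index per cluster) as a fully connected clique."""
    idx = [c * L + pattern[c] for c in range(C)]
    for i in idx:
        for j in idx:
            if i != j:
                W[i, j] = True

def winner_take_all(scores):
    """Within each cluster, activate every neuron reaching that cluster's max score."""
    s = scores.reshape(C, L)
    return (s == s.max(axis=1, keepdims=True)).reshape(-1).astype(int)

def sum_of_sum(state):
    """Score each neuron by the total number of active neighbours;
    several active neurons in one cluster all add to the score."""
    scores = W.astype(int) @ state + state   # + state: reinforce active neurons
    return winner_take_all(scores)

def sum_of_max(state):
    """Score each neuron by the number of clusters that contain at least
    one active neighbour; each cluster contributes at most once."""
    active = state.reshape(C, L).astype(bool)
    per_cluster = W.reshape(C * L, C, L) & active[None, :, :]
    scores = per_cluster.any(axis=2).sum(axis=1) + state
    return winner_take_all(scores)

# Retrieve a stored pattern from a partial cue: the last cluster is erased.
pattern = rng.integers(0, L, size=C)
store(pattern)
state = np.zeros(C * L, dtype=int)
for c in range(C - 1):
    state[c * L + pattern[c]] = 1
for _ in range(5):                          # iterate until (hopefully) a fixed point
    state = sum_of_max(state)
print(state.reshape(C, L).argmax(axis=1), "vs stored", pattern)
```

The design difference sits in the scoring step: sum_of_sum lets several active neurons within one cluster inflate a neuron's score, which is what permits oscillation, whereas sum_of_max caps each cluster's contribution at one, so the dynamics settle, but possibly at a fixed point whose active set is larger than any stored clique.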

