Dynamic Capacity Estimation in Hopfield Networks

09/15/2017
by Saarthak Sarup, et al.

Understanding the memory capacity of neural networks remains a challenging problem in implementing artificial intelligence systems. In this paper, we address the notion of capacity with respect to Hopfield networks and propose a dynamic approach to monitoring a network's capacity. We define capacity as the maximum number of stored patterns that can be retrieved when the network is probed with those same patterns. Prior work in this area has presented static expressions dependent only on the neuron count N, forcing network designers to assume worst-case input characteristics for bias and correlation when setting the capacity of the network. Instead, our model operates simultaneously with the learning Hopfield network and produces a capacity estimate based on the patterns that were actually stored. By continuously updating the crosstalk associated with the stored patterns, our model guards the network from overwriting its memory traces and exceeding its capacity. We simulate our model using artificially generated random patterns, which can be set to a desired bias and correlation, and observe capacity estimates between 93% and 97% higher in these networks than the static, worst-case capacity estimate, while minimizing the risk of lost patterns.
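The sketch below illustrates the general idea of crosstalk-based capacity monitoring described in the abstract; it is not the authors' exact model. It assumes a standard binary (±1) Hopfield network with Hebbian learning and uses the textbook signal-to-crosstalk criterion (a crosstalk term above 1 can flip a bit of a stored pattern when the network is probed with that pattern). The class and function names, and the threshold value, are illustrative assumptions.

```python
import numpy as np

def crosstalk(patterns, nu):
    """Per-neuron crosstalk on stored pattern `nu` from all other stored patterns.

    Standard Hebbian analysis: a value > 1 at neuron i means bit i of pattern
    `nu` would flip when the network is probed with that pattern itself.
    (The small j = i self-term is ignored for simplicity.)
    """
    X = np.asarray(patterns)            # shape (P, N), entries in {-1, +1}
    P, N = X.shape
    target = X[nu]
    others = np.delete(X, nu, axis=0)   # the P-1 interfering patterns
    overlaps = others @ target / N      # overlap of each interferer with pattern nu
    field = others.T @ overlaps         # crosstalk field at each neuron
    return -target * field              # projected against the correct sign

class MonitoredHopfield:
    """Hopfield network that refuses to store a pattern once crosstalk gets too high."""

    def __init__(self, n_neurons, threshold=1.0):
        self.n = n_neurons
        self.threshold = threshold      # illustrative: textbook bit-flip criterion
        self.W = np.zeros((n_neurons, n_neurons))
        self.patterns = []

    def try_store(self, pattern):
        """Store `pattern` only if crosstalk stays below threshold for every memory."""
        candidate = self.patterns + [list(pattern)]
        if len(candidate) > 1:
            worst = max(crosstalk(candidate, nu).max() for nu in range(len(candidate)))
            if worst >= self.threshold:
                return False            # dynamic capacity reached for this input stream
        # Hebbian update for the accepted pattern
        self.W += np.outer(pattern, pattern) / self.n
        np.fill_diagonal(self.W, 0)
        self.patterns.append(list(pattern))
        return True

# Example: stream random unbiased patterns until the monitor says stop.
rng = np.random.default_rng(0)
net = MonitoredHopfield(n_neurons=100)
stored = 0
while net.try_store(rng.choice([-1, 1], size=100)):
    stored += 1
print(f"dynamic capacity estimate for this stream: {stored} patterns")
```

Because the check is run against the patterns actually presented, biased or correlated input streams yield their own capacity estimate rather than a worst-case one, which is the behaviour the paper's dynamic approach is after.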


