Memory Capacity of a Random Neural Network

11/14/2012
by Matt Stowe, et al.

This paper considers the information capacity of a random neural network. The network is represented by square, symmetric weight matrices, and each matrix has a weight bound that determines the highest and lowest possible values of its entries. The examined matrices are randomly generated and analyzed by a computer program. We find the surprising result that the capacity of the network is maximized by the binary random neural network and does not change as the number of quantization levels associated with the weights increases.
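The abstract describes a simulation study but does not spell out the procedure, so the following is a minimal sketch under assumed conventions: a Hopfield-style update rule s -> sign(Ws), off-diagonal weights quantized to evenly spaced levels in [-1, 1] (two levels giving the binary case), and capacity proxied by the average number of stable fixed-point states of a randomly generated symmetric matrix. All function names and parameter choices below are illustrative, not the paper's actual code.

```python
import numpy as np
from itertools import product

def random_symmetric_weights(n, levels, lo=-1.0, hi=1.0, rng=None):
    """Random n x n symmetric weight matrix with zero diagonal.

    Off-diagonal entries are quantized to `levels` evenly spaced
    values in [lo, hi]; levels=2 gives the binary case {lo, hi}.
    (Assumed quantization scheme; the paper's may differ.)
    """
    rng = np.random.default_rng() if rng is None else rng
    steps = rng.integers(0, levels, size=(n, n))
    w = lo + (hi - lo) * steps / (levels - 1)
    w = np.triu(w, k=1)          # keep the strict upper triangle
    return w + w.T               # symmetrize; diagonal stays zero

def count_fixed_points(w):
    """Count bipolar states s in {-1,+1}^n that are fixed points of
    the sign update rule s -> sign(W s), ties broken toward +1."""
    n = w.shape[0]
    count = 0
    for bits in product((-1.0, 1.0), repeat=n):
        s = np.array(bits)
        h = w @ s
        if np.all(np.where(h >= 0, 1.0, -1.0) == s):
            count += 1
    return count

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, trials = 8, 200
    for levels in (2, 4, 8, 16):
        avg = np.mean([count_fixed_points(
                           random_symmetric_weights(n, levels, rng=rng))
                       for _ in range(trials)])
        print(f"levels={levels:2d}  avg fixed points={avg:.2f}")
```

For n = 8, exhaustively enumerating all 2^n bipolar states is cheap; a study at larger n would need to sample candidate states instead of enumerating them.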


Related research

03/12/2014 · Memory Capacity of Neural Networks using a Circulant Weight Matrix
This paper presents results on the memory capacity of a generalized feed...

07/30/2013 · Neural Network Capacity for Multilevel Inputs
This paper examines the memory capacity of generalized neural networks. ...

01/30/2018 · Surjectivity of near square random matrices
We show that a nearly square iid random integral matrix is surjective ov...

07/02/2010 · Delta Learning Rule for the Active Sites Model
This paper reports the results on methods of comparing the memory retrie...

06/13/2022 · Why Quantization Improves Generalization: NTK of Binary Weight Neural Networks
Quantized neural networks have drawn a lot of attention as they reduce t...

11/25/2022 · LU decomposition and Toeplitz decomposition of a neural network
It is well-known that any matrix A has an LU decomposition. Less well-kn...

06/22/2019 · Repeated sequential learning increases memory capacity via effective decorrelation in a recurrent neural network
Memories in neural system are shaped through the interplay of neural and...
