Memory Capacity of Neural Networks using a Circulant Weight Matrix

03/12/2014
by Vamsi Sashank Kotagiri, et al.

This paper presents results on the memory capacity of a generalized feedback neural network whose weight matrix is circulant. Children are capable of learning soon after birth, which indicates that the neural networks of the brain have an innate, pre-learned capacity that is a consequence of the regular structures in the brain's organization. Motivated by this idea, we consider the capacity of feedback networks that use circulant matrices as weight matrices.
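As a rough illustration of the setting, the sketch below builds a circulant weight matrix (each row a cyclic shift of the first) and iterates a bipolar feedback network of the Hopfield type until it settles into a fixed point; stable fixed points play the role of stored memories. The sign activation, zeroed diagonal, and small integer weights are assumptions chosen for the example, not details taken from the paper.

```python
import numpy as np
from scipy.linalg import circulant

def make_circulant_weights(first_column):
    """Build a circulant weight matrix from its first column."""
    W = circulant(first_column).astype(float)
    np.fill_diagonal(W, 0.0)           # no self-feedback (a common assumption)
    return W

def run_feedback(W, state, max_steps=50):
    """Synchronously update a bipolar (+1/-1) state until it stops changing."""
    for _ in range(max_steps):
        new_state = np.sign(W @ state)
        new_state[new_state == 0] = 1  # break ties toward +1
        if np.array_equal(new_state, state):
            break                      # fixed point reached
        state = new_state
    return state

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 8
    # Small integer entries in {-1, 0, 1} define the circulant structure.
    W = make_circulant_weights(rng.integers(-1, 2, size=n))
    probe = rng.choice([-1, 1], size=n)
    print("fixed point reached:", run_feedback(W, probe))
```

Counting the distinct fixed points reached from many random probe states gives a simple empirical way to explore how much a given circulant weight matrix can store, in the spirit of the capacity question the paper studies.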


