Neural Network Capacity for Multilevel Inputs

07/30/2013
by Matt Stowe, et al.

This paper examines the memory capacity of generalized neural networks. Hopfield networks trained with a variety of learning techniques are investigated for their capacity on both binary and non-binary alphabets. It is shown that capacity can be increased substantially when multilevel inputs are used. New learning strategies are proposed to increase Hopfield network capacity, and the scalability of these methods is examined with respect to the size of the network. The ability to recall entire patterns from stimulation of a single neuron is also examined for the increased-capacity networks.
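The paper's specific learning strategies are not reproduced here, but the baseline it builds on, a binary Hopfield network trained with the standard Hebbian (outer-product) rule, can be sketched as follows. All names and parameters below are illustrative; the classic result being probed is that Hebbian storage degrades beyond roughly 0.138n random binary patterns for n neurons.

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian (outer-product) rule: W = (1/n) * sum of pattern outer products.
    patterns: array of shape (p, n) with entries in {-1, +1}."""
    p, n = patterns.shape
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, state, steps=50):
    """Synchronous sign updates until a fixed point or the step limit."""
    s = state.copy()
    for _ in range(steps):
        new = np.sign(W @ s)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, s):
            break
        s = new
    return s

# Store a few random bipolar patterns in a 100-neuron network
# (well under the ~0.138*n Hebbian capacity estimate).
rng = np.random.default_rng(0)
n, p = 100, 5
patterns = rng.choice([-1, 1], size=(p, n))
W = train_hebbian(patterns)

# Corrupt 10% of one pattern's bits, then recall it.
probe = patterns[0].copy()
flip = rng.choice(n, size=10, replace=False)
probe[flip] *= -1
recovered = recall(W, probe)
print("overlap with stored pattern:", np.mean(recovered == patterns[0]))
```

At this loading (p/n = 0.05) the corrupted probe settles back onto the stored pattern with high probability; pushing p toward and past the capacity limit is where crosstalk between patterns makes retrieval fail, which is the regime the paper's multilevel inputs and alternative learning rules aim to improve.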


Related research

Using Variable Threshold to Increase Capacity in a Feedback Neural Network (03/25/2011)
The article presents new results on the use of variable thresholds to in...

Dynamic Capacity Estimation in Hopfield Networks (09/15/2017)
Understanding the memory capacity of neural networks remains a challengi...

Repeated sequential learning increases memory capacity via effective decorrelation in a recurrent neural network (06/22/2019)
Memories in neural system are shaped through the interplay of neural and...

Conceptual capacity and effective complexity of neural networks (03/13/2021)
We propose a complexity measure of a neural network mapping function bas...

Memory Capacity of a Random Neural Network (11/14/2012)
This paper considers the problem of information capacity of a random neu...

Neural Networks with Complex and Quaternion Inputs (07/18/2006)
This article investigates Kak neural networks, which can be instantaneou...

High-Capacity Expert Binary Networks (10/07/2020)
Network binarization is a promising hardware-aware direction for creatin...
