New Insights on Learning Rules for Hopfield Networks: Memory and Objective Function Minimisation

10/04/2020
by Pavel Tolmachev, et al.

Hopfield neural networks are a possible basis for modelling associative memory in living organisms. After summarising previous studies in the field, we take a new look at learning rules, exhibiting them as descent-type algorithms for various cost functions. We also propose several new cost functions suitable for learning. We discuss the role of biases (the external inputs) in the learning process in Hopfield networks. Furthermore, we apply Newton's method for learning memories, and experimentally compare the performance of various learning rules. Finally, to add to the debate on whether allowing connections of a neuron to itself enhances memory capacity, we numerically investigate the effects of self-coupling.

Keywords: Hopfield Networks, associative memory, content addressable memory, learning rules, gradient descent, attractor networks
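To make the setting concrete, here is a minimal NumPy sketch of the classical Hebbian (outer-product) learning rule for a Hopfield network, one of the rules the paper revisits. Function names and the choice of synchronous updates are illustrative, not taken from the paper; the diagonal of the weight matrix is zeroed, i.e. no self-coupling, which is the conventional baseline the paper's self-coupling experiments depart from.

```python
import numpy as np

def train_hebbian(patterns):
    """Store binary (+/-1) patterns via the Hebbian rule W = (1/N) * sum_mu p_mu p_mu^T."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    W /= n
    np.fill_diagonal(W, 0.0)  # conventional choice: no self-coupling
    return W

def recall(W, state, max_steps=10):
    """Synchronous updates s <- sign(W s) until a fixed point or the step limit."""
    s = state.copy()
    for _ in range(max_steps):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s

# Store one pattern, then recover it from a one-bit-corrupted cue.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hebbian(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]          # flip one bit
restored = recall(W, noisy)   # converges back to the stored pattern
```

The descent view discussed in the abstract reinterprets such rules: each learning step decreases some cost function over the weights, which is what licenses replacing plain gradient steps with second-order updates such as Newton's method.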


