Dreaming neural networks: forgetting spurious memories and reinforcing pure ones

10/29/2018
by Alberto Fachechi, et al.

The standard Hopfield model for associative neural networks accounts for biological Hebbian learning and acts as the harmonic oscillator of pattern recognition; however, its maximal storage capacity is α ∼ 0.14, far from the theoretical bound for symmetric networks, α = 1. Inspired by the sleeping and dreaming mechanisms of mammalian brains, we propose an extension of this model that displays the standard on-line (awake) learning mechanism (which allows the storage of external information in terms of patterns) together with an off-line (sleep) unlearning-and-consolidating mechanism (which allows spurious-pattern removal and pure-pattern reinforcement): the resulting daily prescription saturates the theoretical bound α = 1 and remains extremely robust against thermal noise. Neural and synaptic features are analyzed both analytically and numerically. In particular, beyond obtaining a phase diagram for the neural dynamics, we focus on synaptic plasticity and give an explicit prescription for the temporal evolution of the synaptic matrix. We analytically prove that our algorithm makes the Hebbian kernel converge, with high probability, to the projection matrix built over the pure stored patterns. Furthermore, we obtain a sharp and explicit estimate of the "sleep rate" needed to ensure such convergence. Finally, we run extensive numerical simulations (mainly Monte Carlo sampling) to check the approximations underlying the analytical investigation (e.g., the whole theory is developed at the replica-symmetric level, as is standard in the Amit-Gutfreund-Sompolinsky reference framework) as well as possible finite-size effects, finding overall full agreement with the theory.
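The convergence claim above can be illustrated numerically. The following is a minimal Python sketch, not the authors' code: it assumes the interpolating kernel J(t) = (1/N) ξᵀ[(1 + t)(𝟙 + tC)⁻¹]ξ, with C the pattern correlation matrix, consistent with the abstract's description; this reduces to the Hebbian matrix at sleep time t = 0 and approaches the projector onto the stored patterns as t grows.

```python
import numpy as np

# Sketch of the "dreaming" synaptic evolution described in the abstract
# (assumed closed form, not the authors' implementation): starting from
# the Hebbian kernel, increasing sleep time t drives the coupling matrix
# toward the projector onto the span of the stored patterns.

rng = np.random.default_rng(0)
N, P = 500, 50                             # neurons, stored patterns
xi = rng.choice([-1.0, 1.0], size=(P, N))  # binary patterns xi^mu

C = xi @ xi.T / N                          # P x P pattern correlation matrix

def dreaming_kernel(t):
    """Coupling matrix after sleep time t.

    J(t) = (1/N) xi^T [(1 + t)(1 + t C)^(-1)] xi:
    t = 0 gives the pure Hebbian couplings, t -> infinity gives the
    projection matrix onto the pattern subspace.
    """
    K = (1.0 + t) * np.linalg.inv(np.eye(P) + t * C)
    return xi.T @ K @ xi / N

J_hebb = dreaming_kernel(0.0)    # standard Hopfield/Hebbian kernel
J_sleep = dreaming_kernel(1e4)   # long sleep: close to the projector

# Reference projector onto span{xi^1, ..., xi^P} (C is invertible with
# high probability for random patterns when P < N)
Proj = xi.T @ np.linalg.inv(C) @ xi / N

print("||J(0)   - Proj|| =", np.linalg.norm(J_hebb - Proj))
print("||J(1e4) - Proj|| =", np.linalg.norm(J_sleep - Proj))
```

Running this, the distance to the projector shrinks by orders of magnitude as t increases, mirroring the paper's analytical convergence result.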


Related research

12/21/2018 · Dreaming neural networks: rigorous results
Recently a daily routine for associative neural networks has been propos...

11/25/2022 · Dense Hebbian neural networks: a replica symmetric picture of supervised learning
We consider dense, associative neural-networks trained by a teacher (i.e...

11/25/2022 · Dense Hebbian neural networks: a replica symmetric picture of unsupervised learning
We consider dense, associative neural-networks trained with no supervisi...

05/31/2018 · Forgetting Memories and their Attractiveness
We study numerically the memory which forgets, introduced in 1986 by Par...

11/28/2019 · Neural networks with redundant representation: detecting the undetectable
We consider a three-layer Sejnowski machine and show that features learn...

04/17/2022 · Recurrent neural networks that generalize from examples and optimize by dreaming
The gap between the huge volumes of data needed to train artificial neur...

11/17/2022 · Thermodynamics of bidirectional associative memories
In this paper we investigate the equilibrium properties of bidirectional...
