Dreaming neural networks: rigorous results

12/21/2018
by Elena Agliari, et al.

Recently, a daily routine for associative neural networks has been proposed: the network learns via Hebb's rule during its awake state (thus behaving as a standard Hopfield model); then, during its sleep state, it optimizes information storage by consolidating pure patterns and removing spurious ones. This forces the synaptic matrix to collapse to the projector matrix (ultimately approaching the Kanter-Sompolinsky model). The procedure keeps learning Hebbian-based (a biological must) yet, by taking advantage of a (properly stylized) sleep phase, still reaches the maximal critical capacity (for symmetric interactions). So far, this emerging picture (as well as the bulk of the literature on unlearning techniques) has been supported solely by mathematically challenging routes, mainly replica-trick analyses and numerical simulations. Here, instead, we rely extensively on Guerra's interpolation techniques developed for neural networks and, in particular, we extend the generalized stochastic-stability approach to the present case. Confining our description to the replica-symmetric approximation (the level at which the previous analyses were carried out), we entirely confirm the picture painted by this generalization (and by the pre-existing variations on the theme). Furthermore, still relying on Guerra's schemes, we develop a systematic fluctuation analysis to check where ergodicity breaks down (an analysis entirely absent in previous investigations). We find that, as long as the network is awake, ergodicity is bounded by the Amit-Gutfreund-Sompolinsky critical line (as it should be), but, as the network sleeps, spin-glass states are destroyed and both the retrieval and the ergodic regions expand. After a full sleep session, the only surviving regions are the retrieval and ergodic ones, which allows the network to achieve the perfect-retrieval regime (where the number of storable patterns equals the number of neurons in the network).
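
As a concrete numerical illustration of the wake/sleep routine summarized above, the sketch below builds the interpolating synaptic matrix J(t) = (1/N) xi^T (1+t)(I + t C)^{-1} xi, i.e. the "dreaming" kernel introduced in the companion paper listed below (Dreaming neural networks: forgetting spurious memories and reinforcing pure ones), where C is the pattern correlation matrix and t is the sleep duration: t = 0 recovers the Hebbian (Hopfield) couplings, while t going to infinity recovers the Kanter-Sompolinsky projector couplings. This is a minimal sketch under that assumption; the variable names and sanity checks are ours, not the paper's.

```python
import numpy as np

# Minimal sketch of the wake/sleep ("dreaming") interpolation, assuming the
# kernel J(t) = (1/N) xi^T (1+t)(I + t C)^{-1} xi from the companion paper;
# names and checks are illustrative, not taken from the paper itself.

rng = np.random.default_rng(0)
N, K = 200, 20                              # neurons, stored patterns (alpha = K/N)
xi = rng.choice([-1.0, 1.0], size=(K, N))   # K random binary patterns of length N

C = xi @ xi.T / N                           # K x K pattern correlation matrix

def coupling(t):
    """Synaptic matrix after a sleep of duration t (t = 0: awake/Hopfield)."""
    kernel = (1.0 + t) * np.linalg.inv(np.eye(K) + t * C)
    return xi.T @ kernel @ xi / N

J_awake = coupling(0.0)                     # pure Hebbian couplings
J_slept = coupling(1e6)                     # long sleep session
J_proj  = xi.T @ np.linalg.inv(C) @ xi / N  # Kanter-Sompolinsky projector

print(np.allclose(J_awake, xi.T @ xi / N))  # True: t = 0 is the Hopfield model
print(np.abs(J_slept - J_proj).max())       # close to 0: sleep approaches the projector
```

As t grows, the kernel (1+t)(I + t C)^{-1} flows from the identity (pure Hebb) to C^{-1} (projector), which is the mechanism by which a stylized sleep phase removes spurious minima and lets the storage capacity rise from the Amit-Gutfreund-Sompolinsky value toward the perfect-retrieval limit of one pattern per neuron.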

Related research:

10/29/2018 - Dreaming neural networks: forgetting spurious memories and reinforcing pure ones
  The standard Hopfield model for associative neural networks accounts for...

07/17/2023 - Statistical Mechanics of Learning via Reverberation in Bidirectional Associative Memories
  We study bi-directional associative neural networks that, exposed to noi...

11/17/2022 - Thermodynamics of bidirectional associative memories
  In this paper we investigate the equilibrium properties of bidirectional...

11/25/2022 - Dense Hebbian neural networks: a replica symmetric picture of supervised learning
  We consider dense, associative neural-networks trained by a teacher (i.e...

04/17/2022 - Recurrent neural networks that generalize from examples and optimize by dreaming
  The gap between the huge volumes of data needed to train artificial neur...

12/02/2019 - Interpolating between boolean and extremely high noisy patterns through Minimal Dense Associative Memories
  Recently, Hopfield and Krotov introduced the concept of dense associati...
