Interpolating between boolean and extremely high noisy patterns through Minimal Dense Associative Memories

12/02/2019
by Francesco Alemanno, et al.

Recently, Krotov and Hopfield introduced the concept of dense associative memories (DAMs), which, in the jargon of disordered statistical mechanics, are close to spin glasses with P-wise interactions: they proved a number of remarkable properties of these networks and suggested their use to (partially) explain the success of the new generation of Artificial Intelligence. Among these properties, thanks to a remarkable ante-litteram analysis by Baldi and Venkatesh, it is known that these networks can handle a maximal number of stored patterns K scaling as K ∼ N^(P−1). In this paper, after introducing a minimal dense associative network, one of the most elementary cost functions falling in this class of DAMs, we sacrifice this high-load regime (namely, we force the storage of only a linear number of patterns, K = αN with α > 0) in order to prove that, in this regime, these networks can correctly perform pattern recognition even if the pattern signal is O(1) while embedded in a sea of noise of order O(√N), also in the large-N limit. To prove this statement, by extremizing the quenched free energy of the model over its natural order parameters (the various magnetizations and overlaps), we derive its phase diagram at the replica-symmetric level of description and in the thermodynamic limit. As a sideline, we stress that, to achieve this task and aiming at cross-fertilization among disciplines, we follow both of the dominant routes in the statistical mechanics of spin glasses, namely the replica trick and the interpolation technique. Both approaches reach the same conclusion: there is a non-empty region in the noise-T versus load-α phase plane where these networks can actually work in this challenging regime; in particular, we obtain a rather high critical (linear) load in the noiseless limit of the fast noise, lim_(β→∞) α_c(β) = 0.65.
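To fix ideas on the kind of cost function involved, the following minimal sketch (Python/NumPy, not the authors' code) simulates retrieval in a dense associative memory with the standard P-wise energy H(σ) = −N Σ_μ m_μ^P, where m_μ = (1/N) Σ_i ξ_i^μ σ_i is the Mattis magnetization of pattern μ. The sizes N and K, the even order P = 4, the corruption level of the cue, and the helper name retrieve are illustrative assumptions; in particular, the cue here is a Boolean pattern corrupted by random spin flips, a simpler setting than the O(√N)-noisy patterns analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper): load alpha = K/N = 0.1.
N, K, P = 500, 50, 4
xi = rng.choice([-1, 1], size=(K, N))   # K binary patterns of length N

def retrieve(sigma, sweeps=20):
    """Zero-temperature (beta -> infinity) sequential dynamics on the DAM energy
    H(sigma) = -N * sum_mu m_mu^P, with m_mu = (1/N) xi_mu . sigma."""
    sigma = sigma.copy()
    m = xi @ sigma / N                          # Mattis magnetizations
    for _ in range(sweeps):
        for i in rng.permutation(N):
            # Magnetizations after flipping spin i, and the energy change.
            m_flip = m - 2.0 * xi[:, i] * sigma[i] / N
            dH = -N * np.sum(m_flip**P - m**P)
            if dH < 0:                          # accept only energy-lowering flips
                sigma[i] *= -1
                m = m_flip
    return sigma

# Cue: pattern 0 with 20% of its spins flipped at random.
cue = xi[0] * np.where(rng.random(N) < 0.2, -1, 1)
print("overlap with pattern 0:", xi[0] @ retrieve(cue) / N)
```

The load α = 0.1 is deliberately kept small so that retrieval is expected and the printed overlap should approach 1; how far α can be pushed before retrieval breaks down, and how this depends on the noise level, is precisely what the phase diagram derived in the paper quantifies.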


Related research

11/28/2019 · Neural networks with redundant representation: detecting the undetectable
We consider a three-layer Sejnowski machine and show that features learn...

11/25/2022 · Dense Hebbian neural networks: a replica symmetric picture of supervised learning
We consider dense, associative neural-networks trained by a teacher (i.e...

08/08/2023 · Parallel Learning by Multitasking Neural Networks
A modern challenge of Artificial Intelligence is learning multiple patte...

07/15/2020 · Phase diagram for two-layer ReLU neural networks at infinite-width limit
How neural network behaves during the training over different choices of...

11/17/2022 · Thermodynamics of bidirectional associative memories
In this paper we investigate the equilibrium properties of bidirectional...

04/28/2023 · The Exponential Capacity of Dense Associative Memories
Recent generalizations of the Hopfield model of associative memories are...

12/21/2018 · Dreaming neural networks: rigorous results
Recently a daily routine for associative neural networks has been propos...
