Dense Hebbian neural networks: a replica symmetric picture of supervised learning

11/25/2022
by Elena Agliari, et al.

We consider dense, associative neural networks trained by a teacher (i.e., with supervision) and we investigate their computational capabilities analytically, via the statistical mechanics of spin glasses, and numerically, via Monte Carlo simulations. In particular, we obtain a phase diagram that summarizes their performance as a function of control parameters such as the quality and quantity of the training dataset, the network storage, and the noise; this diagram is valid in the limit of large network size and structureless datasets. These networks may work in an ultra-storage regime (where they can handle a huge number of patterns compared with shallow neural networks) or in an ultra-detection regime (where they can perform pattern recognition at signal-to-noise ratios that are prohibitive for shallow neural networks). Guided by this theory for random (structureless) datasets as a reference framework, we also numerically test the learning, storing, and retrieval capabilities shown by these networks on structured datasets such as MNIST and Fashion-MNIST. As technical remarks: on the analytic side, we implement large-deviations and stability analyses within Guerra's interpolation to tackle the non-Gaussian distributions involved in the post-synaptic potentials; on the computational side, we insert the Plefka approximation into the Monte Carlo scheme to speed up the evaluation of the synaptic tensors. Overall, we obtain a novel and broad approach to investigate supervised learning in neural networks beyond the shallow limit.
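To make the setting concrete, here is a minimal, hypothetical sketch of pattern retrieval in a dense (p-spin) Hebbian network via single-spin-flip Metropolis Monte Carlo. The parameters (N, K, p, beta), the energy with Mattis overlaps raised to the p-th power, and the random structureless patterns are illustrative assumptions for a toy run, not the paper's actual experimental setup (which also involves supervised training and the Plefka approximation).

```python
import numpy as np

# Illustrative toy parameters (assumptions, not the paper's values).
rng = np.random.default_rng(0)
N, K, p = 200, 10, 4                     # neurons, stored patterns, interaction order
xi = rng.choice([-1, 1], size=(K, N))    # random binary (structureless) patterns

def energy(sigma):
    # Dense Hebbian energy: E(sigma) = -N * sum_mu m_mu^p,
    # with Mattis overlaps m_mu = (1/N) * sum_i xi_mu,i * sigma_i.
    m = xi @ sigma / N
    return -N * np.sum(m ** p)

def metropolis_sweep(sigma, beta):
    # One sweep of single-spin-flip Metropolis dynamics at inverse temperature beta.
    E = energy(sigma)
    for i in rng.permutation(N):
        sigma[i] *= -1                   # propose flipping spin i
        E_new = energy(sigma)
        if E_new <= E or rng.random() < np.exp(-beta * (E_new - E)):
            E = E_new                    # accept the flip
        else:
            sigma[i] *= -1               # reject: undo the flip

# Start from a corrupted copy of pattern 0 (about 20% of spins flipped)
# and check that the dynamics retrieves it.
sigma = xi[0] * rng.choice([1, -1], size=N, p=[0.8, 0.2])
for _ in range(20):
    metropolis_sweep(sigma, beta=2.0)
final_overlap = abs(xi[0] @ sigma) / N
print(f"overlap with pattern 0: {final_overlap:.2f}")
```

For p = 2 this reduces to the standard Hopfield model; larger p deepens the retrieval basins, which is the mechanism behind the ultra-storage and ultra-detection regimes discussed above.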


Related research

Dense Hebbian neural networks: a replica symmetric picture of unsupervised learning (11/25/2022)
We consider dense, associative neural-networks trained with no supervisi...

Neural networks with redundant representation: detecting the undetectable (11/28/2019)
We consider a three-layer Sejnowski machine and show that features learn...

Dreaming neural networks: forgetting spurious memories and reinforcing pure ones (10/29/2018)
The standard Hopfield model for associative neural networks accounts for...

Interpolating between boolean and extremely high noisy patterns through Minimal Dense Associative Memories (12/02/2019)
Recently, Hopfield and Krotov introduced the concept of dense associati...

Parallel Learning by Multitasking Neural Networks (08/08/2023)
A modern challenge of Artificial Intelligence is learning multiple patte...

The emergence of a concept in shallow neural networks (09/01/2021)
We consider restricted Boltzmann machines (RBMs) trained over an unstruct...

Dreaming neural networks: rigorous results (12/21/2018)
Recently a daily routine for associative neural networks has been propos...
