Parallel Learning by Multitasking Neural Networks

08/08/2023
by Elena Agliari, et al.

A modern challenge of Artificial Intelligence is learning multiple patterns at once (i.e., parallel learning). While this cannot be accomplished by standard Hebbian associative neural networks, in this paper we show how the Multitasking Hebbian Network (a variation on the theme of the Hopfield model working on sparse datasets) is naturally able to perform this complex task. We focus on systems processing in parallel a finite (up to logarithmic growth in the size of the network) number of patterns, mirroring the low-storage regime of standard associative neural networks at work in pattern recognition. For mild dilution of the patterns, the network handles them hierarchically, distributing the amplitudes of their signals as power laws w.r.t. their information content (hierarchical regime), while, for strong dilution, the signals pertaining to all the patterns are raised with the same strength (parallel regime). Further, confined to the low-storage setting (i.e., far from the spin-glass limit), the presence of a teacher neither alters the multitasking performance nor changes the thresholds for learning: the latter are the same whether the training protocol is supervised or unsupervised. Results obtained through statistical mechanics, the signal-to-noise technique, and Monte Carlo simulations are in overall perfect agreement and carry interesting insights on multiple learning at once: for instance, whenever the cost function of the model is minimized in parallel over several patterns (in its statistical-mechanics description), the same happens to the standard sum-squared-error loss function typically used in Machine Learning.
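To make the mechanism concrete, here is a minimal sketch (not the authors' code) of parallel retrieval in a Hopfield-like network with diluted patterns: pattern entries are 0 with probability d and ±1 otherwise, couplings are Hebbian, and zero-temperature asynchronous dynamics is run from a noisy mixture of all the patterns. All parameter values (N, K, d) and the initialization below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, d = 2000, 4, 0.5  # neurons, patterns, dilution: P(xi = 0) = d

# Sparse patterns: each entry is 0 w.p. d, otherwise +/-1 with equal probability
xi = rng.choice([-1, 0, 1], size=(K, N), p=[(1 - d) / 2, d, (1 - d) / 2])

# Hebbian couplings, no self-interaction
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Initial state: noisy mixture of all K patterns
s = np.sign(xi.sum(axis=0) + 0.1 * rng.standard_normal(N))
s[s == 0] = 1

# Zero-temperature asynchronous (Glauber-like) dynamics
for _ in range(20):
    for i in rng.permutation(N):
        h = J[i] @ s  # local field on neuron i
        if h != 0:
            s[i] = np.sign(h)

# Mattis magnetizations: overlap of the final state with each pattern
m = xi @ s / N
print("magnetizations:", np.round(m, 3))
```

Because diluted patterns leave many entries at zero, different patterns can occupy largely disjoint subsets of neurons, so at strong dilution several magnetizations typically remain simultaneously nonzero, which is the signature of the parallel regime described above.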


Related research:

11/28/2019 · Neural networks with redundant representation: detecting the undetectable
We consider a three-layer Sejnowski machine and show that features learn...

01/05/2018 · A relativistic extension of Hopfield neural networks via the mechanical analogy
We propose a modification of the cost function of the Hopfield model who...

11/25/2022 · Dense Hebbian neural networks: a replica symmetric picture of supervised learning
We consider dense, associative neural-networks trained by a teacher (i.e...

12/02/2019 · Interpolating between boolean and extremely high noisy patterns through Minimal Dense Associative Memories
Recently, Hopfield and Krotov introduced the concept of dense associati...

07/17/2023 · Statistical Mechanics of Learning via Reverberation in Bidirectional Associative Memories
We study bi-directional associative neural networks that, exposed to noi...

11/25/2022 · Dense Hebbian neural networks: a replica symmetric picture of unsupervised learning
We consider dense, associative neural-networks trained with no supervisi...

08/01/2018 · Stock Chart Pattern recognition with Deep Learning
This study evaluates the performances of CNN and LSTM for recognizing co...
