Pavlov Learning Machines

07/02/2022
by Elena Agliari, et al.

As is well known, Hebbian learning traces its origin to Pavlov's classical conditioning; however, while the former has been extensively modelled over the past decades (e.g., by the Hopfield model and countless variations on the theme), the modelling of the latter has remained largely unaddressed so far, and a bridge between these two pillars is entirely lacking. The main difficulty toward this goal lies in the intrinsically different scales of the information involved: Pavlov's theory concerns correlations among concepts that are (dynamically) stored in the synaptic matrix, as exemplified by the celebrated experiment starring a dog and a ringing bell; conversely, Hebb's theory concerns correlations among pairs of adjacent neurons, as summarized by the famous statement "neurons that fire together wire together". In this paper we rely on stochastic-process theory and model neural and synaptic dynamics via Langevin equations to prove that, as long as the neurons' and synapses' timescales are kept largely split, the Pavlov mechanism spontaneously takes place and ultimately gives rise to synaptic weights that recover the Hebbian kernel.
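To make the mechanism concrete, here is a minimal numerical sketch (not the authors' code): it couples a fast Langevin equation for the neurons to a slow one for the synapses, presents patterns as external stimuli, and checks that the resulting couplings align with the Hebbian kernel. All names and parameter values (tau_sigma, tau_J, the pattern schedule, the noise amplitude) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the timescale-split Langevin dynamics described in the
# abstract. Every parameter value below is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)

N, P = 50, 3                               # neurons, stored patterns (assumed sizes)
xi = rng.choice([-1.0, 1.0], size=(P, N))  # random +/-1 patterns

tau_sigma, tau_J = 1.0, 1e3                # fast neural vs. slow synaptic timescale
dt, steps = 0.1, 200_000
noise_amp = 0.1                            # neural noise amplitude (assumed)

sigma = rng.standard_normal(N)             # neural states
J = np.zeros((N, N))                       # synaptic couplings

for t in range(steps):
    # present the patterns cyclically as an external field (Pavlov-style pairing)
    h_ext = xi[(t // 1000) % P]

    # fast Langevin dynamics for the neurons (Euler-Maruyama step)
    drift_sigma = (-sigma + np.tanh(J @ sigma + h_ext)) / tau_sigma
    sigma += drift_sigma * dt + noise_amp * np.sqrt(dt) * rng.standard_normal(N)

    # slow Langevin dynamics for the synapses: relax toward the instantaneous
    # pairwise activity correlation ("fire together, wire together")
    drift_J = (-J + np.outer(sigma, sigma)) / tau_J
    J += drift_J * dt

# compare the learned couplings with the Hebbian kernel (1/N) sum_mu xi_i^mu xi_j^mu
J_hebb = (xi.T @ xi) / N
cos = np.sum(J * J_hebb) / (np.linalg.norm(J) * np.linalg.norm(J_hebb))
print(f"cosine similarity with the Hebbian kernel: {cos:.3f}")
```

With tau_J three orders of magnitude above tau_sigma, the synapses effectively time-average the neural correlations across many stimulus presentations; this averaging is what the timescale split in the abstract refers to, and it drives J toward a matrix proportional to the Hebbian kernel.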
