Learning may need only a few bits of synaptic precision

02/12/2016
by Carlo Baldassi, et al.

Learning in neural networks poses peculiar challenges when using discretized rather than continuous synaptic states. The choice of discrete synapses is motivated by biological reasoning and experiments, and possibly by hardware implementation considerations as well. In this paper we build on a previous large deviations analysis which unveiled the existence of peculiar dense regions in the space of synaptic states, regions which account for the possibility of learning efficiently in networks with binary synapses. We extend this analysis to synapses with multiple states and with generally more plausible biological features. The results clearly indicate that the overall qualitative picture is unchanged with respect to the binary case, and is very robust to variations in the details of the model. We also provide quantitative results which suggest that the advantages of increasing the synaptic precision (i.e., the number of internal synaptic states) vanish rapidly after the first few bits, and therefore that, for practical applications, only a few bits may be needed for near-optimal performance, consistent with recent biological findings. Finally, we demonstrate how the theoretical analysis can be exploited to design efficient algorithmic search strategies.
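To make the notion of a small number of internal synaptic states concrete, the following is a minimal, purely illustrative Python sketch, not the algorithm studied in the paper: a perceptron whose weights are restricted to a few integer levels and trained with a simple clipped perceptron rule. The function name, the mapping from bits to levels, and all parameter values are hypothetical choices made for this example only.

# Illustrative sketch (not the paper's algorithm): a perceptron whose synapses
# are restricted to a few integer levels, trained with a clipped perceptron rule.
import numpy as np

def train_discrete_perceptron(patterns, labels, bits=2, epochs=100, seed=0):
    """Weights are integers clipped to [-w_max, w_max], roughly 2**bits levels."""
    rng = np.random.default_rng(seed)
    n = patterns.shape[1]
    w_max = max(2 ** bits // 2, 1)
    w = rng.integers(-w_max, w_max + 1, size=n)
    for _ in range(epochs):
        errors = 0
        for x, y in zip(patterns, labels):
            if np.sign(w @ x) != y:              # misclassified pattern
                # move each synapse one discrete step toward the correct label,
                # then clip back into the allowed range of levels
                w = np.clip(w + y * x, -w_max, w_max)
                errors += 1
        if errors == 0:                          # all patterns stored
            break
    return w

# Toy usage: random +/-1 patterns labeled by a random teacher vector
rng = np.random.default_rng(1)
X = rng.choice([-1, 1], size=(200, 401))
y = np.sign(X @ rng.choice([-1, 1], size=401))
w = train_discrete_perceptron(X, y, bits=2)
print("training accuracy:", np.mean(np.sign(X @ w) == y))

Varying the bits argument in this sketch enlarges or shrinks the set of allowed weight levels, giving a crude way to probe how much precision such a toy task actually needs.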

Related research

09/18/2015
Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses
We show that discrete synaptic weights can be efficiently used for learn...

10/26/2017
On the role of synaptic stochasticity in training low-precision neural networks
Stochasticity and limited precision of synaptic weights in neural networ...

08/02/2018
Memristor-based Synaptic Sampling Machines
Synaptic Sampling Machine (SSM) is a type of neural network model that c...

07/01/2014
Supervised learning in Spiking Neural Networks with Limited Precision: SNN/LP
A new supervised learning algorithm, SNN/LP, is proposed for Spiking Neu...

05/29/2014
Experimental Demonstration of Array-level Learning with Phase Change Synaptic Devices
The computational performance of the biological brain has long attracted...

06/19/2014
Brain-like associative learning using a nanoscale non-volatile phase change synaptic device array
Recent advances in neuroscience together with nanoscale electronic devic...

09/20/2016
The Digital Synaptic Neural Substrate: Size and Quality Matters
We investigate the 'Digital Synaptic Neural Substrate' (DSNS) computatio...
