Computing threshold functions using dendrites

11/10/2016
by Romain Cazé, et al.

Neurons, modeled as linear threshold units (LTUs), can in theory compute all threshold functions. In practice, however, some of these functions require synaptic weights of arbitrarily large precision. We show here that dendrites can alleviate this requirement. We introduce the non-linear threshold unit (nLTU), which integrates synaptic input sub-linearly within distinct subunits to account for local saturation in dendrites. We systematically search the parameter spaces of the nLTU and the LTU to compare them. Firstly, this shows that the nLTU can compute all threshold functions with lower-precision weights than the LTU. Secondly, we show that an nLTU can compute significantly more functions than an LTU when an input can make only a single synapse. This work paves the way for a new generation of networks made of nLTUs with binary synapses.
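The sketch below illustrates the two unit types described in the abstract, assuming a simple saturating (min) non-linearity per dendritic subunit; the function names, parameters, and the specific saturation rule are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def ltu(x, w, theta):
    """Linear threshold unit: fire if the weighted sum of all
    synaptic inputs reaches the somatic threshold theta."""
    return int(np.dot(w, x) >= theta)

def nltu(x, w, subunits, caps, theta):
    """Non-linear threshold unit (sketch): each dendritic subunit sums
    its own synaptic inputs and saturates at a local cap (sub-linear
    integration); the soma then thresholds the sum of subunit outputs."""
    total = 0.0
    for idx, cap in zip(subunits, caps):
        local = np.dot(w[idx], x[idx])   # local dendritic sum
        total += min(local, cap)         # saturation models sub-linearity
    return int(total >= theta)

# Hypothetical example: two binary inputs, binary weights, two subunits
x = np.array([1, 1])
w = np.array([1.0, 1.0])
print(ltu(x, w, theta=2.0))                              # fires only if both inputs are active
print(nltu(x, w, subunits=[[0], [1]], caps=[1.0, 1.0], theta=2.0))
```

With one synapse per subunit the saturation has no effect, but grouping several synapses into a saturating subunit changes which input patterns can reach threshold, which is what allows the nLTU to cover threshold functions with coarser weights.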
