Theory of spike timing based neural classifiers

10/26/2010
by Ran Rubin et al.

We study the computational capacity of a model neuron, the Tempotron, which classifies sequences of spikes by linear-threshold operations. We use statistical mechanics and extreme value theory to derive the capacity of the system in random classification tasks. In contrast to that of its static analog, the Perceptron, the Tempotron's solution space consists of a large number of small clusters of weight vectors. The capacity of the system per synapse is finite in the large size limit and weakly diverges with the stimulus duration relative to the membrane and synaptic time constants.
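The linear-threshold operation can be made concrete with a short sketch: the membrane potential is a weighted sum of postsynaptic potential kernels triggered by incoming spikes, and a spike pattern is classified as positive if the potential crosses the firing threshold at any time during the stimulus. The sketch below is illustrative only; the time constants, the threshold, and the helper names (psp_kernel, tempotron_classify) are assumptions for this example, not code from the paper.

import numpy as np

def psp_kernel(t, tau_m=15.0, tau_s=3.75):
    """Postsynaptic potential kernel: difference of exponentials, peak normalized to 1."""
    t = np.asarray(t, dtype=float)
    k = np.where(t >= 0.0, np.exp(-t / tau_m) - np.exp(-t / tau_s), 0.0)
    t_peak = tau_m * tau_s / (tau_m - tau_s) * np.log(tau_m / tau_s)
    return k / (np.exp(-t_peak / tau_m) - np.exp(-t_peak / tau_s))

def tempotron_classify(spike_times, weights, threshold=1.0, T=500.0, dt=1.0):
    """Return True if the peak membrane potential crosses the threshold.

    spike_times: one array of presynaptic spike times per afferent.
    weights:     one synaptic efficacy per afferent.
    """
    t_grid = np.arange(0.0, T, dt)
    v = np.zeros_like(t_grid)
    for w, times in zip(weights, spike_times):
        for t_i in times:
            v += w * psp_kernel(t_grid - t_i)
    return v.max() >= threshold

Note that the decision depends only on the peak of the voltage trace over the stimulus duration, which is why extreme value theory enters the capacity calculation described in the abstract.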

Related research

Spike Timing Dependent Competitive Learning in Recurrent Self Organizing Pulsed Neural Networks Case Study: Phoneme and Word Recognition (09/24/2012)
Synaptic plasticity seems to be a capital aspect of the dynamics of neur...

Synaptic Learning with Augmented Spikes (05/11/2020)
Traditional neuron models use analog values for information representati...

Optimal Learning with Excitatory and Inhibitory synapses (05/25/2020)
Characterizing the relation between weight structure and input/output st...

Capacity of the covariance perceptron (12/02/2019)
The classical perceptron is a simple neural network that performs a bina...

On the information in spike timing: neural codes derived from polychronous groups (03/09/2018)
There is growing evidence regarding the importance of spike timing in ne...

A theory of capacity and sparse neural encoding (02/19/2021)
Motivated by biological considerations, we study sparse neural maps from...
