
Theory of spike timing based neural classifiers

10/26/2010
by Ran Rubin, et al.

We study the computational capacity of a model neuron, the Tempotron, which classifies sequences of spikes by linear-threshold operations. We use statistical mechanics and extreme value theory to derive the capacity of the system in random classification tasks. In contrast to its static analog, the Perceptron, the Tempotron's solution space consists of a large number of small clusters of weight vectors. The capacity of the system per synapse is finite in the large-size limit and diverges weakly with the stimulus duration relative to the membrane and synaptic time constants.
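The linear-threshold operation described in the abstract can be sketched as follows: each input spike contributes a weighted postsynaptic potential (PSP) to the membrane trace, and the neuron classifies the pattern by whether the trace ever crosses threshold. This is a minimal illustration assuming the standard difference-of-exponentials PSP kernel; the time constants, threshold, and function names are illustrative choices, not values from the paper.

```python
import numpy as np

def psp_kernel(t, tau_m=15.0, tau_s=3.75):
    """Difference-of-exponentials PSP kernel, normalized to peak at 1."""
    t = np.asarray(t, dtype=float)
    # Time of the kernel's maximum, from dK/dt = 0.
    t_peak = tau_m * tau_s / (tau_m - tau_s) * np.log(tau_m / tau_s)
    v0 = 1.0 / (np.exp(-t_peak / tau_m) - np.exp(-t_peak / tau_s))
    k = v0 * (np.exp(-t / tau_m) - np.exp(-t / tau_s))
    return np.where(t >= 0, k, 0.0)  # causal: no effect before the spike

def tempotron_classify(weights, spike_times, T=500.0, dt=1.0, theta=1.0):
    """Return True iff the summed membrane trace crosses threshold theta.

    weights:     (N,) synaptic efficacies
    spike_times: list of N arrays, input spike times per afferent (ms)
    """
    ts = np.arange(0.0, T, dt)
    v = np.zeros_like(ts)
    for w, spikes in zip(weights, spike_times):
        for s in spikes:
            v += w * psp_kernel(ts - s)
    return bool(np.max(v) >= theta)

# A single strong synapse drives the trace above threshold; a weak one does not.
print(tempotron_classify(np.array([1.5]), [np.array([100.0])]))  # True
print(tempotron_classify(np.array([0.5]), [np.array([100.0])]))  # False
```

Because the decision depends only on whether the maximum of the trace exceeds threshold, the classifier is sensitive to the relative timing of input spikes, not just their counts, which is what distinguishes it from the static Perceptron.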


Related research

04/08/2012
Efficient Design of Triplet Based Spike-Timing Dependent Plasticity
Spike-Timing Dependent Plasticity (STDP) is believed to play an importan...

05/11/2020
Synaptic Learning with Augmented Spikes
Traditional neuron models use analog values for information representati...

05/25/2020
Optimal Learning with Excitatory and Inhibitory synapses
Characterizing the relation between weight structure and input/output st...

12/02/2019
Capacity of the covariance perceptron
The classical perceptron is a simple neural network that performs a bina...

03/09/2018
On the information in spike timing: neural codes derived from polychronous groups
There is growing evidence regarding the importance of spike timing in ne...

02/19/2021
A theory of capacity and sparse neural encoding
Motivated by biological considerations, we study sparse neural maps from...