Capacity of the covariance perceptron

12/02/2019
by David Dahmen, et al.

The classical perceptron is a simple neural network that performs binary classification by applying a linear mapping between static inputs and outputs, followed by a threshold. For small inputs, neural networks in a stationary state also perform an effectively linear input-output transformation, but of an entire time series. If the temporal mean of the time series is chosen as the feature for classification, the linear transformation of the network followed by thresholding is equivalent to the classical perceptron. Here we show that choosing covariances of time series as the feature for classification instead maps the neural network to what we call a 'covariance perceptron': a bilinear mapping between covariances. By extending Gardner's theory of connections to this bilinear problem, using a replica-symmetric mean-field theory, we compute the pattern and information capacities of the covariance perceptron in the infinite-size limit. Closed-form expressions reveal a superior pattern capacity in the binary classification task compared to the classical perceptron for high-dimensional inputs and low-dimensional outputs. For less convergent networks, the mean perceptron classifies a larger number of stimuli. However, since covariances span a much larger input and output space than means, the amount of information stored in the covariance perceptron exceeds that of its classical counterpart; for strongly convergent connectivity, it is superior by a factor equal to the number of input neurons. The theoretical calculations are validated numerically for finite-size systems using gradient-based optimization of a soft margin, as well as numerical solvers for the NP-hard quadratically constrained quadratic programming (QCQP) problem to which training can be mapped.
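For concreteness, the bilinear map at the heart of the covariance perceptron takes an input covariance matrix P to an output covariance matrix Q = W P W^T, and classification thresholds entries of Q. The sketch below is a minimal illustration of this mapping together with a hinge-style soft-margin gradient update; the sizes (n_in, n_out), the margin kappa, the learning rate, and the randomly generated covariance patterns are illustrative assumptions, not the paper's actual setup or solver.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 50, 2        # input/output dimensions (hypothetical sizes)
n_patterns = 30            # number of input covariance patterns

def random_cov(n):
    """Random symmetric positive semi-definite covariance pattern."""
    a = rng.standard_normal((n, n)) / np.sqrt(n)
    return a @ a.T

# Hypothetical dataset: input covariances P_k with binary labels assigned
# to the single output cross-covariance entry Q_k[0, 1].
P = np.stack([random_cov(n_in) for _ in range(n_patterns)])
labels = rng.choice([-1.0, 1.0], size=n_patterns)

W = rng.standard_normal((n_out, n_in)) / np.sqrt(n_in)

def margins(W):
    # Bilinear map: Q_k = W P_k W^T; the classification feature here is
    # the off-diagonal entry Q_k[0, 1] = W[0] @ P_k @ W[1].
    Q01 = np.einsum('i,kij,j->k', W[0], P, W[1])
    return labels * Q01

# Gradient ascent on a soft-margin (hinge-like) objective, a stand-in for
# the gradient-based soft-margin training described in the abstract.
kappa, lr = 0.1, 0.5
for step in range(2000):
    m = margins(W)
    active = m < kappa                 # patterns violating the margin
    if not active.any():
        break
    # d(Q01_k)/dW[0] = P_k @ W[1],  d(Q01_k)/dW[1] = P_k @ W[0]
    g0 = np.einsum('k,kij,j->i', labels * active, P, W[1])
    g1 = np.einsum('k,kij,j->i', labels * active, P, W[0])
    W[0] += lr * g0 / n_patterns
    W[1] += lr * g1 / n_patterns
    W /= np.linalg.norm(W, axis=1, keepdims=True)  # fixed-norm rows

print(f"min margin: {margins(W).min():.3f} (target >= {kappa})")
```

Renormalizing the rows of W after each step keeps the weights on a sphere of fixed norm, which mirrors the constraint under which Gardner-style capacity calculations are carried out; the quadratic dependence of Q on W is what turns training into a quadratically constrained quadratic program.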



research
11/11/2020

Linear Dilation-Erosion Perceptron for Binary Classification

In this work, we briefly revise the reduced dilation-erosion perceptron ...
research
12/14/2020

Perceptron Theory for Predicting the Accuracy of Neural Networks

Many neural network models have been successful at classification proble...
research
06/19/2020

An analytic theory of shallow networks dynamics for hinge loss classification

Neural networks have been shown to perform incredibly well in classifica...
research
10/26/2010

Theory of spike timing based neural classifiers

We study the computational capacity of a model neuron, the Tempotron, wh...
research
09/14/2016

Very Simple Classifier: a Concept Binary Classifier to Investigate Features Based on Subsampling and Locality

We propose Very Simple Classifier (VSC) a novel method designed to incor...
research
03/09/2018

On Optimal Polyline Simplification using the Hausdorff and Fréchet Distance

We revisit the classical polygonal line simplification problem and study...
