A theory of capacity and sparse neural encoding

02/19/2021
by Pierre Baldi, et al.

Motivated by biological considerations, we study sparse neural maps from an input layer to a target layer with sparse activity, and specifically the problem of storing K input-target associations (x,y), or memories, when the target vectors y are sparse. We mathematically prove that the maximal number K of storable associations undergoes a phase transition and that in general, and somewhat paradoxically, sparsity in the target layer increases the storage capacity of the map. The target vectors can be chosen arbitrarily, including at random, and the memories can be both encoded and decoded by networks trained using local learning rules, including the simple Hebb rule. These results are robust under a variety of statistical assumptions on the data. The proofs rely on elegant properties of random polytopes and sub-Gaussian random vectors. Open problems and connections to capacity theories and polynomial threshold maps are discussed.
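The abstract mentions that the memories can be encoded and decoded by a network trained with the simple Hebb rule. The following is a minimal sketch of that idea under illustrative assumptions (the dimensions, sparsity level, and top-k decoding step are choices made here, not details taken from the paper):

```python
# Hebbian storage of K input-target associations (x, y) with sparse targets y.
# All sizes and the decoding rule below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, m, K = 200, 200, 50      # input size, target size, number of memories
active = 10                  # number of active (1) units per sparse target y

X = rng.choice([-1.0, 1.0], size=(K, n))           # random +/-1 input patterns
Y = np.zeros((K, m))
for k in range(K):                                  # each target has `active` ones
    Y[k, rng.choice(m, size=active, replace=False)] = 1.0

# Simple Hebb rule: each weight accumulates input-target co-activations.
W = Y.T @ X / n                                     # weight matrix, shape (m, n)

def recall(x, k_active=active):
    """Project a stored input and keep the k_active largest responses."""
    s = W @ x
    y_hat = np.zeros(m)
    y_hat[np.argsort(s)[-k_active:]] = 1.0
    return y_hat

correct = sum(np.array_equal(recall(X[k]), Y[k]) for k in range(K))
print(f"exactly recovered targets: {correct}/{K}")
```

With these settings the sketch illustrates the qualitative claim: keeping the target vectors sparse makes more associations recoverable by a one-shot Hebbian map than dense targets would allow, though the precise capacity thresholds are established analytically in the paper.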


