Memory capacity of neural networks with threshold and ReLU activations

01/20/2020
by Roman Vershynin

Overwhelming theoretical and empirical evidence shows that mildly overparametrized neural networks – those with more connections than the size of the training data – are often able to memorize the training data with 100% accuracy. This was rigorously proved for networks with sigmoid activation functions and, very recently, for ReLU activations. Addressing a 1988 open question of Baum, we prove that this phenomenon holds for general multilayered perceptrons, i.e. neural networks with threshold activation functions, or with any mix of threshold and ReLU activations. Our construction is probabilistic and exploits sparsity.
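For intuition only, here is the classical one-unit-per-point argument in the spirit of Baum's construction (not the paper's sparse probabilistic construction, which uses far fewer neurons): a one-hidden-layer network of threshold units can interpolate arbitrary ±1 labels on n generic points by projecting the data onto a random direction, placing one threshold between consecutive projected values, and telescoping the output weights. A minimal NumPy sketch, with all variable names chosen here purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# n training points in d dimensions with arbitrary +/-1 labels
n, d = 50, 10
X = rng.standard_normal((n, d))
y = rng.choice([-1.0, 1.0], size=n)

# Generic projection direction: with probability 1 the projected
# values of the n points are pairwise distinct.
w = rng.standard_normal(d)
proj = X @ w
order = np.argsort(proj)
X, y, proj = X[order], y[order], proj[order]

# Hidden threshold units: unit i fires iff <w, x> >= b[i].  Placing b[i]
# between consecutive projected values makes exactly units 0..j fire on
# the j-th (sorted) training point.
b = np.empty(n)
b[0] = proj[0] - 1.0                    # first unit fires on every point
b[1:] = (proj[:-1] + proj[1:]) / 2.0    # midpoints separate the points

# Telescoping output weights: the partial sum c[0] + ... + c[j] equals y[j].
c = np.empty(n)
c[0] = y[0]
c[1:] = y[1:] - y[:-1]

def predict(X):
    h = ((X @ w)[:, None] >= b[None, :]).astype(float)  # threshold activations
    return np.sign(h @ c)

print("training accuracy:", np.mean(predict(X) == y))   # prints 1.0
```

This uses n hidden units to memorize n points; the result discussed in the abstract shows that far sparser threshold (or mixed threshold/ReLU) networks suffice.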
