An Exponential Improvement on the Memorization Capacity of Deep Threshold Networks

06/14/2021
by Shashank Rajput, et al.

It is well known that modern deep neural networks are powerful enough to memorize datasets even when the labels have been randomized. Recently, Vershynin (2020) settled a long-standing question by Baum (1988), proving that deep threshold networks can memorize n points in d dimensions using 𝒪(e^{1/δ²} + √n) neurons and 𝒪(e^{1/δ²}(d + √n) + n) weights, where δ is the minimum distance between the points. In this work, we improve the dependence on δ from exponential to almost linear, proving that 𝒪(1/δ + √n) neurons and 𝒪(d/δ + n) weights are sufficient. Our construction uses Gaussian random weights only in the first layer, while all the subsequent layers use binary or integer weights. We also prove new lower bounds by connecting memorization in neural networks to the purely geometric problem of separating n points on a sphere using hyperplanes.
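
To make the role of δ concrete, the following is a minimal NumPy sketch of the naive memorization baseline that the paper improves on: one threshold neuron per data point, exploiting the fact that unit-norm points with minimum pairwise distance δ have cross inner products of at most 1 − δ²/2. The random data, the data-dependent first-layer weights (rather than the Gaussian random weights of the paper's construction), and the widths below are all illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 50, 10
X = rng.standard_normal((n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # project the points onto the unit sphere
y = rng.integers(0, 2, size=n)                 # arbitrary binary labels to memorize

# For unit vectors, ||x_i - x_j||^2 = 2 - 2<x_i, x_j>, so minimum pairwise
# distance delta means every cross inner product is at most 1 - delta^2/2.
gram = X @ X.T
dists = np.sqrt(np.maximum(2.0 - 2.0 * gram, 0.0))
delta = dists[~np.eye(n, dtype=bool)].min()

def step(z):
    return (z >= 0).astype(int)  # threshold activation 1[z >= 0]

def forward(x):
    # Hidden layer: the threshold 1 - delta^2/4 sits strictly between
    # <x_i, x_i> = 1 and every cross inner product, so among the data
    # points neuron i fires only on x_i.
    h = step(X @ x - (1.0 - delta**2 / 4.0))
    # Output layer: binary weights y with threshold 1/2; it fires iff
    # the matched point carries label 1.
    return step(np.dot(y, h) - 0.5)

preds = np.array([forward(x) for x in X])
assert (preds == y).all(), "all n labels should be memorized"
print(f"memorized {n} points with {n} hidden threshold neurons (delta = {delta:.3f})")
```

Note that the second layer already follows the paper's pattern of binary weights after the first layer; the hard part, which this sketch does not attempt, is compressing the hidden layer from n neurons down to 𝒪(1/δ + √n).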

Related research

06/04/2020 · Network size and weights size for memorization with two-layers neural networks
In 1988, Eric B. Baum showed that two-layers neural networks with thresh...

01/02/2019 · The capacity of feedforward neural networks
A long standing open problem in the theory of neural networks is the dev...

06/18/2022 · Coin Flipping Neural Networks
We show that neural networks with access to randomness can outperform de...

11/25/2022 · Bypass Exponential Time Preprocessing: Fast Neural Network Training via Weight-Data Correlation Preprocessing
Over the last decade, deep neural networks have transformed our society,...

05/20/2022 · Memorization and Optimization in Deep Neural Networks with Minimum Over-parameterization
The Neural Tangent Kernel (NTK) has emerged as a powerful tool to provid...

07/17/2019 · On the geometry of solutions and on the capacity of multi-layer neural networks with ReLU activations
Rectified Linear Units (ReLU) have become the main model for the neural ...

10/18/2021 · Finding Everything within Random Binary Networks
A recent work by Ramanujan et al. (2020) provides significant empirical ...
