
Randomly Initialized One-Layer Neural Networks Make Data Linearly Separable

by Promit Ghosal et al.

Recently, neural networks have been shown to perform exceptionally well at transforming two arbitrary sets into two linearly separable sets. Doing this with a randomly initialized network is of particular interest because the associated computation is much cheaper than with fully trained networks. In this paper, we show that, with sufficient width, a randomly initialized one-layer neural network transforms two sets into two linearly separable sets with high probability. Furthermore, we provide explicit bounds on the width of the network required for this to occur. Our first bound is exponential in the input dimension and polynomial in all other parameters, while our second bound is independent of the input dimension, thereby overcoming the curse of dimensionality. We also present an experimental study comparing the separation capacity of randomly initialized one-layer and two-layer neural networks. With appropriately chosen biases, the two-layer network outperforms the one-layer network on low-dimensional data; the opposite is observed for higher-dimensional data.
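The phenomenon the abstract describes can be illustrated with a small numerical sketch (not the paper's construction; the width, radii, and bias range below are hypothetical choices for the demo): two concentric circles are not linearly separable in the plane, but after passing them through a randomly initialized one-layer ReLU network of sufficient width, a linear classifier separates them exactly.

```python
# Hedged illustration of the separation claim: a randomly initialized
# one-layer ReLU network maps two concentric circles (not linearly
# separable in R^2) into a feature space where a hyperplane separates
# them. Width, radii, bias range, and seed are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

def circle(n, radius):
    """Sample n points uniformly on a circle of the given radius."""
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    return np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])

# Two sets that no hyperplane in the plane can separate.
X = np.vstack([circle(50, 1.0), circle(50, 3.0)])
y = np.concatenate([-np.ones(50), np.ones(50)])

def linear_fit_accuracy(F, y):
    """Fit a linear classifier (with bias) by least squares.
    Training accuracy 1.0 certifies that the two sets are linearly
    separable in the feature space F."""
    A = np.column_stack([F, np.ones(len(F))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.mean(np.sign(A @ w) == y))

# On the raw inputs, a linear classifier cannot separate the circles.
acc_raw = linear_fit_accuracy(X, y)

# Random one-layer network: x -> ReLU(W^T x + b), width 1000. Since the
# width exceeds the number of samples, the random feature matrix is
# generically full row rank, so least squares interpolates the labels.
width = 1000
W = rng.normal(size=(2, width))
b = rng.uniform(-3.0, 3.0, size=width)
features = np.maximum(X @ W + b, 0.0)
acc_features = linear_fit_accuracy(features, y)

print(acc_raw, acc_features)
```

Here the width (1000) exceeds the sample count (100), which makes separation generic rather than reflecting the paper's sharper bounds; the paper's contribution is quantifying how large the width must be, including a bound independent of the input dimension.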




The Separation Capacity of Random Neural Networks

Neural networks with random weights appear in a variety of machine learn...

Adversarial Noises Are Linearly Separable for (Nearly) Random Neural Networks

Adversarial examples, which are usually generated for specific inputs wi...

Feature Space Saturation during Training

We propose layer saturation - a simple, online-computable method for ana...

Kernel similarity matching with Hebbian neural networks

Recent works have derived neural networks with online correlation-based ...

How and what to learn: The modes of machine learning

We propose a new approach, namely the weight pathway analysis (WPA), to...

Shallow Neural Network can Perfectly Classify an Object following Separable Probability Distribution

Guiding the design of neural networks is of great importance to save eno...

Linear and Fisher Separability of Random Points in the d-dimensional Spherical Layer

Stochastic separation theorems play an important role in high-dimensional d...