Enhanced Expressive Power and Fast Training of Neural Networks by Random Projections

11/22/2018
by   Jian-Feng Cai, et al.

Random projections can perform dimension reduction efficiently for datasets with nonlinear low-dimensional structure. A well-known example is that random matrices embed sparse vectors into a low-dimensional subspace nearly isometrically, a fact known as the restricted isometry property in compressed sensing. In this paper, we explore applications of random projections in deep neural networks. We characterize the expressive power of fully connected neural networks when the input data are sparse vectors or lie on a low-dimensional smooth manifold. We prove that the number of neurons required to approximate a Lipschitz function to a prescribed precision depends on the sparsity or the dimension of the manifold, and only weakly on the dimension of the input vector. The key to our proof is that random projections stably embed the set of sparse vectors, or a low-dimensional smooth manifold, into a low-dimensional subspace. Based on this fact, we also propose new neural network models in which, at each layer, the input is first projected onto a low-dimensional subspace by a random projection, and then the standard linear connection and nonlinear activation are applied. In this way, the number of parameters in the network is significantly reduced, so training can be accelerated without much performance loss.
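The layer structure described above can be illustrated with a minimal NumPy sketch. This is an illustrative reconstruction, not the authors' exact architecture: the function name, the Gaussian projection scaling, the choice of ReLU, and all dimensions (`d_in = 1024`, `k = 64`, `d_out = 256`) are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projection_layer(x, d_out, k, rng):
    """Illustrative layer: fixed random projection to k dims,
    followed by a trainable linear map and a ReLU activation."""
    d_in = x.shape[-1]
    # Fixed (untrained) Gaussian projection matrix; the 1/sqrt(k)
    # scaling makes it a near-isometry on low-dimensional sets.
    P = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d_in, k))
    # Trainable weights act on the k-dim projection: only k * d_out
    # parameters instead of d_in * d_out for a standard dense layer.
    W = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d_out))
    b = np.zeros(d_out)
    return np.maximum(x @ P @ W + b, 0.0)  # ReLU

x = rng.normal(size=(32, 1024))            # batch of 1024-dim inputs
y = random_projection_layer(x, d_out=256, k=64, rng=rng)
print(y.shape)                             # (32, 256)
```

With these example dimensions, the trainable parameter count drops from 1024 × 256 ≈ 262k to 64 × 256 ≈ 16k, which is the source of the training speed-up the abstract describes.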


