How Can We Be So Dense? The Benefits of Using Highly Sparse Representations

03/27/2019
by Subutai Ahmad et al.

Most artificial networks today rely on dense representations, whereas biological networks rely on sparse representations. In this paper we show how sparse representations can be more robust to noise and interference, as long as the underlying dimensionality is sufficiently high. A key intuition we develop is that the ratio of the operable volume around a sparse vector to the volume of the representational space decreases exponentially with dimensionality. We then analyze computationally efficient sparse networks containing both sparse weights and activations. Simulations on MNIST and the Google Speech Commands dataset show that such networks demonstrate significantly improved robustness and stability compared to dense networks, while maintaining competitive accuracy. We discuss the potential benefits of sparsity for accuracy, noise robustness, hyperparameter tuning, learning speed, computational efficiency, and power requirements.
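To make the volume intuition concrete, here is a small numeric sketch. It assumes binary representations with a active components out of n, and counts the fraction of a-sparse vectors that overlap a fixed a-sparse vector in at least theta positions; the choice of binary vectors and the overlap threshold theta are illustrative assumptions, not details given in the abstract above.

    from math import comb

    def match_ratio(n, a, theta):
        """Fraction of a-sparse binary vectors (a active bits out of n)
        that overlap a fixed a-sparse vector in at least theta positions."""
        # Choose b of the a active positions to overlap, and the
        # remaining a - b active positions from the n - a inactive ones.
        matching = sum(comb(a, b) * comb(n - a, a - b)
                       for b in range(theta, a + 1))
        return matching / comb(n, a)

    # Holding the number of active bits fixed, the fraction of
    # potentially interfering vectors collapses as n grows.
    for n in (64, 128, 256, 512, 1024):
        print(f"n={n:5d}  match ratio={match_ratio(n, a=16, theta=8):.3e}")

With a = 16 and theta = 8 held fixed, the printed ratio shrinks by orders of magnitude each time n doubles, which is the exponential falloff with dimensionality that the abstract describes.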
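The phrase "sparse weights and activations" can be read as a linear layer whose weight matrix has a fixed fraction of nonzero entries, followed by a k-winners-take-all nonlinearity. The sketch below is one minimal NumPy realization of that reading; the random masking scheme, the value of k, and the initialization are assumptions for illustration, not necessarily the paper's exact architecture.

    import numpy as np

    rng = np.random.default_rng(0)

    class SparseLayer:
        """Linear layer with sparse weights and a k-winners activation.

        A fixed random fraction of the weights is zeroed once and stays
        zero, and only the k largest pre-activations per sample survive.
        The mask, k, and initialization are illustrative assumptions.
        """

        def __init__(self, n_in, n_out, weight_density=0.5, k=16):
            self.w = rng.standard_normal((n_in, n_out)) * np.sqrt(2.0 / n_in)
            # Sparse weights: keep roughly weight_density of the entries.
            self.mask = rng.random((n_in, n_out)) < weight_density
            self.w *= self.mask
            self.k = k

        def forward(self, x):
            z = x @ self.w
            # k-winners-take-all: zero everything below the k-th largest
            # pre-activation in each sample (sparse activations).
            kth = np.sort(z, axis=-1)[..., -self.k][..., None]
            return np.where(z >= kth, z, 0.0)

    layer = SparseLayer(n_in=256, n_out=128, weight_density=0.5, k=16)
    y = layer.forward(rng.standard_normal((4, 256)))
    print((y != 0.0).sum(axis=-1))  # roughly k active units per sample

Keeping the zero mask fixed means the weight sparsity would survive any subsequent training, and the k-winners step caps the number of active units per sample at roughly k (ties can admit a few more) regardless of the input.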

Related research

01/12/2015 · Combined modeling of sparse and dense noise for improvement of Relevance Vector Machine
Using a Bayesian approach, we consider the problem of recovering sparse ...

12/09/2021 · Densifying Sparse Representations for Passage Retrieval by Representational Slicing
Learned sparse and dense representations capture different successful ap...

12/27/2021 · Two Sparsities Are Better Than One: Unlocking the Performance Benefits of Sparse-Sparse Networks
In principle, sparse neural networks should be significantly more effici...

07/10/2019 · Sparse Networks from Scratch: Faster Training without Losing Performance
We demonstrate the possibility of what we call sparse learning: accelera...

11/07/2019 · Transformation of Dense and Sparse Text Representations
Sparsity is regarded as a desirable property of representations, especia...

06/28/2021 · FreeTickets: Accurate, Robust and Efficient Deep Ensemble by Training with Dynamic Sparsity
Recent works on sparse neural networks have demonstrated that it is poss...

04/12/2020 · Minimizing FLOPs to Learn Efficient Sparse Representations
Deep representation learning has become one of the most widely adopted a...
