Deep Rewiring: Training very sparse deep networks

11/14/2017
by Guillaume Bellec, et al.

Neuromorphic hardware tends to pose limits on the connectivity of the deep networks that one can run on it. But generic hardware and software implementations of deep learning also run more efficiently on sparse networks. Several methods exist for pruning the connections of a neural network after it has been trained without connectivity constraints. We present an algorithm, DEEP R, that enables us to train a sparsely connected neural network directly. DEEP R automatically rewires the network during supervised training so that connections are placed where they are most needed for the task, while their total number remains strictly bounded at all times. We demonstrate that DEEP R can train very sparse feedforward and recurrent neural networks on standard benchmark tasks with only a minor loss in performance. DEEP R rests on a rigorous theoretical foundation that views rewiring as stochastic sampling of network configurations from a posterior.
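The core idea of DEEP R, as the abstract describes it, is that training and rewiring happen together: active connections follow a noisy gradient update, connections whose parameter crosses zero are set dormant, and an equal number of dormant connections are reactivated, so the total connection count never changes. The sketch below is a simplified, hypothetical NumPy illustration of that loop (the function name, the flat parameter layout, and the specific hyperparameter values are assumptions for illustration, not the authors' implementation):

```python
import numpy as np

def deep_r_step(theta, active, grad, eta=0.05, alpha=1e-4, temperature=1e-5, rng=None):
    """One illustrative DEEP R-style update on a flat vector of connection parameters.

    theta  : parameters (magnitudes) of all potential connections
    active : boolean mask over connections; sum(active) is the fixed sparsity budget
    grad   : gradient of the loss w.r.t. the weights, same shape as theta
    """
    rng = rng or np.random.default_rng()
    # noisy gradient step with an L1-style shrinkage term, applied to active connections only
    noise = np.sqrt(2.0 * eta * temperature) * rng.standard_normal(theta.shape)
    theta = np.where(active, theta - eta * grad - eta * alpha + noise, theta)
    # connections whose parameter crossed zero become dormant
    died = active & (theta < 0)
    active = active & ~died
    # reactivate the same number of randomly chosen dormant connections,
    # keeping the total number of active connections strictly constant
    dormant = np.flatnonzero(~active)
    revived = rng.choice(dormant, size=int(died.sum()), replace=False)
    active[revived] = True
    theta[revived] = 0.0  # newly activated connections start at zero magnitude
    return theta, active
```

The noise term is what makes the rewiring a form of stochastic sampling rather than greedy pruning: over many steps the network explores different sparse connectivity patterns while the budget stays fixed.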

Related research

- 11/16/2016, Training Spiking Deep Networks for Neuromorphic Hardware: We describe a method to train spiking deep networks that can be run usin...
- 01/13/2022, Automatic Sparse Connectivity Learning for Neural Networks: Since sparse neural networks usually contain many zero weights, these un...
- 04/11/2019, Cramnet: Layer-wise Deep Neural Network Compression with Knowledge Transfer from a Teacher Network: Neural Networks accomplish amazing things, but they suffer from computat...
- 06/13/2018, The streaming rollout of deep networks - towards fully model-parallel execution: Deep neural networks, and in particular recurrent networks, are promisin...
- 08/07/2022, N2NSkip: Learning Highly Sparse Networks using Neuron-to-Neuron Skip Connections: The over-parametrized nature of Deep Neural Networks leads to considerab...
- 12/08/2014, Provable Methods for Training Neural Networks with Sparse Connectivity: We provide novel guaranteed approaches for training feedforward neural n...
- 07/30/2020, Growing Efficient Deep Networks by Structured Continuous Sparsification: We develop an approach to training deep networks while dynamically adjus...
