SparseProp: Efficient Sparse Backpropagation for Faster Training of Neural Networks

02/09/2023
by Mahdi Nikdan, et al.

We provide a new efficient version of the backpropagation algorithm, specialized to the case where the weights of the neural network being trained are sparse. Our algorithm is general, as it applies to arbitrary (unstructured) sparsity and common layer types (e.g., convolutional or linear). We provide a fast vectorized implementation on commodity CPUs, and show that it can yield speedups in end-to-end runtime experiments, both in transfer learning using already-sparsified networks and in training sparse networks from scratch. Thus, our results provide the first support for sparse training on commodity hardware.
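The payoff is easiest to see for a linear layer y = Wx: if the sparsity mask of W is kept fixed during a step, the backward pass needs the weight gradient only at W's nonzero positions, and the input gradient W^T (dL/dy) is itself a sparse-dense product, so the whole step costs on the order of nnz(W) multiply-adds rather than rows x cols. The following is a minimal NumPy/SciPy sketch of that idea, not the paper's vectorized CPU kernels; the function name, shapes, and CSR storage choice are ours for illustration.

import numpy as np
from scipy.sparse import csr_matrix

def sparse_linear_backward(W, x, grad_y):
    """Backward pass of y = W @ x for a sparse W stored in CSR.

    Hypothetical helper for illustration; not from the paper.
    """
    # Input gradient: W^T grad_y is again a sparse-dense product,
    # so it touches only the nonzero weights.
    grad_x = W.T @ grad_y
    # The weight gradient is needed only at W's nonzero positions
    # (zero weights stay zero under a fixed mask), so we compute
    # grad_W[i, j] = grad_y[i] * x[j] just for those (i, j) pairs.
    rows, cols = W.nonzero()
    grad_W = csr_matrix((grad_y[rows] * x[cols], (rows, cols)),
                        shape=W.shape)
    return grad_x, grad_W

# Example: a 256x512 layer at 95% unstructured sparsity.
rng = np.random.default_rng(0)
dense = rng.standard_normal((256, 512))
dense[rng.random(dense.shape) < 0.95] = 0.0
W = csr_matrix(dense)
x = rng.standard_normal(512)
grad_y = rng.standard_normal(256)  # upstream gradient dL/dy
grad_x, grad_W = sparse_linear_backward(W, x, grad_y)

At 95% sparsity this sketch touches roughly 5% of the multiply-adds of a dense backward pass, which is the kind of saving a specialized vectorized kernel can turn into wall-clock speedup.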

Related research

A biologically plausible neural network for local supervision in cortical microcircuits (11/30/2020)
The backpropagation algorithm is an invaluable tool for training artific...

Fully-Trainable Deep Matching (09/12/2016)
Deep Matching (DM) is a popular high-quality method for quasi-dense imag...

Backprojection for Training Feedforward Neural Networks in the Input and Feature Spaces (04/05/2020)
After the tremendous development of neural networks trained by backpropa...

TinyProp – Adaptive Sparse Backpropagation for Efficient TinyML On-device Learning (08/17/2023)
Training deep neural networks using backpropagation is very memory and c...

A Bregman Learning Framework for Sparse Neural Networks (05/10/2021)
We propose a learning framework based on stochastic Bregman iterations t...

Truly Sparse Neural Networks at Scale (02/02/2021)
Recently, sparse training methods have started to be established as a de...

Training Neural Networks Using Features Replay (07/12/2018)
Training a neural network using backpropagation algorithm requires passi...
