Efficient forward propagation of time-sequences in convolutional neural networks using Deep Shifting

03/11/2016
by Koen Groenland, et al.

When a Convolutional Neural Network is used for on-the-fly evaluation of a continuously updating time-sequence, many redundant convolution operations are performed. We propose the method of Deep Shifting, which remembers previously calculated results of convolution operations in order to minimize the number of calculations. The reduction in complexity is at least a constant factor and, in the best case, quadratic. We demonstrate that this method saves significant computation time in a practical implementation, especially when the network receives a large number of time-frames.
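
The caching idea lends itself to a short sketch. The Python fragment below is an illustrative assumption of how remembering per-frame convolution results could look for a single 1-D layer; the names (StreamingConv1D, push_frame) and shapes are hypothetical and not taken from the paper.

    import numpy as np

    class StreamingConv1D:
        """Caches per-frame outputs so each new frame costs one dot product."""

        def __init__(self, kernel):
            self.kernel = np.asarray(kernel, dtype=float)  # 1-D filter of length k
            self.k = self.kernel.size
            self.history = []   # last k input frames: enough to form one window
            self.outputs = []   # remembered results, reusable by deeper layers

        def push_frame(self, x):
            """Accept one new scalar frame; return the newest output, if any."""
            self.history.append(float(x))
            if len(self.history) > self.k:
                self.history.pop(0)      # only the last k frames are ever needed
            if len(self.history) < self.k:
                return None              # not enough history for a full window yet
            # One new window instead of re-convolving the whole sequence:
            y = float(np.dot(self.history, self.kernel))
            self.outputs.append(y)
            return y

    # Usage: each incoming frame costs O(k), independent of sequence length.
    layer = StreamingConv1D(kernel=[0.25, 0.5, 0.25])
    for frame in [1.0, 2.0, 3.0, 4.0]:
        out = layer.push_frame(frame)

Roughly speaking, naive re-evaluation of a length-T sequence recomputes every window of every layer for each new frame, whereas a cache like this leaves only the windows that include the new frame, which is where the savings described in the abstract come from; the exact complexity analysis is given in the paper itself.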

