Very Efficient Training of Convolutional Neural Networks using Fast Fourier Transform and Overlap-and-Add

01/25/2016
by   Tyler Highlander, et al.

Convolutional neural networks (CNNs) are currently state-of-the-art for various classification tasks, but are computationally expensive. Propagating through the convolutional layers is very slow, as each kernel in each layer must sequentially calculate many dot products for a single forward and backward propagation, which equates to O(N^2 n^2) per kernel per layer, where the inputs are N × N arrays and the kernels are n × n arrays. Convolution can be performed efficiently as a Hadamard product in the frequency domain. The bottleneck is the transformation, which has a cost of O(N^2 log_2 N) using the fast Fourier transform (FFT). However, the increase in efficiency is less significant when N ≫ n, as is the case in CNNs. We mitigate this by using the "overlap-and-add" technique, reducing the computational complexity to O(N^2 log_2 n) per kernel. This method increases the algorithm's efficiency in both the forward and backward propagation, reducing the training and testing time for CNNs. Our empirical results show that our method reduces computational time by a factor of up to 16.3 over the traditional convolution implementation for an 8 × 8 kernel and a 224 × 224 image.
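The idea behind overlap-and-add can be illustrated with a minimal sketch: rather than transforming the full N × N input, the image is split into small blocks, each block is convolved with the kernel as a Hadamard (elementwise) product in the frequency domain, and the overlapping block outputs are summed. This is a generic NumPy illustration of the technique, not the authors' implementation; the function name and block size are made up for the example.

```python
import numpy as np

def fft_overlap_add_conv2d(image, kernel, block=32):
    """Full 2D linear convolution via overlap-and-add:
    split the image into blocks, convolve each block with the
    kernel as a Hadamard product in the frequency domain, and
    sum the overlapping block results."""
    N, M = image.shape
    n, m = kernel.shape
    # Each block convolution yields a (block + n - 1) x (block + m - 1) patch.
    fft_shape = (block + n - 1, block + m - 1)
    K = np.fft.rfft2(kernel, fft_shape)  # kernel transform, reused for every block
    out = np.zeros((N + n - 1, M + m - 1))
    for i in range(0, N, block):
        for j in range(0, M, block):
            patch = image[i:i + block, j:j + block]
            P = np.fft.rfft2(patch, fft_shape)
            # Hadamard product in the frequency domain = convolution in space.
            conv = np.fft.irfft2(P * K, fft_shape)
            h = patch.shape[0] + n - 1
            w = patch.shape[1] + m - 1
            out[i:i + h, j:j + w] += conv[:h, :w]  # overlap-and-add
    return out
```

Because each FFT now operates on arrays whose size is tied to the kernel (the block size is chosen on the order of n) rather than to the full image, the per-kernel cost drops from O(N^2 log_2 N) toward the O(N^2 log_2 n) figure cited in the abstract.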
