Winograd Convolution for Deep Neural Networks: Efficient Point Selection

01/25/2022
by Syed Asad Alam, et al.

Convolutional neural networks (CNNs) have dramatically improved the accuracy of tasks such as object recognition, image segmentation and interactive speech systems. CNNs require large amounts of computing resources because of their computationally intensive convolution layers. Fast convolution algorithms such as Winograd convolution can greatly reduce the computational cost of these layers, but at the cost of poor numeric properties: greater savings in computation come with exponentially increasing floating point error.

A defining feature of each Winograd convolution algorithm is the set of real-valued points at which polynomials are sampled. The choice of points affects the numeric accuracy of the algorithm, but the optimal set of points for small convolutions remains unknown. Existing work considers only small integers and simple fractions as candidate points. In this work, we propose a novel approach to point selection using points of the form {-1/c, -c, c, 1/c}, where c ranges over the full set of real values. We show that groups of this form cause cancellations in the Winograd transform matrices that reduce numeric error.

We find empirically that the error for different values of c forms a rough curve across the real numbers, which helps to localize the values of c that reduce error, and that lower errors can be achieved with non-obvious real-valued evaluation points rather than with integers or simple fractions. We study a range of sizes for small convolutions and achieve reductions in error ranging from 2% to around 59%. Furthermore, we identify patterns in cases where selecting a subset of our proposed points always leads to a lower error. Finally, we implement a complete Winograd convolution layer and use it to run deep convolutional neural networks on real datasets, showing that our proposed points reduce error by 22% to 63%.
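To make the role of the evaluation points concrete, the sketch below shows the standard Toom-Cook construction of the Winograd transform matrices (A^T, G, B^T) from a chosen point set, then sweeps a few values of c for a symmetric point group. This is a minimal illustration, not the authors' implementation: the helper names (vandermonde, winograd_matrices, winograd_conv) and the specific F(4,3) point set {0, c, -c, 1/c, -1/c, ∞} are our own assumptions, chosen in the spirit of the paper's proposed groups.

```python
import numpy as np

def vandermonde(points, cols, add_inf=False):
    """Rows are [1, p, p^2, ..., p^(cols-1)] for each finite point p.
    The optional point at infinity selects the leading coefficient of
    the sampled polynomial, so its row is [0, ..., 0, 1]."""
    V = np.array([[p ** j for j in range(cols)] for p in points],
                 dtype=np.float64)
    if add_inf:
        inf_row = np.zeros((1, cols))
        inf_row[0, -1] = 1.0
        V = np.vstack([V, inf_row])
    return V

def winograd_matrices(m, r, points, add_inf=True):
    """Toom-Cook construction of the F(m, r) transforms via the matrix
    exchange (transposition) principle:
        Y = A^T ((G g) * (B^T d))
    with A^T = V_{n x m}^T, G = V_{n x r}, B^T = (V_{n x n})^{-T}."""
    n = m + r - 1
    assert len(points) + int(add_inf) == n, "need m + r - 1 points in total"
    At = vandermonde(points, m, add_inf).T                  # m x n output transform
    G = vandermonde(points, r, add_inf)                     # n x r filter transform
    Bt = np.linalg.inv(vandermonde(points, n, add_inf)).T  # n x n data transform
    return At, G, Bt

def winograd_conv(g, d, At, G, Bt):
    """One F(m, r) tile: elementwise product in the transform domain."""
    return At @ ((G @ g) * (Bt @ d))

# Compare floating point error for a few values of c using the symmetric
# group {c, -c, 1/c, -1/c} plus 0 and the point at infinity; F(4, 3)
# needs m + r - 1 = 6 points in total.
rng = np.random.default_rng(0)
m, r = 4, 3
g = rng.standard_normal(r)                 # filter
d = rng.standard_normal(m + r - 1)         # input tile
exact = np.correlate(d, g, mode="valid")   # direct convolution reference

for c in [4.0, 2.0, 1.5, 1.25, 1.1]:
    points = [0.0, c, -c, 1.0 / c, -1.0 / c]
    At, G, Bt = winograd_matrices(m, r, points, add_inf=True)
    err = np.max(np.abs(winograd_conv(g, d, At, G, Bt) - exact))
    print(f"c = {c:5.2f}: max abs error = {err:.2e}")
```

Sweeping c in this way and printing the maximum absolute error against direct convolution traces the kind of rough error curve over c that the paper describes, with minima that need not fall at integers or simple fractions.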
