Winograd Algorithm for AdderNet

05/12/2021
by Wenshuo Li, et al.

Adder neural network (AdderNet) is a new kind of deep model that replaces the massive multiplications in convolutions with additions while preserving high performance. Since the hardware complexity of an addition is much lower than that of a multiplication, the overall energy consumption is reduced significantly. To further optimize the hardware overhead of AdderNet, this paper studies the Winograd algorithm, a widely used fast algorithm for accelerating convolution and saving computational cost. Unfortunately, the conventional Winograd algorithm cannot be applied to AdderNets directly, since the distributive law of multiplication does not hold for the l1-norm. We therefore replace the element-wise multiplication in the Winograd equation with additions and develop a new set of transform matrices that enhance the representation ability of the output features so as to maintain performance. Moreover, we propose an l2-to-l1 training strategy to mitigate the negative impact caused by the inconsistency between the two forms. Experimental results on both FPGA and benchmarks show that the new method further reduces energy consumption without affecting the accuracy of the original AdderNet.
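To make the starting point concrete, the classic Winograd F(2,3) computation that the paper builds on can be written in a few lines of NumPy. The adder-style function below is only an illustrative sketch of where the element-wise multiplication is swapped for an addition-based (l1-style) stage; the exact transform matrices and element-wise operation derived in the paper differ, and the F(2,3) tiling and the name winograd_f23_adder_sketch are assumptions made here for illustration.

import numpy as np

# Standard Winograd F(2,3) transform matrices for a length-3 filter and a
# length-4 input tile; the paper derives a new set of transform matrices
# for the adder case, which are not reproduced here.
B_T = np.array([[1.,  0., -1.,  0.],
                [0.,  1.,  1.,  0.],
                [0., -1.,  1.,  0.],
                [0.,  1.,  0., -1.]])
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]])
A_T = np.array([[1., 1.,  1.,  0.],
                [0., 1., -1., -1.]])

def winograd_f23(d, g):
    # Classic multiplicative form: Y = A^T [ (G g) * (B^T d) ]
    return A_T @ ((G @ g) * (B_T @ d))

def winograd_f23_adder_sketch(d, g):
    # Illustrative adder-style stand-in (an assumption, not the paper's exact
    # formulation): the Hadamard product is replaced by a negative l1-style
    # term, mirroring AdderNet's -|x - w| similarity measure.
    return A_T @ (-np.abs(G @ g - B_T @ d))

d = np.array([1.0, 2.0, 3.0, 4.0])   # input tile
g = np.array([0.5, -1.0, 0.25])      # filter
print(winograd_f23(d, g))                # matches the direct sliding dot product
print(np.correlate(d, g, mode="valid"))  # reference result
print(winograd_f23_adder_sketch(d, g))   # addition-only element-wise stage

With the standard matrices, the multiplicative path reproduces the direct sliding dot product exactly; the distributive-law issue discussed in the abstract only appears once the Hadamard product is replaced by an l1-style operation, which is why the paper needs new transform matrices and the l2-to-l1 training strategy.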

Related research

12/11/2019 - DGEMM performance is data-dependent
The DGEMM function is a widely used implementation of the matrix product...

01/25/2021 - AdderNet and its Minimalist Hardware Design for Energy-Efficient Artificial Intelligence
Convolutional neural networks (CNN) have been widely used for boosting t...

07/04/2020 - A Novel Multi-Step Finite-State Automaton for Arbitrarily Deterministic Tsetlin Machine Learning
Due to the high energy consumption and scalability challenges of deep le...

12/31/2019 - AdderNet: Do We Really Need Multiplications in Deep Learning?
Compared with cheap addition operation, multiplication operation is of m...

12/20/2022 - Redistribution of Weights and Activations for AdderNet Quantization
Adder Neural Network (AdderNet) provides a new way for developing energy...

09/20/2023 - Spiking NeRF: Making Bio-inspired Neural Networks See through the Real World
Spiking neuron networks (SNNs) have been thriving on numerous tasks to l...

12/07/2019 - Generalized Data Placement Strategies for Racetrack Memories
Ultra-dense non-volatile racetrack memories (RTMs) have been investigate...
