Effects of Approximate Multiplication on Convolutional Neural Networks

07/20/2020
by   Min Soo Kim, et al.

This paper analyzes the effects of approximate multiplication when performing inferences on deep convolutional neural networks (CNNs). Approximate multiplication reduces the cost of the underlying circuits so that CNN inferences can be performed more efficiently in hardware accelerators. The study identifies the critical factors in the convolution, fully-connected, and batch normalization layers that allow more accurate CNN predictions despite the errors introduced by approximate multiplication. The same factors also provide an arithmetic explanation of why bfloat16 multiplication performs well on CNNs. The experiments are performed with recognized network architectures to show that the approximate multipliers can produce predictions that are nearly as accurate as the FP32 references, without additional training. For example, the ResNet and Inception-v4 models with Mitch-w6 multiplication produce Top-5 errors that are within 0.2% of the FP32 references. A brief cost comparison of Mitch-w6 against bfloat16 is presented, where a MAC operation saves up to 80% of energy. The most far-reaching contribution of this paper is the analytical justification that multiplications can be approximated while additions need to be exact in CNN MAC operations.
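The core claim, that the multiplications in a MAC operation tolerate approximation while the accumulating additions must stay exact, can be illustrated with a small sketch. This is not the paper's implementation; it emulates bfloat16 multiplication by truncating float32 operands to their top 16 bits (hypothetical helper names `to_bfloat16` and `approx_mac`), then accumulates the products exactly in FP32:

```python
import numpy as np

def to_bfloat16(x):
    """Emulate bfloat16 by keeping only the top 16 bits of a float32
    (1 sign + 8 exponent + 7 mantissa bits); the rest are zeroed."""
    x = np.asarray(x, dtype=np.float32)
    bits = x.view(np.uint32)
    return (bits & np.uint32(0xFFFF0000)).view(np.float32)

def approx_mac(a, b):
    """Dot product with approximate (bfloat16) multiplies
    and exact FP32 accumulation, as in a CNN MAC unit."""
    prod = to_bfloat16(a) * to_bfloat16(b)  # approximate multiplications
    return float(np.sum(prod, dtype=np.float32))  # exact additions

# Deterministic positive inputs so the relative error is easy to bound.
a = np.linspace(0.1, 1.0, 256, dtype=np.float32)
b = np.linspace(1.0, 2.0, 256, dtype=np.float32)

exact = float(np.dot(a, b))
approx = approx_mac(a, b)
rel_err = abs(approx - exact) / abs(exact)
print(f"relative error of approximate MAC: {rel_err:.2e}")
```

Each truncated operand carries at most about 2^-7 relative error, so each product is off by roughly 2^-6 at worst, yet the exactly accumulated result stays within well under 2% of the FP32 reference here, consistent with the paper's observation that approximation belongs in the multiplier, not the adder.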


