Quantization in Layer's Input is Matter

02/10/2022
by Daning Cheng, et al.

In this paper, we show that quantization of a layer's input affects the loss function more than quantization of its parameters, and that a mixed-precision layout algorithm based on the layer's input quantization error outperforms Hessian-based mixed-precision layout algorithms.

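The abstract does not spell out the algorithm itself, but a mixed-precision layout driven by input quantization error could, as a rough sketch, measure the error introduced by quantizing each layer's input activations on a calibration batch and then give the most sensitive layers a higher bit-width. The code below is a hypothetical illustration of that idea in PyTorch; the function names, the MSE error metric, and the bit-width assignment rule are assumptions for illustration, not the authors' method.

```python
import torch
import torch.nn as nn


def quantize_tensor(x: torch.Tensor, num_bits: int) -> torch.Tensor:
    """Uniform symmetric fake-quantization of a tensor to `num_bits`."""
    qmax = 2 ** (num_bits - 1) - 1
    scale = x.abs().max().clamp(min=1e-8) / qmax
    return torch.round(x / scale).clamp(-qmax, qmax) * scale


def layer_input_quant_error(model: nn.Module, calib_batch: torch.Tensor,
                            num_bits: int = 8) -> dict:
    """Per-layer MSE between each layer's full-precision input and its
    quantized version, collected via forward pre-hooks on one batch.
    (Illustrative sensitivity proxy, not the paper's exact criterion.)"""
    errors = {}

    def make_hook(name):
        def hook(module, inputs):
            x = inputs[0].detach()
            xq = quantize_tensor(x, num_bits)
            errors[name] = torch.mean((x - xq) ** 2).item()
        return hook

    handles = [m.register_forward_pre_hook(make_hook(n))
               for n, m in model.named_modules()
               if isinstance(m, (nn.Linear, nn.Conv2d))]
    with torch.no_grad():
        model(calib_batch)
    for h in handles:
        h.remove()
    return errors


def assign_bit_widths(errors: dict, high_bits: int = 8, low_bits: int = 4,
                      keep_high_ratio: float = 0.3) -> dict:
    """Keep the most input-sensitive fraction of layers at the higher
    bit-width and quantize the rest more aggressively (assumed rule)."""
    ranked = sorted(errors, key=errors.get, reverse=True)
    n_high = max(1, int(len(ranked) * keep_high_ratio))
    return {name: (high_bits if i < n_high else low_bits)
            for i, name in enumerate(ranked)}
```

A typical use would call `layer_input_quant_error(model, batch)` on a small calibration batch and pass the result to `assign_bit_widths` to obtain a per-layer precision map, in contrast to Hessian-based approaches that rank layers by curvature of the loss with respect to the parameters.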