QReg: On Regularization Effects of Quantization

In this paper we study the effects of quantization in DNN training. We hypothesize that weight quantization is a form of regularization, and that the amount of regularization is correlated with the quantization level (precision). We confirm this hypothesis through an analytical study and empirical results. By modeling weight quantization as additive noise on the weights, we explore how this noise propagates through the network during training, and we show that the magnitude of this noise is correlated with the level of quantization. To support the analytical study, we perform an extensive set of experiments, summarized in this paper, showing that the regularization effect of quantization appears across various vision tasks, models, and datasets. Based on this study, we propose that 8-bit quantization provides a reliable form of regularization in different vision tasks and models.
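The quantization-as-noise view above can be illustrated with a minimal sketch: symmetric uniform quantization of a weight vector produces a residual (quantized minus original) whose standard deviation grows as the bit width shrinks. This is an assumed toy quantizer for illustration only, not the paper's exact scheme.

```python
import random
import statistics

def quantize(weights, bits):
    # Symmetric uniform quantizer (illustrative assumption, not the
    # paper's exact scheme): map weights onto 2^(bits-1)-1 levels per sign.
    max_abs = max(abs(x) for x in weights)
    scale = max_abs / (2 ** (bits - 1) - 1)
    return [round(x / scale) * scale for x in weights]

random.seed(0)
# Toy "weight tensor": zero-mean Gaussian values, as is typical after init.
w = [random.gauss(0.0, 0.1) for _ in range(10_000)]

for bits in (8, 4, 2):
    # The quantization residual acts like additive noise on the weights;
    # its spread widens as precision drops.
    noise = [q - x for q, x in zip(quantize(w, bits), w)]
    print(f"{bits}-bit quantization noise std: {statistics.pstdev(noise):.5f}")
```

Running this shows the noise magnitude increasing monotonically from 8-bit to 2-bit, consistent with the claim that lower precision injects more regularizing noise.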


Related research

- Symmetry Regularization and Saturating Nonlinearity for Robust Quantization (07/31/2022): Robust quantization improves the tolerance of networks for various imple...
- Quantization-Guided Training for Compact TinyML Models (03/10/2021): We propose a Quantization Guided Training (QGT) method to guide DNN trai...
- Time regularization as a solution to mitigate quantization induced performance degradation (10/30/2020): Reset control is known to be able to outperform PID and the like linear ...
- RAND: Robustness Aware Norm Decay For Quantized Seq2seq Models (05/24/2023): With the rapid increase in the size of neural networks, model compressio...
- R^2: Range Regularization for Model Compression and Quantization (03/14/2023): Model parameter regularization is a widely used technique to improve gen...
- NIPQ: Noise Injection Pseudo Quantization for Automated DNN Optimization (06/02/2022): The optimization of neural networks in terms of computation cost and mem...
- Cluster-Promoting Quantization with Bit-Drop for Minimizing Network Quantization Loss (09/05/2021): Network quantization, which aims to reduce the bit-lengths of the networ...
