Data-Free Quantization with Accurate Activation Clipping and Adaptive Batch Normalization

04/08/2022
by   Yefei He, et al.

Data-free quantization compresses a neural network to low bit-width without access to the original training data. Most existing data-free quantization methods suffer severe performance degradation due to inaccurate activation clipping ranges and quantization error, especially at low bit-widths. In this paper, we present a simple yet effective data-free quantization method with accurate activation clipping and adaptive batch normalization. Accurate activation clipping (AAC) improves model accuracy by exploiting accurate activation information from the full-precision model. Adaptive batch normalization is the first approach to address the quantization error caused by distribution changes, updating the batch normalization layers adaptively. Extensive experiments demonstrate that the proposed data-free quantization method yields surprisingly strong performance, achieving 64.33% accuracy for ResNet18 on the ImageNet dataset, a 3.7% improvement over existing state-of-the-art methods.
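To make the role of the activation clipping range concrete, the sketch below shows plain uniform fake-quantization of (non-negative, post-ReLU-style) activations under a chosen clip value. It is an illustration only, not the paper's AAC algorithm: the percentile heuristic used here to pick `clip_max` from full-precision activations is a hypothetical stand-in for the accurate clipping the paper derives.

```python
import numpy as np

def fake_quantize_activations(x, clip_max, bits=4):
    """Uniform unsigned quantize-dequantize of activations on [0, clip_max].

    A smaller clip_max gives finer resolution inside the range but
    clips large activations; a larger clip_max keeps outliers but
    coarsens the grid -- hence the importance of an accurate range.
    """
    levels = 2 ** bits - 1                      # e.g. 15 levels for 4-bit
    scale = clip_max / levels
    q = np.clip(np.round(x / scale), 0, levels)  # integer grid with clipping
    return q * scale                             # dequantized values

# Hypothetical stand-in for an "accurate" range: a high percentile of
# full-precision activations, so a few outliers do not stretch the grid.
rng = np.random.default_rng(0)
acts = np.abs(rng.standard_normal(10000))       # synthetic activation sample
clip_max = float(np.percentile(acts, 99.9))     # illustrative choice only
deq = fake_quantize_activations(acts, clip_max, bits=4)
```

With 4 bits, `deq` contains at most 16 distinct values, all within `[0, clip_max]`; the quality of the chosen range directly controls the quantization error on the bulk of the distribution.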
