Fine-grained Data Distribution Alignment for Post-Training Quantization

09/09/2021
by Yunshan Zhong, et al.

While post-training quantization owes much of its popularity to not requiring access to the original, complete training dataset, its poor performance also stems from this limitation. To alleviate it, in this paper we combine the synthetic data introduced by zero-shot quantization with the calibration dataset, and we propose a fine-grained data distribution alignment (FDDA) method to boost the performance of post-training quantization. The method is based on two properties of batch normalization statistics (BNS) that we observed in the deep layers of a trained network: inter-class separation and intra-class incohesion. To preserve this fine-grained distribution information: 1) we compute the per-class BNS of the calibration dataset as the BNS center of each class and propose a BNS-centralized loss that forces the synthetic data distributions of different classes to stay close to their own centers; 2) we add Gaussian noise to the centers to imitate the incohesion and propose a BNS-distorted loss that forces the synthetic data distribution within each class to stay close to the distorted centers. With these two fine-grained losses, our method achieves state-of-the-art performance on ImageNet, especially when the first and last layers are also quantized to low bit-widths. Our project is available at https://github.com/viperit/FDDA.
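To make the two losses concrete, below is a minimal PyTorch sketch of how they could look. It assumes the synthetic batch statistics (syn_mu, syn_var) have already been collected from the relevant deep BN layers (e.g., via forward hooks) and that the per-class centers were precomputed on the calibration set; the function names, the list-of-tensors layout, and the noise scale sigma are illustrative assumptions, not the authors' released implementation (see the repository linked above).

```python
import torch

# Sketch of the two fine-grained losses described in the abstract.
# All names and shapes are illustrative assumptions, not the paper's code.

def bns_centralized_loss(syn_mu, syn_var, center_mu, center_var):
    """Pull the synthetic batch statistics of one class toward that
    class's BNS center, summed over the selected (deep) BN layers."""
    loss = 0.0
    for m_s, v_s, m_c, v_c in zip(syn_mu, syn_var, center_mu, center_var):
        loss = loss + torch.norm(m_s - m_c, p=2) ** 2 \
                    + torch.norm(v_s - v_c, p=2) ** 2
    return loss

def bns_distorted_loss(syn_mu, syn_var, center_mu, center_var, sigma=0.1):
    """Same pull, but toward Gaussian-distorted copies of the centers,
    imitating the intra-class incohesion observed in deep layers.
    sigma is a hypothetical noise-scale hyperparameter."""
    loss = 0.0
    for m_s, v_s, m_c, v_c in zip(syn_mu, syn_var, center_mu, center_var):
        m_d = m_c + sigma * torch.randn_like(m_c)  # distorted mean center
        v_d = v_c + sigma * torch.randn_like(v_c)  # distorted variance center
        loss = loss + torch.norm(m_s - m_d, p=2) ** 2 \
                    + torch.norm(v_s - v_d, p=2) ** 2
    return loss

if __name__ == "__main__":
    # Toy check with random statistics for 3 BN layers of width 8.
    syn_mu  = [torch.randn(8) for _ in range(3)]
    syn_var = [torch.rand(8)  for _ in range(3)]
    ctr_mu  = [torch.randn(8) for _ in range(3)]
    ctr_var = [torch.rand(8)  for _ in range(3)]
    print(bns_centralized_loss(syn_mu, syn_var, ctr_mu, ctr_var))
    print(bns_distorted_loss(syn_mu, syn_var, ctr_mu, ctr_var, sigma=0.1))
```

In the distorted loss, resampling the noise at each generator step is one plausible way to keep same-class synthetic samples from collapsing onto a single center.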


Related research

04/30/2022
ClusterQ: Semantic Feature Distribution Alignment for Data-Free Quantization
Network quantization has emerged as a promising method for model compres...

05/10/2023
Post-training Model Quantization Using GANs for Synthetic Data Generation
Quantization is a widely adopted technique for deep neural networks to r...

11/13/2022
Long-Range Zero-Shot Generative Deep Network Quantization
Quantization approximates a deep network model with floating-point numbe...

03/15/2023
A Comprehensive Study on Post-Training Quantization for Large Language Models
Post-training quantization (PTQ) has recently been shown as a promising ...

11/17/2021
IntraQ: Learning Synthetic Images with Intra-Class Heterogeneity for Zero-Shot Network Quantization
Learning to synthesize data has emerged as a promising direction in zero...

03/24/2022
The Fixed Sub-Center: A Better Way to Capture Data Complexity
Treating a class with a single center may hardly capture data distribution...

08/03/2022
Convolutional Fine-Grained Classification with Self-Supervised Target Relation Regularization
Fine-grained visual classification can be addressed by deep representati...
