ACQ: Improving Generative Data-free Quantization Via Attention Correction

01/18/2023
by Jixing Li, et al.

Data-free quantization aims to quantize a model without access to any authentic samples, which matters in application-oriented settings where data privacy is a concern. A popular approach, known as generative data-free quantization, uses a generator to convert noise vectors into synthetic samples. However, synthetic samples and authentic samples differ in their attention, a gap that is usually ignored and that limits quantization performance. First, synthetic samples of the same class tend to exhibit homogeneous attention, so the quantized network can learn only a limited range of attention modes. Second, synthetic samples exhibit different attention in eval mode and training mode, so batch-normalization (BN) statistics matching tends to be inaccurate. This paper proposes ACQ to correct the attention of synthetic samples. To address the homogenization of intra-class attention, an attention-center-position-conditioned generator is established: constrained by an attention center matching loss, the attention center position serves as the generator's condition input and guides synthetic samples toward diverse attention. Moreover, an adversarial loss over pairs of synthetic samples generated under the same condition prevents the generator from over-relying on the condition, which could otherwise cause mode collapse. To improve the attention similarity of synthetic samples across network modes, a consistency penalty is introduced to guarantee accurate BN statistics matching. Experimental results demonstrate that ACQ effectively alleviates the attention problems of synthetic samples and achieves the best quantization performance under various training settings. For the 4-bit quantization of Resnet18 and Resnet50, ACQ reaches 67.55...
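
The abstract describes three mechanisms: conditioning the generator on an attention center position, an adversarial loss over sample pairs sharing a condition, and BN statistics matching with a consistency penalty. As a rough illustration of the first and the last of these, the PyTorch-style sketch below shows how a 2-D attention center could be concatenated to the noise vector as a condition, and how batch statistics of synthetic samples are typically matched against a pretrained model's running BN statistics in generative data-free quantization. This is not the authors' code; ConditionalGenerator, bn_matching_loss, the layer sizes, and the L2 distance on statistics are all hypothetical choices.

    import torch
    import torch.nn as nn

    class ConditionalGenerator(nn.Module):
        # Hypothetical conditional generator: the 2-D attention center position
        # is concatenated with the noise vector and mapped to a synthetic image.
        def __init__(self, z_dim=100, cond_dim=2, img_ch=3, img_size=32):
            super().__init__()
            self.img_shape = (img_ch, img_size, img_size)
            self.net = nn.Sequential(
                nn.Linear(z_dim + cond_dim, 256),
                nn.ReLU(inplace=True),
                nn.Linear(256, img_ch * img_size * img_size),
                nn.Tanh(),
            )

        def forward(self, z, center):
            # z: (B, z_dim) noise; center: (B, 2) attention center position in [0, 1]
            x = self.net(torch.cat([z, center], dim=1))
            return x.view(-1, *self.img_shape)

    def bn_matching_loss(model, synthetic):
        # Collect the inputs of every BatchNorm2d layer via forward hooks and
        # compare their batch statistics with the stored running statistics.
        records = []
        hooks = [
            m.register_forward_hook(lambda mod, inp, out: records.append((mod, inp[0])))
            for m in model.modules() if isinstance(m, nn.BatchNorm2d)
        ]
        model(synthetic)
        for h in hooks:
            h.remove()
        loss = synthetic.new_zeros(())
        for bn, feat in records:
            mean = feat.mean(dim=(0, 2, 3))
            var = feat.var(dim=(0, 2, 3), unbiased=False)
            loss = loss + (mean - bn.running_mean).norm() + (var - bn.running_var).norm()
        return loss

    # Example usage with a hypothetical pretrained full-precision model fp_model:
    #   g = ConditionalGenerator()
    #   z = torch.randn(16, 100)
    #   center = torch.rand(16, 2)      # condition: attention center positions
    #   images = g(z, center)
    #   loss = bn_matching_loss(fp_model.eval(), images)

In the full ACQ method, an attention center matching loss would additionally constrain the synthetic sample's measured attention center to agree with the condition, and a consistency penalty would align attention between training-mode and eval-mode forward passes; both are omitted from this sketch.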

Related research

09/01/2021 · Diverse Sample Generation: Pushing the Limit of Data-free Quantization
Recently, generative data-free quantization emerges as a practical appro...

05/08/2020 · Data-Free Network Quantization With Adversarial Knowledge Distillation
Network quantization is an essential procedure in deep learning for deve...

11/04/2021 · Qimera: Data-free Quantization with Synthetic Boundary Supporting Samples
Model quantization is known as a promising method to compress deep neura...

03/01/2021 · Diversifying Sample Generation for Accurate Data-Free Quantization
Quantization has emerged as one of the most prevalent approaches to comp...

11/13/2022 · Long-Range Zero-Shot Generative Deep Network Quantization
Quantization approximates a deep network model with floating-point numbe...

02/19/2023 · Rethinking Data-Free Quantization as a Zero-Sum Game
Data-free quantization (DFQ) recovers the performance of quantized netwo...

10/14/2020 · Towards Accurate Quantization and Pruning via Data-free Knowledge Transfer
When large scale training data is available, one can obtain compact and ...
