Adaptive Data-Free Quantization

03/13/2023
by Biao Qian, et al.

Data-free quantization (DFQ) recovers the performance of a quantized network (Q) without accessing any real data; instead, it generates fake samples via a generator (G) that learns from the full-precision network (P). However, this sample generation process is entirely independent of Q, overlooking whether the knowledge carried by the generated samples is adaptable, i.e., informative to the learning process of Q, which inflates the generalization error. This raises several critical questions: how can we measure a sample's adaptability to Q under varied bit-width scenarios? How can we generate samples with large adaptability to improve Q's generalization? Is the largest adaptability always the best? To answer these questions, we propose an Adaptive Data-Free Quantization (AdaDFQ) method, which reformulates DFQ as a zero-sum game over sample adaptability between two players, a generator and a quantized network. From this viewpoint, we further define disagreement and agreement samples to form two boundaries, and optimize the margin between them to address the over- and under-fitting issues, so as to generate samples with adaptability that is desirable for Q. Our AdaDFQ reveals that: 1) the largest adaptability is NOT the best for sample generation to benefit Q's generalization; and 2) the knowledge of a generated sample should not only be informative to Q, but should also relate to the category and distribution information of the training data for P. Theoretical and empirical analyses validate the advantages of AdaDFQ over state-of-the-art methods. Our code is available at https://github.com/hfutqian/AdaDFQ.
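
The abstract does not give the concrete formulation, but the zero-sum-game idea can be sketched roughly as follows. This is a minimal illustration assuming adaptability is measured as the confidence gap between P and Q on P's predicted class, with made-up margin bounds (lower, upper) standing in for the paper's agreement/disagreement boundaries; the generator, networks, and hyperparameters here are placeholders, not the authors' implementation (see the repository above for that).

    # A minimal, self-contained sketch of the zero-sum-game view of DFQ.
    # All names, architectures, and margin values are illustrative
    # assumptions; they are NOT the paper's actual implementation.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    LATENT, CLASSES, BATCH = 64, 10, 32

    # Stand-ins for the three players: a generator G, the frozen
    # full-precision network P, and the quantized network Q being trained.
    G = nn.Sequential(nn.Linear(LATENT, 128), nn.ReLU(), nn.Linear(128, 32))
    P = nn.Sequential(nn.Linear(32, CLASSES))  # frozen teacher
    Q = nn.Sequential(nn.Linear(32, CLASSES))  # student (quantized in practice)
    for p in P.parameters():
        p.requires_grad_(False)

    def adaptability(p_logits: torch.Tensor, q_logits: torch.Tensor) -> torch.Tensor:
        """Gap between P's and Q's confidence on P's predicted class.

        A large gap marks a 'disagreement' sample, a small gap an
        'agreement' sample (our reading of the abstract, not the paper's
        exact definition).
        """
        p_prob, q_prob = F.softmax(p_logits, 1), F.softmax(q_logits, 1)
        top = p_prob.argmax(1, keepdim=True)
        return (p_prob.gather(1, top) - q_prob.gather(1, top)).squeeze(1)

    def margin_loss(gap: torch.Tensor, lower: float = 0.1, upper: float = 0.6) -> torch.Tensor:
        """Penalize samples whose adaptability leaves [lower, upper]:
        below `lower` the sample is uninformative to Q (under-fitting);
        above `upper` it is too hard or noisy (over-fitting). The margin
        values are made-up constants for illustration."""
        return (F.relu(lower - gap) + F.relu(gap - upper)).mean()

    opt_G = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_Q = torch.optim.Adam(Q.parameters(), lr=1e-4)

    for step in range(100):
        z = torch.randn(BATCH, LATENT)

        # Generator turn: seek informative (but not maximally hard) samples.
        x = G(z)
        gap = adaptability(P(x), Q(x))
        loss_G = -gap.mean() + margin_loss(gap)
        opt_G.zero_grad(); loss_G.backward(); opt_G.step()

        # Quantized-network turn: distill P's predictions on those samples.
        x = G(z).detach()
        loss_Q = F.kl_div(F.log_softmax(Q(x), 1), F.softmax(P(x), 1),
                          reduction="batchmean")
        opt_Q.zero_grad(); loss_Q.backward(); opt_Q.step()

The margin term encodes the paper's first finding: the generator is rewarded for raising the gap, but samples past the upper boundary are penalized, so the game settles on adaptability that is informative to Q rather than simply maximal.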

Related research

02/19/2023 · Rethinking Data-Free Quantization as a Zero-Sum Game
Data-free quantization (DFQ) recovers the performance of quantized netwo...

09/01/2021 · Diverse Sample Generation: Pushing the Limit of Data-free Quantization
Recently, generative data-free quantization emerges as a practical appro...

11/04/2021 · Qimera: Data-free Quantization with Synthetic Boundary Supporting Samples
Model quantization is known as a promising method to compress deep neura...

03/24/2023 · Hard Sample Matters a Lot in Zero-Shot Quantization
Zero-shot quantization (ZSQ) is promising for compressing and accelerati...

03/01/2021 · Diversifying Sample Generation for Accurate Data-Free Quantization
Quantization has emerged as one of the most prevalent approaches to comp...

05/30/2023 · Towards Accurate Data-free Quantization for Diffusion Models
In this paper, we propose an accurate data-free post-training quantizati...

09/19/2021 · JEM++: Improved Techniques for Training JEM
Joint Energy-based Model (JEM) is a recently proposed hybrid model that ...
