Rethinking Data-Free Quantization as a Zero-Sum Game

02/19/2023
by Biao Qian, et al.

Data-free quantization (DFQ) recovers the performance of a quantized network (Q) without accessing real data; instead, it generates fake samples via a generator (G) that learns from the full-precision network (P). However, this sample generation process is entirely independent of Q: it fails to consider the adaptability of the generated samples, i.e., whether they are beneficial or adversarial to the learning process of Q, which leads to non-negligible performance loss. This observation raises several crucial questions: how can we measure and exploit the adaptability of samples to Q under varied bit-width scenarios, and how can we generate samples with desirable adaptability that benefit the quantized network? These questions impel us to revisit DFQ. In this paper, we answer them from a game-theory perspective by formulating DFQ as a zero-sum game between two players, a generator and a quantized network, and propose an Adaptability-aware Sample Generation (AdaSG) method. Technically, AdaSG reformulates DFQ as a dynamic maximization-vs-minimization game anchored on sample adaptability: the maximization process aims to generate samples with desirable adaptability, while the minimization process reduces that adaptability by calibrating Q on the generated samples for performance recovery. A Balance Gap is defined to guide the game process toward stationarity, so as to maximally benefit Q. Theoretical analysis and empirical studies verify the superiority of AdaSG over the state of the art. Our code is available at https://github.com/hfutqian/AdaSG.
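
To make the max-min game concrete, below is a minimal PyTorch-style sketch of the alternating maximization/minimization loop the abstract describes. It assumes a simple disagreement measure (KL divergence between the predictions of P and Q) as the sample adaptability; all function and variable names are hypothetical illustrations under that assumption, and the paper's exact adaptability measure and Balance Gap criterion are not reproduced here.

import torch
import torch.nn.functional as F

def adaptability(p_logits, q_logits):
    # One plausible adaptability measure (an assumption, not the paper's
    # exact definition): the KL divergence between the predictions of the
    # full-precision network P and the quantized network Q. A larger
    # value means the sample is harder for Q to match.
    return F.kl_div(F.log_softmax(q_logits, dim=1),
                    F.softmax(p_logits, dim=1),
                    reduction="batchmean")

def game_step(G, P, Q, opt_G, opt_Q, batch_size=64, z_dim=100):
    # Maximization: G seeks samples whose adaptability to Q is high,
    # so the generator ascends on the adaptability measure.
    z = torch.randn(batch_size, z_dim)
    fake = G(z)
    loss_G = -adaptability(P(fake), Q(fake))
    opt_G.zero_grad()
    loss_G.backward()
    opt_G.step()

    # Minimization: Q is calibrated on the generated samples, which in
    # turn reduces their adaptability (P stays frozen throughout).
    fake = G(z).detach()
    with torch.no_grad():
        p_logits = P(fake)
    loss_Q = adaptability(p_logits, Q(fake))
    opt_Q.zero_grad()
    loss_Q.backward()
    opt_Q.step()

    # The paper's Balance Gap, which judges when this game reaches
    # stationarity, is omitted here for brevity.
    return -loss_G.item(), loss_Q.item()

Alternating these two updates plays out the zero-sum game: each player's objective is the negation of the other's, with the shared adaptability quantity serving as the payoff.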

