Zero-shot Adversarial Quantization

03/29/2021
by Yuang Liu et al.

Model quantization is a promising approach to compressing deep neural networks and accelerating inference, making it possible to deploy them on mobile and edge devices. To retain the high performance of full-precision models, most existing quantization methods fine-tune the quantized model under the assumption that the training data are accessible. However, this assumption is often unmet in practice due to data privacy and security concerns, rendering such methods inapplicable. To achieve zero-shot model quantization without access to training data, a handful of methods adopt either post-training quantization or batch normalization statistics-guided data generation for fine-tuning. However, both inevitably suffer from low performance: the former is largely empirical and lacks training support for ultra-low-precision quantization, while the latter cannot fully recover the characteristics of the original data and is often inefficient at generating diverse data. To address these issues, we propose a zero-shot adversarial quantization (ZAQ) framework that facilitates effective discrepancy estimation and knowledge transfer from a full-precision model to its quantized counterpart. This is achieved through novel two-level discrepancy modeling, which drives a generator to synthesize informative and diverse examples for optimizing the quantized model in an adversarial learning fashion. Extensive experiments on three fundamental vision tasks demonstrate the superiority of ZAQ over strong zero-shot baselines and validate the effectiveness of its main components. Code is available at <https://git.io/Jqc0y>.
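To make the adversarial scheme concrete, here is a minimal PyTorch-style sketch of the kind of generator-versus-quantized-model game the abstract describes. Everything in it is an illustrative assumption rather than the paper's implementation: the `Generator` architecture, the `zaq_step` helper, the hyperparameters, and the plain L1 output discrepancy (ZAQ's actual discrepancy is two-level and also compares intermediate feature maps). The quantized model is assumed to be trainable through fake quantization, e.g. with a straight-through estimator.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps latent noise to synthetic images (hypothetical architecture)."""
    def __init__(self, latent_dim=100, img_channels=3, img_size=32):
        super().__init__()
        self.shape = (img_channels, img_size, img_size)
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, img_channels * img_size * img_size),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z).view(z.size(0), *self.shape)

def discrepancy(p_logits, q_logits):
    # Output-level disagreement between the full-precision model P and the
    # quantized model Q. ZAQ's actual measure is two-level (it also compares
    # intermediate feature maps); plain L1 is used here for brevity.
    return (p_logits - q_logits).abs().mean()

def zaq_step(generator, fp_model, q_model, opt_g, opt_q,
             latent_dim=100, batch_size=64):
    """One round of the adversarial game. fp_model stays frozen throughout;
    q_model is assumed to support backprop via fake quantization."""
    fp_model.eval()

    # (1) Generator step: synthesize examples that MAXIMIZE the P/Q
    # discrepancy, i.e. informative samples the quantized model gets wrong.
    z = torch.randn(batch_size, latent_dim)
    x = generator(z)
    with torch.no_grad():
        p_out = fp_model(x)
    loss_g = -discrepancy(p_out, q_model(x))
    opt_g.zero_grad()
    loss_g.backward()  # also accumulates q_model grads; cleared in step (2)
    opt_g.step()

    # (2) Quantized-model step: MINIMIZE the discrepancy on fresh synthetic
    # data, transferring knowledge from the full-precision teacher.
    z = torch.randn(batch_size, latent_dim)
    x = generator(z).detach()
    with torch.no_grad():
        p_out = fp_model(x)
    loss_q = discrepancy(p_out, q_model(x))
    opt_q.zero_grad()
    loss_q.backward()
    opt_q.step()
    return loss_g.item(), loss_q.item()
```

Alternating the two steps is what makes the training data-free: the generator keeps searching for inputs where the quantized student still disagrees with the teacher, so no real training examples are needed.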

Related research

- ZeroQ: A Novel Zero Shot Quantization Framework (01/01/2020)
- Genie: Show Me the Data for Quantization (12/09/2022)
- Hard Sample Matters a Lot in Zero-Shot Quantization (03/24/2023)
- Data-Free Quantization via Mixed-Precision Compensation without Fine-Tuning (07/02/2023)
- Qimera: Data-free Quantization with Synthetic Boundary Supporting Samples (11/04/2021)
- It's All In the Teacher: Zero-Shot Quantization Brought Closer to the Teacher (03/31/2022)
- Rethinking Data-Free Quantization as a Zero-Sum Game (02/19/2023)
