Bitwidth-Adaptive Quantization-Aware Neural Network Training: A Meta-Learning Approach

07/20/2022
by Jiseok Youn, et al.

Deep neural network quantization with adaptive bitwidths has gained increasing attention because it eases model deployment on platforms with different resource budgets. In this paper, we propose a meta-learning approach to achieve this goal. Specifically, we propose MEBQAT, a simple yet effective method for bitwidth-adaptive quantization-aware training (QAT) that combines meta-learning with QAT by redefining meta-learning tasks to incorporate bitwidths. After deployment on a platform, MEBQAT allows the (meta-)trained model to be quantized to any candidate bitwidth and then to perform inference without a significant accuracy drop from quantization. Moreover, in a few-shot learning scenario, MEBQAT can also adapt a model to any bitwidth as well as to unseen target classes by adding conventional optimization-based or metric-based meta-learning. We design variants of MEBQAT to support both (1) a bitwidth-adaptive quantization scenario and (2) a new few-shot learning scenario in which quantization bitwidths and target classes are adapted jointly. We experimentally demonstrate their validity across multiple QAT schemes. Comparing their performance to (bitwidth-dedicated) QAT, existing bitwidth-adaptive QAT, and vanilla meta-learning, we find that merging bitwidths into meta-learning tasks achieves a higher level of robustness.
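
To make the idea concrete, below is a minimal, hypothetical sketch of how bitwidth-adaptive QAT can fold candidate bitwidths into the training tasks. The `fake_quantize` helper, the `QuantLinear` layer, the toy data, and the candidate bitwidth set are illustrative assumptions, not the authors' exact MEBQAT procedure.

```python
# Sketch: bitwidth-adaptive QAT where each update sees several candidate bitwidths.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

def fake_quantize(x, bits):
    """Uniform symmetric fake quantization with a straight-through estimator (assumed scheme)."""
    qmax = 2 ** (bits - 1) - 1
    scale = x.abs().max().clamp(min=1e-8) / qmax
    q = torch.round(x / scale).clamp(-qmax, qmax) * scale
    return x + (q - x).detach()  # forward uses quantized values, backward passes gradients through

class QuantLinear(nn.Linear):
    """Linear layer whose weights are fake-quantized to a bitwidth chosen at call time."""
    def forward(self, x, bits=8):
        return F.linear(x, fake_quantize(self.weight, bits), self.bias)

model = QuantLinear(32, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
candidate_bitwidths = [2, 4, 8]  # hypothetical candidate set

for step in range(100):
    x = torch.randn(64, 32)          # toy inputs standing in for a task batch
    y = torch.randint(0, 10, (64,))  # toy labels
    optimizer.zero_grad()
    # Each sampled bitwidth acts as (part of) a meta-learning task:
    # accumulate quantized-forward losses before a single shared update.
    for bits in random.sample(candidate_bitwidths, k=2):
        loss = F.cross_entropy(model(x, bits=bits), y)
        loss.backward()
    optimizer.step()
```

The point mirrors the abstract: each update aggregates losses from several bitwidths, so one set of weights remains usable across the whole candidate bitwidth set at inference time.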


