Distribution-Flexible Subset Quantization for Post-Quantizing Super-Resolution Networks

05/10/2023
by   Yunshan Zhong, et al.

This paper introduces Distribution-Flexible Subset Quantization (DFSQ), a post-training quantization method for super-resolution networks. Our motivation for developing DFSQ is based on the distinctive activation distributions of current super-resolution models, which exhibit significant variance across samples and channels. To address this issue, DFSQ conducts channel-wise normalization of the activations and applies distribution-flexible subset quantization (SQ), wherein the quantization points are selected from a universal set consisting of multi-word additive log-scale values. To expedite the selection of quantization points in SQ, we propose a fast quantization points selection strategy that uses K-means clustering to select the quantization points closest to the centroids. Compared to the common iterative exhaustive search algorithm, our strategy avoids the enumeration of all possible combinations in the universal set, reducing the time complexity from exponential to linear. Consequently, the constraint of time costs on the size of the universal set is greatly relaxed. Extensive evaluations of various super-resolution models show that DFSQ effectively retains performance even without fine-tuning. For example, when quantizing EDSRx2 on the Urban benchmark, DFSQ achieves comparable performance to full-precision counterparts on 6- and 8-bit quantization, and incurs only a 0.1 dB PSNR drop on 4-bit quantization. Code is at <https://github.com/zysxmu/DFSQ>
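The fast quantization-point selection described above can be sketched in a few lines: run one-dimensional K-means on the (normalized) activations, then snap each centroid to its nearest value in the universal set, which costs time linear in the set size rather than enumerating all subsets. The sketch below is an illustrative reconstruction, not the authors' implementation; the toy universal set of two-word sums of log-scale values (`universal_set`) and all parameter names are assumptions.

```python
import numpy as np


def universal_set(n_words=2, word_bits=3):
    """Toy universal set: all sums of n_words terms, each term being
    0 or a power of two 2^{-k} (a stand-in for the paper's multi-word
    additive log-scale values)."""
    base = [0.0] + [2.0 ** -k for k in range(word_bits)]
    vals = {0.0}

    def rec(depth, acc):
        if depth == 0:
            vals.add(acc)
            return
        for b in base:
            rec(depth - 1, acc + b)

    rec(n_words, 0.0)
    return np.array(sorted(vals))


def select_points(x, n_points, uset, iters=20, seed=0):
    """Pick quantization points for activations x: K-means centroids,
    each snapped to its nearest universal-set value (linear scan)."""
    rng = np.random.default_rng(seed)
    centroids = rng.choice(x, size=n_points, replace=False)
    for _ in range(iters):
        # Assign each activation to its nearest centroid (1-D K-means step).
        assign = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
        for j in range(n_points):
            members = x[assign == j]
            if members.size:
                centroids[j] = members.mean()
    # Snap centroids to the universal set; duplicates may collapse,
    # so the result can have fewer than n_points distinct values.
    idx = np.argmin(np.abs(uset[None, :] - centroids[:, None]), axis=1)
    return np.unique(uset[idx])
```

An exhaustive search would instead score every size-`n_points` subset of the universal set, which grows combinatorially with the set size; the snap-to-nearest step is what relaxes that constraint.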


