Dynamic Dual Trainable Bounds for Ultra-low Precision Super-Resolution Networks

03/08/2022
by Yunshan Zhong, et al.

Light-weight super-resolution (SR) models have received considerable attention because of their serviceability on mobile devices. Many efforts employ network quantization to compress SR models. However, these methods suffer from severe performance degradation when the SR models are quantized to ultra-low precision (e.g., 2-bit and 3-bit) with a low-cost layer-wise quantizer. In this paper, we identify that the performance drop comes from the contradiction between the layer-wise symmetric quantizer and the highly asymmetric activation distributions in SR models. This discrepancy leads either to wasted quantization levels or to detail loss in reconstructed images. Therefore, we propose a novel activation quantizer, referred to as Dynamic Dual Trainable Bounds (DDTB), to accommodate the asymmetry of the activations. Specifically, DDTB innovates in: 1) a layer-wise quantizer with trainable upper and lower bounds to tackle the highly asymmetric activations; 2) a dynamic gate controller that adaptively adjusts the upper and lower bounds at runtime to overcome the drastically varying activation ranges across different samples. To reduce the extra overhead, the dynamic gate controller is quantized to 2-bit and applied to only part of the SR network according to the introduced dynamic intensity. Extensive experiments demonstrate that our DDTB delivers significant performance improvements at ultra-low precision. For example, DDTB achieves a 0.70 dB PSNR increase on the Urban100 benchmark when quantizing EDSR to 2-bit and scaling output images up by x4. Code is at <https://github.com/zysxmu/DDTB>.
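To make the core idea concrete, the sketch below shows a minimal PyTorch-style activation quantizer with trainable lower and upper bounds and a small per-sample gate that rescales them at runtime. This is an illustrative assumption, not the released implementation: the class name DualBoundQuantizer, the Softplus-based gate, and the initialization values are placeholders, and the paper's actual gate design, 2-bit gate quantization, and dynamic-intensity-based layer selection are not reproduced here.

```python
# Minimal sketch (not the authors' released code) of an activation quantizer
# with trainable dual bounds and a per-sample dynamic gate.
import torch
import torch.nn as nn


class DualBoundQuantizer(nn.Module):
    """Uniform activation quantizer clipping to a learned [lower, upper] range.

    The clipped activation is mapped to 2**bits uniform levels; a
    straight-through estimator passes gradients through the rounding step.
    An optional gate produces two positive per-sample factors that rescale
    the bounds at runtime.
    """

    def __init__(self, channels, bits=2, init_lower=-1.0, init_upper=1.0, dynamic=True):
        super().__init__()
        self.levels = 2 ** bits - 1
        self.lower = nn.Parameter(torch.tensor(float(init_lower)))
        self.upper = nn.Parameter(torch.tensor(float(init_upper)))
        self.dynamic = dynamic
        if dynamic:
            # Tiny gate: global average pooling -> 1x1 conv -> Softplus,
            # giving two positive scaling factors per sample.
            self.gate = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Conv2d(channels, 2, kernel_size=1),
                nn.Softplus(),
            )

    def forward(self, x):
        lower, upper = self.lower, self.upper
        if self.dynamic:
            g = self.gate(x)                    # (N, 2, 1, 1)
            lower = lower * g[:, 0:1]           # per-sample lower bound
            upper = upper * g[:, 1:2]           # per-sample upper bound
        scale = (upper - lower).clamp(min=1e-6) / self.levels
        # Clip to the (possibly per-sample) dual bounds.
        x = torch.minimum(torch.maximum(x, lower), upper)
        # Quantize with a straight-through estimator for the rounding step.
        t = (x - lower) / scale
        t = t + (torch.round(t) - t).detach()
        return t * scale + lower


# Usage: quantize the activations of a 64-channel feature map to 2-bit.
quantizer = DualBoundQuantizer(channels=64, bits=2)
out = quantizer(torch.randn(4, 64, 32, 32))   # same shape, 4 levels within each sample's range
```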

Related research

PAMS: Quantized Super-Resolution via Parameterized Max Scale (11/09/2020)
Deep convolutional neural networks (DCNNs) have shown dominant performan...

CADyQ: Content-Aware Dynamic Quantization for Image Super-Resolution (07/21/2022)
Despite breakthrough advances in image super-resolution (SR) with convol...

Towards Clip-Free Quantized Super-Resolution Networks: How to Tame Representative Images (08/22/2023)
Super-resolution (SR) networks have been investigated for a while, with ...

Distribution-Flexible Subset Quantization for Post-Quantizing Super-Resolution Networks (05/10/2023)
This paper introduces Distribution-Flexible Subset Quantization (DFSQ), ...

Fully Quantized Image Super-Resolution Networks (11/29/2020)
With the rising popularity of intelligent mobile devices, it is of great...

Overcoming Distribution Mismatch in Quantizing Image Super-Resolution Networks (07/25/2023)
Quantization is a promising approach to reduce the high computational co...

LSQ+: Improving low-bit quantization through learnable offsets and better initialization (04/20/2020)
Unlike ReLU, newer activation functions (like Swish, H-swish, Mish) that...
