Overcoming Distribution Mismatch in Quantizing Image Super-Resolution Networks

07/25/2023
by Cheeun Hong, et al.

Quantization is a promising approach to reducing the high computational cost of image super-resolution (SR) networks. However, compared with high-level tasks such as image classification, low-bit quantization causes severe accuracy loss in SR networks. This is because the feature distributions of SR networks diverge significantly across channels and input images, making it difficult to determine a single quantization range. Existing SR quantization works address this distribution mismatch by dynamically adapting quantization ranges to the varying distributions at test time. However, such dynamic adaptation incurs additional computational costs that limit the benefits of quantization. Instead, we propose a new quantization-aware training framework that effectively Overcomes the Distribution Mismatch problem in SR networks without the need for dynamic adaptation. Intuitively, the mismatch can be reduced by directly regularizing the feature variance during training. However, we observe that variance regularization can conflict with the reconstruction loss during training and adversely impact SR accuracy. We therefore avoid the conflict between the two losses by applying variance regularization only when its gradients are cooperative with those of the reconstruction loss. Additionally, to further reduce the distribution mismatch, we introduce distribution offsets that scale or shift channel-wise features in layers with a significant mismatch. The resulting algorithm, called ODM, reduces the distribution mismatch with minimal computational overhead. Experimental results show that ODM outperforms existing SR quantization approaches at similar or lower computational cost, demonstrating the importance of addressing the distribution mismatch problem. Our code is available at https://github.com/Cheeun/ODM.
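The cooperative-gradient idea in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the names `combine_gradients`, `grad_recon`, and `grad_var` are hypothetical, and a simple positive-dot-product test stands in for whatever cooperation criterion the paper actually uses.

```python
def combine_gradients(grad_recon, grad_var, reg_weight=0.1):
    """Apply the variance-regularization gradient only when it
    cooperates with the reconstruction gradient, i.e. when the two
    gradient vectors have a positive dot product (non-conflicting
    descent directions)."""
    dot = sum(gr * gv for gr, gv in zip(grad_recon, grad_var))
    if dot > 0:
        # Cooperative case: include the weighted regularizer gradient.
        return [gr + reg_weight * gv for gr, gv in zip(grad_recon, grad_var)]
    # Conflicting case: fall back to the reconstruction gradient alone,
    # so the regularizer never degrades SR accuracy.
    return list(grad_recon)

# Cooperative example: both gradients point the same way per coordinate.
print(combine_gradients([1.0, -2.0], [0.5, -1.0]))
# Conflicting example: the regularizer opposes reconstruction and is skipped.
print(combine_gradients([1.0, 0.0], [-2.0, 0.0]))
```

The gating design means the variance penalty only ever acts as a tie-breaker in directions the reconstruction loss already agrees with, which is how the conflict between the two losses is avoided without dynamic test-time adaptation.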

Related research

- CADyQ: Content-Aware Dynamic Quantization for Image Super-Resolution (07/21/2022)
- DAQ: Distribution-Aware Quantization for Deep Image Super-Resolution Networks (12/21/2020)
- Lightweight Super-Resolution Head for Human Pose Estimation (07/31/2023)
- Distribution-Flexible Subset Quantization for Post-Quantizing Super-Resolution Networks (05/10/2023)
- Anchor-based Plain Net for Mobile Image Super-Resolution (05/20/2021)
- Achieving Super-Resolution with Redundant Sensing (05/10/2018)
- Dynamic Dual Trainable Bounds for Ultra-low Precision Super-Resolution Networks (03/08/2022)
