Content-Variant Reference Image Quality Assessment via Knowledge Distillation

02/26/2022
by   Guanghao Yin, et al.

Generally, humans are more skilled at perceiving differences between high-quality (HQ) and low-quality (LQ) images than at directly judging the quality of a single LQ image. This situation also applies to image quality assessment (IQA). Although recent no-reference (NR-IQA) methods have made great progress in predicting image quality without a reference image, they still leave room for better performance because HQ image information is not fully exploited. In contrast, full-reference (FR-IQA) methods tend to provide more reliable quality evaluation, but their practicality is limited by the requirement for pixel-level aligned reference images. To address this, we propose the first content-variant reference method via knowledge distillation (CVRKD-IQA). Specifically, we use non-aligned reference (NAR) images to introduce various prior distributions of high-quality images. Comparing the distribution differences between HQ and LQ images helps our model better assess image quality. Further, knowledge distillation transfers more HQ-LQ distribution-difference information from the FR-teacher to the NAR-student and stabilizes CVRKD-IQA's performance. Moreover, to fully mine the combined local-global information while achieving faster inference, our model directly processes multiple image patches from the input with the MLP-Mixer. Cross-dataset experiments verify that our model outperforms all NAR/NR-IQA SOTAs and even reaches performance comparable to FR-IQA methods on some occasions. Since content-variant, non-aligned reference HQ images are easy to obtain, our model can support more IQA applications with its relative robustness to content variations. Our code and more detailed supplementary materials are available at: https://github.com/guanghaoyin/CVRKD-IQA.
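The teacher-student transfer described above can be sketched as a combined training objective: the NAR-student regresses the ground-truth quality score while also imitating the FR-teacher's HQ-LQ difference features. This is a minimal illustrative sketch in plain Python; the function and argument names are hypothetical, and the paper's actual losses, weighting, and architecture may differ.

```python
def mse(a, b):
    """Mean squared error between two equal-length feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def cvrkd_distillation_loss(student_score, mos,
                            student_feats, teacher_feats,
                            alpha=0.5):
    """Hypothetical CVRKD-style objective for the NAR-student.

    student_score  -- student's predicted quality for the LQ image
    mos            -- ground-truth mean opinion score
    student_feats  -- student's HQ-LQ difference features (NAR input)
    teacher_feats  -- FR-teacher's HQ-LQ difference features (aligned input)
    alpha          -- assumed weight balancing the two terms
    """
    # Quality regression: the student must predict the human opinion score.
    quality_loss = (student_score - mos) ** 2
    # Distillation: the student's difference features imitate the teacher's,
    # transferring aligned-reference knowledge to the non-aligned setting.
    distill_loss = mse(student_feats, teacher_feats)
    return quality_loss + alpha * distill_loss
```

When the student matches both the MOS and the teacher's features, the loss is zero; any mismatch in either term raises it, so gradient descent pushes the student toward both targets at once.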

