Perceptual Quality Assessment of Immersive Images Considering Peripheral Vision Impact
Conventional images/videos are typically rendered within the central vision area of the human visual system (HVS) with uniform quality. Recent virtual reality (VR) devices with head-mounted displays (HMDs) extend the field of view (FoV) significantly to cover both central and peripheral vision areas, revealing unequal image quality sensitivity across these areas due to the non-uniform distribution of photoreceptors on the retina. We propose to study the impact of this sensitivity on subjective image quality with respect to the eccentric angle θ across different vision areas. Image quality is often controlled by the quantization stepsize q and the spatial resolution s, separately or jointly. The sensitivity impact can therefore be understood by modeling q and/or s as functions of θ, yielding self-adaptive analytical models that show impressive accuracy under independent cross validation. These models can further be applied to assign different quality weights to different regions, significantly reducing the transmission data size without subjective quality loss. As demonstrated in a gigapixel imaging system, image rendering can be sped up by about 10× with the model-guided unequal quality scales, compared with the legacy scheme using uniform quality scales everywhere.
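The eccentricity-adaptive quality weighting described above can be sketched as follows. The paper fits its own analytical models of q and s versus θ, which are not reproduced here; the falloff function, the 20° half-sensitivity constant, and the function names below are illustrative assumptions only, a minimal sketch of how a θ-dependent weight could scale the quantization stepsize toward the periphery.

```python
def quality_weight(theta_deg, half_fov_deg=55.0):
    """Hypothetical perceptual weight that decays with eccentric angle theta.

    Assumption (not from the paper): sensitivity roughly halves every
    20 degrees of eccentricity, clamped to the display's half field of view.
    """
    theta = min(max(theta_deg, 0.0), half_fov_deg)
    return 0.5 ** (theta / 20.0)

def scaled_qstep(base_q, theta_deg):
    """Enlarge the quantization stepsize q in the periphery so the encoder
    spends fewer bits where the retina is less sensitive."""
    return base_q / quality_weight(theta_deg)

# Example: the stepsize grows monotonically away from the fovea.
for theta in (0, 20, 40):
    print(theta, round(scaled_qstep(10.0, theta), 2))
```

A rendering or encoding pipeline would evaluate such a weight per tile or per block from its angular distance to the gaze center, which is how unequal quality scales across the FoV can reduce data size without perceived loss.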