Butterfly: Multiple Reference Frames Feature Propagation Mechanism for Neural Video Compression

03/06/2023
by Feng Wang, et al.

Using more reference frames can significantly improve compression efficiency in neural video compression. In low-latency scenarios, however, most existing neural video compression frameworks use only the single previous frame as a reference, and the few frameworks that reference multiple previous frames adopt only a simple multi-reference-frame propagation mechanism. In this paper, we present a more principled multi-reference-frame propagation mechanism for neural video compression, called the butterfly multi-reference frame propagation mechanism (Butterfly), which allows more effective feature fusion across multiple reference frames. This fusion yields a more accurate temporal context conditional prior for the Contextual Coding Module. In addition, when the number of already decoded frames is smaller than the required number of reference frames, we duplicate the nearest reference frame to fill the gap, which performs better than duplicating the furthest one. Experimental results show that our method significantly outperforms the previous state of the art (SOTA), and our neural codec achieves a -7.6% BD-rate saving compared with our base single-reference-frame model under the same compression configuration.
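The abstract's two main ideas, fusing features from several previously decoded frames into a temporal context prior and duplicating the nearest decoded frame when fewer references are available than required, can be sketched as follows. This is a minimal PyTorch illustration with hypothetical names (MultiRefContext, fuse); it uses a plain concatenation-and-convolution fusion in place of the paper's butterfly mechanism and is not the authors' implementation.

```python
import torch
import torch.nn as nn


class MultiRefContext(nn.Module):
    """Illustrative multi-reference feature fusion (hypothetical module,
    not the paper's actual Butterfly architecture)."""

    def __init__(self, num_refs: int = 3, channels: int = 64):
        super().__init__()
        self.num_refs = num_refs
        # Simplified fusion: concatenate reference features along channels
        # and reduce back to the working channel width.
        self.fuse = nn.Conv2d(num_refs * channels, channels, kernel_size=3, padding=1)

    def pad_references(self, decoded_feats):
        """Take the last `num_refs` decoded feature maps; if fewer frames have
        been decoded so far, duplicate the NEAREST (most recent) one, as the
        abstract suggests, rather than the furthest one."""
        refs = list(decoded_feats[-self.num_refs:])
        while len(refs) < self.num_refs:
            refs.append(refs[-1])  # duplicate the nearest decoded frame
        return refs

    def forward(self, decoded_feats):
        refs = self.pad_references(decoded_feats)
        # Fused features act as a temporal context prior for the coder.
        return self.fuse(torch.cat(refs, dim=1))


# Usage: only two frames decoded so far, but three references are required.
feats = [torch.randn(1, 64, 32, 32) for _ in range(2)]
ctx = MultiRefContext(num_refs=3, channels=64)(feats)
print(ctx.shape)  # torch.Size([1, 64, 32, 32])
```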
