Back-and-Forth prediction for deep tensor compression

02/14/2020
by Hyomin Choi, et al.

Recent AI applications such as Collaborative Intelligence with neural networks involve transferring deep feature tensors between computing devices, which necessitates tensor compression to make efficient use of the bandwidth-constrained channels between them. In this paper we present a prediction scheme called Back-and-Forth (BaF) prediction, developed for deep feature tensors, which dramatically reduces tensor size and improves compressibility. Experiments with a state-of-the-art object detector show that the proposed method significantly reduces the number of bits needed to compress feature tensors extracted from deep within the model, with negligible degradation of detection performance and without any retraining of the network weights. We achieve 62% to 75% bit savings while keeping the loss in detection accuracy to less than 1%.
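To illustrate why prediction helps compress deep feature tensors, here is a minimal sketch using a simple previous-channel predictor. This is an assumption-laden toy, not the paper's BaF scheme (which uses a more elaborate back-and-forth prediction): it only shows that entropy-coding small prediction residuals takes fewer bits than coding the quantized channels directly.

```python
# Toy demonstration of predictive coding for a feature tensor.
# NOT the authors' BaF method; a generic previous-channel predictor.
import zlib
import numpy as np

rng = np.random.default_rng(0)
# Fake "deep feature tensor": 64 channels correlated with one another,
# as neighbouring channels of real CNN features often are.
base = rng.normal(size=(32, 32))
tensor = np.stack([base + 0.1 * rng.normal(size=(32, 32)) for _ in range(64)])

# 8-bit uniform quantization of the tensor.
lo, hi = tensor.min(), tensor.max()
q = np.round((tensor - lo) / (hi - lo) * 255).astype(np.int16)

# Direct coding: entropy-code the quantized channels as-is.
direct = zlib.compress(q.astype(np.uint8).tobytes())

# Predictive coding: each channel is predicted by the previous one; only
# the residual (small, peaked around zero) is entropy-coded. Modulo-256
# storage keeps the scheme lossless while staying byte-sized.
residual = q.copy()
residual[1:] -= q[:-1]
residual_u8 = (residual % 256).astype(np.uint8)
predicted = zlib.compress(residual_u8.tobytes())

print(f"direct: {len(direct)} bytes, predictive: {len(predicted)} bytes")
```

Because the residual distribution is tightly concentrated around zero, the predictive stream compresses far better than the raw quantized channels; the same intuition, applied to real feature tensors with a stronger predictor, underlies the bit savings reported in the abstract.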


Related research

T-Basis: a Compact Representation for Neural Networks (07/13/2020)
We introduce T-Basis, a novel concept for a compact representation of a ...

Composite Binary Decomposition Networks (11/16/2018)
Binary neural networks have great resource and computing efficiency, whi...

Analysis of Latent-Space Motion for Collaborative Intelligence (02/08/2021)
When the input to a deep neural network (DNN) is a video signal, a seque...

ByteComp: Revisiting Gradient Compression in Distributed Training (05/28/2022)
Gradient compression (GC) is a promising approach to addressing the comm...

Error Resilient Collaborative Intelligence via Low-Rank Tensor Completion (05/20/2021)
In the race to bring Artificial Intelligence (AI) to the edge, collabora...

ADA-Tucker: Compressing Deep Neural Networks via Adaptive Dimension Adjustment Tucker Decomposition (06/18/2019)
Despite the recent success of deep learning models in numerous applicati...
