
Back-and-Forth prediction for deep tensor compression

by Hyomin Choi, et al.

Recent AI applications, such as collaborative intelligence with neural networks, involve transferring deep feature tensors between computing devices. This necessitates tensor compression in order to make efficient use of the bandwidth-constrained channels between devices. In this paper we present a prediction scheme called Back-and-Forth (BaF) prediction, developed for deep feature tensors, which dramatically reduces tensor size and improves compressibility. Our experiments with a state-of-the-art object detector demonstrate that the proposed method significantly reduces the number of bits needed to compress feature tensors extracted from deep within the model, with negligible degradation of detection performance and without requiring any retraining of the network weights. We achieve a 62 75 to less than 1
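The abstract does not spell out the BaF predictor itself, but the general pipeline it describes (predict part of a deep feature tensor from the rest, then code only the smaller residual) can be illustrated with a minimal sketch. Everything below is a hypothetical stand-in: the neighbour-averaging predictor and the uniform 8-bit quantizer are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

# Illustrative sketch only: the paper's Back-and-Forth (BaF) predictor is
# not specified in this abstract. We mimic the general idea of predicting
# part of a feature tensor from the rest and coding only the residual.

rng = np.random.default_rng(0)
tensor = rng.normal(size=(256, 32, 32)).astype(np.float32)  # C x H x W features

# Hypothetical predictor: estimate each odd channel from its even-channel
# neighbours on either side ("back" and "forth" along the channel axis).
even = tensor[0::2]
odd = tensor[1::2]
prediction = 0.5 * (even + np.roll(even, -1, axis=0))
residual = odd - prediction

# Uniform 8-bit quantization of the residual (a sketch, not the paper's codec).
scale = np.abs(residual).max() / 127.0
q = np.round(residual / scale).astype(np.int8)

# The decoder only needs the even channels and the quantized residual:
# it re-runs the predictor and adds the dequantized residual back.
recon_odd = prediction + q.astype(np.float32) * scale
err = np.abs(recon_odd - odd).max()
print(err <= 0.5 * scale + 1e-6)  # error bounded by half a quantization step
```

On real (correlated) feature maps the residual has far lower entropy than the raw channels, which is what makes it cheaper to entropy-code; on the random data used here the residual is merely bounded, since independent channels carry no mutual information to exploit.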

