Simple Modulo can Significantly Outperform Deep Learning-based Deepcode

08/04/2020
by Assaf Ben-Yishai, et al.

Deepcode (H. Kim et al., 2018) is a recently proposed deep-learning-based scheme for communication over the AWGN channel with noisy feedback, claimed to be superior to all previous schemes in the literature. Deepcode's use of nonlinear coding (via deep learning) was inspired by the known shortcomings (Y.-H. Kim et al., 2007) of linear feedback schemes. In 2014, we presented a nonlinear feedback coding scheme based on a combination of the classical Schalkwijk-Kailath (SK) scheme and modulo arithmetic, using a small number of elementary operations and no neural network of any kind. This Modulo-SK scheme was omitted from the performance comparisons in the Deepcode paper due to its use of common randomness (dither), and, in a later version, because it was incorrectly interpreted as a variable-length coding scheme. However, the dither in Modulo-SK serves only the standard purpose of making the performance analysis tractable, and is not required in practice. In this short note, we show that a fully deterministic Modulo-SK (without dithering) can outperform Deepcode. For example, to attain an error probability of 10^(-4) at rate 1/3, Modulo-SK requires 3 dB less feedback SNR than Deepcode. To attain an error probability of 10^(-6) with noiseless feedback, Deepcode requires 150 rounds of communication, whereas Modulo-SK requires only 15 rounds, even when the feedback is noisy (at 27 dB SNR). We further address the numerical stability issues of the original SK scheme reported in the Deepcode paper, and explain how they can be avoided. We augment this report with an online-available, fully functional MATLAB simulation of both the classical SK and Modulo-SK schemes. Finally, we note that Modulo-SK is by no means claimed to be the best possible solution; in particular, using deep learning in conjunction with modulo arithmetic might lead to better designs, and remains a fascinating direction for future research.
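For intuition, the sketch below simulates the classical SK iteration over the AWGN channel with noiseless feedback in Python, and also defines the centered modulo reduction that Modulo-SK applies to keep the fed-back signal within its power constraint. All parameters (P, snr, rounds, M) and function names here are illustrative assumptions for this note, not the authors' reference implementation; their MATLAB simulation is the authoritative source.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions for this sketch, not the paper's settings)
P      = 1.0                 # per-round transmit power
snr    = 10.0                # forward SNR = P / sigma2
sigma2 = P / snr             # forward noise variance
rounds = 15                  # interaction rounds
M      = 2 ** 5              # PAM constellation size; rate = log2(M) / rounds

def mod_centered(x, d):
    """Centered modulo: fold x into [-d/2, d/2).
    Modulo-SK folds the fed-back estimate this way so the feedback
    transmission meets its power constraint; the transmitter, which
    knows the message, can unwrap the fold."""
    return x - d * np.round(x / d)

def sk_noiseless_feedback(theta):
    """Classical SK iteration for one PAM point theta in (-1/2, 1/2),
    assuming the receiver's running estimate is fed back noiselessly."""
    # Round 1: send the message point itself, scaled to power P
    # (a uniform point on (-1/2, 1/2) has variance 1/12).
    y = np.sqrt(12 * P) * theta + np.sqrt(sigma2) * rng.normal()
    est = y / np.sqrt(12 * P)        # receiver's first estimate
    var = sigma2 / (12 * P)          # variance of the error (est - theta)
    for _ in range(rounds - 1):
        eps = est - theta                       # transmitter learns this via feedback
        x = np.sqrt(P / var) * eps              # scale the error to power P
        y = x + np.sqrt(sigma2) * rng.normal()  # forward AWGN channel
        # MMSE correction: the error variance shrinks by 1/(1 + SNR) per round
        est -= (np.sqrt(P * var) / (P + sigma2)) * y
        var *= sigma2 / (P + sigma2)
    return est

# Decode by rounding to the nearest of M PAM points spaced 1/M apart.
points = (np.arange(M) + 0.5) / M - 0.5
errors = 0
for _ in range(1000):
    theta = rng.choice(points)
    if abs(sk_noiseless_feedback(theta) - theta) >= 0.5 / M:
        errors += 1
print(f"empirical error rate over 1000 trials: {errors / 1000}")

The 1/(1 + SNR) shrinkage per round is what gives SK its doubly exponential decay of the error probability in the blocklength; the noisy-feedback Modulo-SK variant additionally folds the feedback with mod_centered, which the sketch only defines rather than exercises.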


