Simple Modulo can Significantly Outperform Deep Learning-based Deepcode

08/04/2020
by Assaf Ben-Yishai et al.

Deepcode (H. Kim et al., 2018) is a recently suggested Deep Learning-based scheme for communication over the AWGN channel with noisy feedback, claimed to be superior to all previous schemes in the literature. Deepcode's use of nonlinear coding (via Deep Learning) was inspired by known shortcomings (Y.-H. Kim et al., 2007) of linear feedback schemes. In 2014, we presented a nonlinear feedback coding scheme based on a combination of the classical Schalkwijk-Kailath (SK) scheme and modulo arithmetic, using a small number of elementary operations and no neural network of any kind. This Modulo-SK scheme was omitted from the performance comparisons in the Deepcode paper, initially due to its use of common randomness (dither), and in a later version because it was incorrectly interpreted as a variable-length coding scheme. However, the dither in Modulo-SK was used only for the standard purpose of making the performance analysis tractable, and is not required in practice. In this short note, we show that a fully deterministic Modulo-SK (without dithering) can outperform Deepcode. For example, to attain an error probability of 10^(-4) at rate 1/3, Modulo-SK requires 3 dB less feedback SNR than Deepcode. To attain an error probability of 10^(-6) with noiseless feedback, Deepcode requires 150 rounds of communication, whereas Modulo-SK requires only 15 rounds, even if the feedback is noisy (with 27 dB SNR). We further address the numerical stability issues of the original SK scheme reported in the Deepcode paper, and explain how they can be avoided. We augment this report with an online-available, fully functional Matlab simulation of both the classical SK and Modulo-SK schemes. Finally, we note that Modulo-SK is by no means claimed to be the best possible solution; in particular, using deep learning in conjunction with modulo arithmetic might lead to better designs, and remains a fascinating direction for future research.
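
For intuition, below is a minimal Python sketch of the classical SK iteration with noiseless feedback, the building block that Modulo-SK extends with modulo arithmetic. The parameters (a 32-PAM message point over N = 15 rounds, giving rate 1/3, at a forward SNR of about 3 dB) are illustrative assumptions, not the exact configuration evaluated in the note, and the code is a conceptual sketch rather than the authors' Matlab implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
snr = 2.0   # forward SNR (linear, ~3 dB); illustrative assumption
N = 15      # rounds of communication
M = 32      # 32-PAM message point => rate log2(32)/15 = 1/3

# Pick a message and map it to a unit-power PAM point theta.
m = int(rng.integers(M))
scale = np.sqrt((M**2 - 1) / 3)          # normalizes PAM to unit power
theta = (2 * m - (M - 1)) / scale

# Round 1: send theta at power snr (unit noise variance).
y = np.sqrt(snr) * theta + rng.standard_normal()
theta_hat = y / np.sqrt(snr)             # receiver's initial estimate
var = 1.0 / snr                          # variance of the estimation error

for _ in range(N - 1):
    # The transmitter learns theta_hat over the (here noiseless) feedback
    # link and retransmits the scaled estimation error at power snr.
    x = np.sqrt(snr / var) * (theta_hat - theta)
    y = x + rng.standard_normal()
    # MMSE update: the error variance shrinks by a factor (1 + snr).
    theta_hat -= np.sqrt(var * snr) / (1 + snr) * y
    var /= (1 + snr)

# Decode to the nearest PAM point.
m_hat = int(np.clip(np.round((theta_hat * scale + (M - 1)) / 2), 0, M - 1))
print(f"sent {m}, decoded {m_hat}, final error std {np.sqrt(var):.2e}")
```

Note how the transmit scaling factor sqrt(snr/var) grows exponentially across rounds; this is the source of the numerical stability issues mentioned above, and it also amplifies any feedback noise. Roughly speaking, Modulo-SK sidesteps both problems by reducing the exchanged quantities modulo a fixed interval so that all transmitted and fed-back values remain bounded.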
