Optimizing Serially Concatenated Neural Codes with Classical Decoders

12/20/2022
by Jannis Clausius et al.

To improve short-length codes, we demonstrate that classical decoders can also be used with real-valued, neural encoders, i.e., deep-learning-based codeword sequence generators. Here, the classical decoder is a valuable tool to gain insight into these neural codes and to shed light on their weaknesses. Specifically, the turbo-autoencoder is a recently developed channel coding scheme in which both encoder and decoder are replaced by neural networks. We first show that the limited receptive field of convolutional neural network (CNN)-based codes enables the application of the BCJR algorithm to optimally decode them with feasible computational complexity. These maximum a posteriori (MAP) component decoders are then used to form classical (iterative) turbo decoders for parallel or serially concatenated CNN encoders, offering close-to-maximum-likelihood (ML) decoding of the learned codes. To the best of our knowledge, this is the first time that a classical decoding algorithm is applied to a non-trivial, real-valued neural code. Furthermore, as the BCJR algorithm is fully differentiable, it is possible to train, or fine-tune, the neural encoder in an end-to-end fashion.
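To make the receptive-field argument concrete, the following is a minimal, hypothetical sketch (not the authors' code; all names and toy parameters such as W, n_sym, T and sigma are illustrative assumptions): a causal CNN encoder whose kernel spans W message bits defines a time-invariant mapping from length-W bit windows to real-valued code symbols, so the code admits a 2^(W-1)-state trellis whose branch outputs can be tabulated by a single pass of all 2^W windows through the network. A log-domain forward-backward (BCJR) recursion over that trellis then yields bit-wise MAP LLRs, and because every operation is differentiable, a loss on the decoder output back-propagates into the encoder weights.

```python
# Hypothetical sketch, NOT the authors' implementation: module/function names,
# hyper-parameters (W, n_sym, T, sigma) and the toy encoder are illustrative
# assumptions chosen only to make the receptive-field argument concrete.
import itertools
import torch

torch.manual_seed(0)

W, n_sym = 3, 2        # receptive field in bits, code symbols per message bit
T, sigma = 32, 0.5     # block length and AWGN noise standard deviation


class CnnEncoder(torch.nn.Module):
    """Causal CNN encoder: each real-valued symbol depends on <= W input bits."""

    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv1d(1, n_sym, kernel_size=W)

    def forward(self, u):                       # u: (T,) bits
        x = torch.cat([torch.zeros(W - 1), u.float()]).view(1, 1, -1)
        return torch.tanh(self.conv(x))[0].T    # (T, n_sym) code symbols


enc = CnnEncoder()

# Trellis construction: because the convolution is time invariant, running all
# 2^W bit windows through the encoder once tabulates every branch output.
windows = list(itertools.product([0, 1], repeat=W))     # oldest bit first
win_sym = torch.stack([enc(torch.tensor(w, dtype=torch.float))[-1]
                       for w in windows])               # (2^W, n_sym)
n_states = 2 ** (W - 1)                                 # state = last W-1 bits


def bcjr_llrs(y):
    """Log-domain BCJR on the CNN trellis: returns log P(u=0|y) - log P(u=1|y)."""
    # Branch metrics gamma[t, w] = log p(y_t | window w), up to a constant.
    gamma = -((y[:, None, :] - win_sym[None]) ** 2).sum(-1) / (2 * sigma ** 2)

    def lse(xs):                                # logsumexp over a Python list
        return torch.logsumexp(torch.stack(xs), dim=0)

    neg = torch.tensor(-1e9)                    # "-inf" that keeps gradients finite
    # Forward recursion; zero padding in the encoder fixes the starting state to 0.
    alpha = [[torch.tensor(0.0) if s == 0 else neg for s in range(n_states)]]
    for t in range(T):
        nxt = [[] for _ in range(n_states)]
        for s in range(n_states):
            for u in (0, 1):
                w = s * 2 + u                   # window index = (state, input bit)
                nxt[w % n_states].append(alpha[t][s] + gamma[t, w])
        alpha.append([lse(v) for v in nxt])

    # Backward recursion; the trellis is left unterminated.
    beta = [[torch.tensor(0.0)] * n_states for _ in range(T + 1)]
    for t in range(T - 1, -1, -1):
        for s in range(n_states):
            beta[t][s] = lse([gamma[t, s * 2 + u] + beta[t + 1][(s * 2 + u) % n_states]
                              for u in (0, 1)])

    # Combine alpha, gamma, beta into per-bit a posteriori LLRs.
    llr = []
    for t in range(T):
        num = lse([alpha[t][s] + gamma[t, s * 2] + beta[t + 1][(s * 2) % n_states]
                   for s in range(n_states)])           # hypotheses u_t = 0
        den = lse([alpha[t][s] + gamma[t, s * 2 + 1] + beta[t + 1][(s * 2 + 1) % n_states]
                   for s in range(n_states)])           # hypotheses u_t = 1
        llr.append(num - den)
    return torch.stack(llr)


# Demo: encode, transmit over AWGN, decode, and back-propagate through the decoder.
u = torch.randint(0, 2, (T,))
y = enc(u) + sigma * torch.randn(T, n_sym)      # real-valued codeword plus noise
llr = bcjr_llrs(y)
u_hat = (llr < 0).long()                        # LLR < 0  <=>  bit 1 more likely
print('bit errors:', (u_hat != u).sum().item())

# Every step above is differentiable, so a loss on the decoder output trains
# the encoder end to end through the channel and the BCJR recursion.
loss = torch.nn.functional.binary_cross_entropy_with_logits(-llr, u.float())
loss.backward()
print('encoder kernel grad norm:', enc.conv.weight.grad.norm().item())
```

For a deeper CNN the same construction would apply with W equal to the total receptive field, which is why a limited receptive field is what keeps optimal (MAP) decoding computationally feasible.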


Related research

10/09/2021 · ProductAE: Towards Training Larger Channel Codes based on Neural Product Codes
There have been significant research activities in recent years to autom...

01/27/2021 · Sequential decoding of high-rate Reed-Muller codes
A soft-input sequential decoder for Reed-Muller (RM) codes of length 2^m...

01/08/2018 · Duality of Channel Encoding and Decoding - Part II: Rate-1 Non-binary Convolutional Codes
This is the second part of a series of papers on a revisit to the bidire...

07/27/2021 · Iterative Reed-Muller Decoding
Reed-Muller (RM) codes are known for their good maximum likelihood (ML) ...

04/29/2021 · Serial vs. Parallel Turbo-Autoencoders and Accelerated Training for Learned Channel Codes
Attracted by its scalability towards practical codeword lengths, we revi...

09/06/2018 · Deep Learning-Based Decoding for Constrained Sequence Codes
Constrained sequence codes have been widely used in modern communication...

12/05/2017 · State spaces of convolutional codes, codings and encoders
In this paper we give a compact presentation of the theory of abstract s...
