Concatenated Classic and Neural (CCN) Codes: ConcatenatedAE

09/04/2022
by Onur Günlü, et al.

Small neural networks (NNs) used for error correction have been shown to improve on classic channel codes and to adapt to channel model changes. We extend the code dimension of any such structure by reusing the same NN, under one-hot encoding, multiple times, and serially concatenating these inner neural codes with an outer classic code. We design the NNs with identical network parameters, so that each Reed-Solomon (RS) codeword symbol is the input to a different instance of the same NN. We demonstrate significant improvements in block error probability over an additive Gaussian noise channel as compared to the small neural code alone, as well as robustness to channel model changes.
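The abstract only sketches the architecture, but the serial-concatenation idea can be illustrated in code. The following is a minimal, hypothetical sketch, not the paper's ConcatenatedAE implementation: the third-party reedsolo package supplies the outer Reed-Solomon code, and a random numpy codebook stands in for the trained inner NN encoder (a one-hot input to a linear layer simply selects a row of its weight matrix, so an untrained stand-in reduces to a random codebook). The names and parameters here (CHANNEL_USES, noise_std, nsym=8) are illustrative assumptions.

```python
# Hypothetical sketch of a concatenated classic-and-neural (CCN) code:
# outer Reed-Solomon code over GF(256), inner "neural" code reused once
# per RS codeword symbol, additive Gaussian noise channel.
import numpy as np
from reedsolo import RSCodec

rng = np.random.default_rng(0)

# Outer classic code: each byte is one RS codeword symbol.
rsc = RSCodec(nsym=8)  # 8 parity symbols, corrects up to 4 symbol errors

# Inner code: one shared mapping applied per symbol. With a one-hot input,
# a linear encoder just selects a row, so the effective codebook is a
# (256 x CHANNEL_USES) matrix, normalized here to unit-energy codewords.
# (In the paper this codebook would come from a trained autoencoder.)
CHANNEL_USES = 16
codebook = rng.standard_normal((256, CHANNEL_USES))
codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)

def inner_encode(symbol: int) -> np.ndarray:
    return codebook[symbol]  # the same "NN" is reused for every symbol

def inner_decode(y: np.ndarray) -> int:
    # Minimum-distance decoding over the 256-entry codebook (ML for AWGN).
    return int(np.argmin(np.linalg.norm(codebook - y, axis=1)))

def transmit(msg: bytes, noise_std: float) -> bytes:
    coded = rsc.encode(msg)          # outer RS encoding
    est = bytearray()
    for s in coded:                  # one inner-code use per RS symbol
        y = inner_encode(s) + noise_std * rng.standard_normal(CHANNEL_USES)
        est.append(inner_decode(y))  # noisy channel output -> symbol estimate
    # Recent reedsolo versions return (message, message+ecc, errata positions).
    return bytes(rsc.decode(bytes(est))[0])

msg = b"concatenated classic and neural codes"
print(transmit(msg, noise_std=0.05) == msg)  # expected: True
```

The division of labor mirrors the abstract: the inner nearest-neighbor decoder makes occasional symbol errors at moderate noise levels, and the outer RS code corrects up to nsym/2 = 4 such symbol errors per block.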
