Neural-Network-Optimized Degree-Specific Weights for LDPC MinSum Decoding

07/09/2021
by Linfang Wang, et al.

Neural Normalized MinSum (N-NMS) decoding delivers better frame error rate (FER) performance on linear block codes than conventional normalized MinSum (NMS) by assigning a dynamic multiplicative weight to each check-to-variable message in each iteration. Previous N-NMS efforts have primarily investigated short block codes (N < 1000), because the number of N-NMS parameters to be trained is proportional to the number of edges in the parity check matrix and the number of iterations, which imposes an impractical memory requirement when PyTorch or TensorFlow is used for training. This paper provides efficient approaches to training the parameters of N-NMS that support longer block lengths. Specifically, it introduces a family of neural 2-dimensional normalized MinSum (N-2D-NMS) decoders with various reduced parameter sets and shows how performance varies with the parameter set selected. The N-2D-NMS decoders share weights with respect to check node and/or variable node degree. Simulation results justify this approach, showing that the trained weights of N-NMS correlate strongly with check node degree, variable node degree, and iteration number. Further simulation results on the (3096,1032) protograph-based raptor-like (PBRL) code show that the N-2D-NMS decoder achieves the same FER as N-NMS with significantly fewer parameters. The N-2D-NMS decoder for the (16200,7200) DVB-S2 standard LDPC code shows a lower error floor than belief propagation. Finally, a hybrid decoding structure combining a feedforward structure with a recurrent structure is proposed. The hybrid structure matches the decoding performance of the full feedforward structure but requires significantly fewer parameters.
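As a rough illustration of the weighted min-sum rule that the decoders above build on, the sketch below computes the check-to-variable messages at one check node with multiplicative weights applied. It is a minimal NumPy sketch, not the paper's implementation: the function name and the per-edge weight list are hypothetical, and in the N-2D-NMS variants the weights would be shared by check/variable node degree and iteration rather than stored per edge.

```python
import numpy as np

def nms_check_update(v2c, weights):
    """One weighted min-sum check-node update (illustrative sketch).

    v2c     : incoming variable-to-check messages (LLRs) from the
              neighbors of a single check node.
    weights : multiplicative weight per outgoing edge; in NMS this is
              a single constant, in N-NMS it is learned per edge and
              iteration, and in N-2D-NMS it is shared by node degree.
    Returns the outgoing check-to-variable message on each edge.
    """
    v2c = np.asarray(v2c, dtype=float)
    out = np.empty(len(v2c))
    for i in range(len(v2c)):
        others = np.delete(v2c, i)        # extrinsic: exclude target edge
        sign = np.prod(np.sign(others))   # product of signs of the others
        mag = np.min(np.abs(others))      # min-sum magnitude approximation
        out[i] = weights[i] * sign * mag  # apply the multiplicative weight
    return out
```

For example, with incoming LLRs `[2.0, -1.0, 3.0]` and a uniform NMS weight of 0.75, the outgoing messages are `[-0.75, 1.5, -0.75]`; replacing the uniform weight with degree- and iteration-dependent values is exactly the degree of freedom the neural decoders train.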
