Guessing Random Additive Noise Decoding of Network Coded Data Transmitted over Burst Error Channels

10/14/2022
by Ioannis Chatzigeorgiou, et al.

We consider a transmitter that encodes data packets using network coding and broadcasts coded packets. A receiver employing network decoding recovers the data packets if a sufficient number of error-free coded packets are gathered. The receiver does not abandon its efforts to recover the data packets if network decoding is unsuccessful; instead, it employs syndrome decoding (SD) in an effort to repair erroneously received coded packets, and then reattempts network decoding. Most decoding techniques, including SD, assume that errors are independently and identically distributed within received coded packets. Motivated by the guessing random additive noise decoding (GRAND) framework, we propose transversal GRAND (T-GRAND): an algorithm that exploits statistical dependence in the occurrence of errors, complements network decoding and recovers all data packets with a higher probability than SD. T-GRAND examines error vectors in order of their likelihood of occurring and altering the transmitted packets. Calculation and sorting of the likelihood values of all error vectors is a simple but computationally expensive process. To reduce the complexity of T-GRAND, we take advantage of the properties of the likelihood function and develop an efficient method, which identifies the most likely error vectors without computing and ordering their likelihoods.
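To illustrate the guessing-noise principle behind the GRAND framework referenced above, the following is a minimal Python sketch of basic GRAND under an i.i.d. (memoryless) error assumption: candidate error vectors are tested in increasing Hamming-weight order, which is the maximum-likelihood order when the bit-error probability is below 0.5, and the first guess that yields a zero syndrome is accepted. This is not the paper's T-GRAND, which orders error vectors using burst-error statistics and avoids explicit likelihood sorting; the function name `grand_decode`, the `max_weight` cutoff, and the example parity-check matrix are hypothetical choices for illustration.

```python
import itertools
import numpy as np

def grand_decode(y, H, max_weight=3):
    """Basic GRAND over GF(2) under an i.i.d. error assumption (sketch).

    Tests error patterns in increasing Hamming weight and returns the
    first candidate whose syndrome is zero, i.e. a valid codeword.
    y: received binary vector (1-D array of 0/1).
    H: parity-check matrix ((n-k) x n array of 0/1).
    """
    n = len(y)
    for w in range(0, max_weight + 1):          # weight 0 checks y itself
        for positions in itertools.combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(positions)] = 1
            candidate = (y + e) % 2             # remove the guessed noise
            if not np.any(H @ candidate % 2):   # zero syndrome -> codeword found
                return candidate, e
    return None, None                           # abandon decoding after max_weight

# Example with a (7,4) Hamming code parity-check matrix (illustrative only).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
received = np.array([1, 0, 1, 1, 0, 1, 0])      # codeword 1011010 with one bit flipped
decoded, error = grand_decode(received, H)
print("codeword:", decoded, "error pattern:", error)
```

Under this i.i.d. assumption the likelihood of an error vector depends only on its weight, so no explicit likelihood computation or sorting is needed; the paper's contribution lies in achieving an analogous ordering when errors occur in bursts and are therefore statistically dependent.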
