Complexity and Second Moment of the Mathematical Theory of Communication

07/13/2021
by   Hsin-Po Wang, et al.

The performance of an error-correcting code is evaluated by its error probability, rate, and en/decoding complexity. The performance of a series of codes is evaluated, as the block lengths approach infinity, by whether their error probabilities decay to zero, whether their rates converge to capacity, and whether the growth of their complexities stays under control. Over any discrete memoryless channel, I build codes such that: (1) their error probabilities and rates scale like those of random codes; and (2) their en/decoding complexities scale like those of polar codes. Quantitatively, for any constants p, r > 0 such that p + 2r < 1, I construct a series of codes with block length N approaching infinity, error probability exp(-N^p), rate N^-r below capacity, and en/decoding complexity O(N log N) per block.

Over any discrete memoryless channel, I also build codes such that: (1) they achieve capacity rapidly; and (2) their en/decoding complexities outperform those of all known codes over non-BEC channels. Quantitatively, for any constants t, r > 0 such that 2r < 1, I construct a series of codes with block length N approaching infinity, error probability exp(-(log N)^t), rate N^-r below capacity, and en/decoding complexity O(N log log N) per block.

These two results rest on two pillars: a versatile framework that generates codes on the basis of channel polarization, and a calculus-probability machinery that evaluates the performance of codes. Both the code-generating framework and the code-evaluating machinery extend to many other scenarios in network information theory, among them lossless compression, lossy compression, Slepian-Wolf coding, Wyner-Ziv coding, the multiple access channel, the wiretap channel, and the broadcast channel. In each scenario, the adapted notions of error probability and rate approach their limits at the same paces as specified above.
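
The parameter trade-offs above are easy to evaluate numerically. Below is a minimal, illustrative Python sketch (not from the paper) that, for sample constants satisfying p + 2r < 1 and 2r < 1 respectively, reports the guaranteed error-probability bound, the gap to capacity, and the order of en/decoding complexity as the block length N grows; the function names and the chosen constants are assumptions for illustration only.

import math

def moderate_deviations_scaling(N, p, r):
    # Illustrative scaling of the first result: for constants p, r > 0 with
    # p + 2r < 1, error probability exp(-N^p), rate N^-r below capacity,
    # and O(N log N) en/decoding complexity per block (constants omitted).
    assert p > 0 and r > 0 and p + 2 * r < 1, "requires p + 2r < 1"
    return {
        "error_probability": math.exp(-N ** p),
        "gap_to_capacity": N ** (-r),
        "complexity_order": N * math.log2(N),
    }

def log_logarithmic_scaling(N, t, r):
    # Illustrative scaling of the second result: for constants t, r > 0 with
    # 2r < 1, error probability exp(-(log N)^t), rate N^-r below capacity,
    # and O(N log log N) en/decoding complexity per block (constants omitted).
    assert t > 0 and r > 0 and 2 * r < 1, "requires 2r < 1"
    return {
        "error_probability": math.exp(-math.log(N) ** t),
        "gap_to_capacity": N ** (-r),
        "complexity_order": N * math.log2(math.log2(N)),
    }

if __name__ == "__main__":
    # p = 0.4, r = 0.2 satisfies p + 2r < 1; t = 2, r = 0.3 satisfies 2r < 1.
    for N in (2 ** 10, 2 ** 16, 2 ** 20):
        print(N, moderate_deviations_scaling(N, p=0.4, r=0.2))
        print(N, log_logarithmic_scaling(N, t=2.0, r=0.3))

With N = 2^20 and p = 0.4, for instance, N^p = 2^8 = 256, so the first regime already guarantees an error probability of about exp(-256).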
