## 1 Introduction

While the channel capacity reflects a theoretical upper bound on the achievable information transmission rate in the limit of infinitely many bits, it does not characterise the information transfer of a given encoding routine with finitely many bits. In this note, we characterise the quality of a code (i.e., a given encoding routine) by an upper bound on the expected minimum error probability that can be achieved when using this code. We show that for equientropic channels this upper bound is minimal for codes with maximal marginal entropy. As an instructive example we show for the additive white Gaussian noise (AWGN) channel that random coding, which is also capacity achieving, indeed maximises the marginal entropy in the limit of infinitely many messages.

## 2 Upper bounding the expected minimum error probability

Consider communication over *noisy memoryless channels*

$$ M \xrightarrow{\ \mathrm{Enc}\ } X = (X_1, \dots, X_n) \xrightarrow{\ P(Y \mid X)\ } Y = (Y_1, \dots, Y_n) \xrightarrow{\ \mathrm{Dec}\ } \widehat{M}, $$

where the sender node $X = (X_1, \dots, X_n)$ is a random variable taking discrete values $x \in \mathcal{X}^n$ according to $P_X$; the values of the sender bits are determined by the encoder function $\mathrm{Enc} \colon \mathcal{M} \to \mathcal{X}^n$ assigning codewords to messages; noise corruption of the received bits is governed by the conditional distribution as $P(Y \mid X) = \prod_{i=1}^{n} P(Y_i \mid X_i)$;^{1} and the decoder $\mathrm{Dec}$ reconstructs a message $\widehat{M}$ from the received bit values or declares an error. The message distribution $P_M$, the encoder $\mathrm{Enc}$, the channel $P(Y \mid X)$, and the decoder $\mathrm{Dec}$ fully determine the distribution of the receiver node $Y$ and as such the probability of error $P(\widehat{M} \neq M)$.

^{1} To ease notation we assume $P(Y_i \mid X_i) = P(Y_1 \mid X_1)$ for all $i$. The results presented in this manuscript only require a memoryless channel and still hold true if noise corruption is bit-specific, i.e., $P(Y \mid X) = \prod_{i=1}^{n} P_i(Y_i \mid X_i)$.

Hence, for given message distribution $P_M$ and channel $P(Y \mid X)$ the *code*, that is the choice of $\mathrm{Enc}$ (and corresponding $X$), fully determines the behaviour of information transmission. The minimum probability of error is attained by the maximum a posteriori (MAP) decoder $\mathrm{Dec}_{\mathrm{MAP}}(y) = \arg\max_{m} P(M = m \mid Y = y)$. Thus, for any code $\mathrm{Enc}$, the expected minimum error probability is the MAP error $P_{\mathrm{err}}^{\mathrm{MAP}} = P(\mathrm{Dec}_{\mathrm{MAP}}(Y) \neq M)$.
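These quantities can be made concrete in a small numerical sketch (the toy channel, code, and all names below are illustrative and not taken from the note): a brute-force MAP decoder for two equiprobable messages sent over a binary symmetric channel with a 3-bit repetition code, together with the exact resulting MAP error.

```python
import itertools
from fractions import Fraction

# Toy setup (illustrative): two equiprobable messages, a 3-bit repetition
# encoder Enc, and a binary symmetric channel with flip probability eps.
eps = Fraction(1, 10)
messages = {0: (0, 0, 0), 1: (1, 1, 1)}          # Enc(m)
p_m = {0: Fraction(1, 2), 1: Fraction(1, 2)}     # message distribution P_M

def p_y_given_x(y, x):
    """Memoryless channel: product of per-bit flip/no-flip probabilities."""
    p = Fraction(1)
    for yi, xi in zip(y, x):
        p *= (1 - eps) if yi == xi else eps
    return p

def map_decode(y):
    """MAP decoder: argmax_m P(M=m | Y=y), proportional to P(m) P(y | Enc(m))."""
    return max(messages, key=lambda m: p_m[m] * p_y_given_x(y, messages[m]))

# Exact MAP error: probability mass of all (m, y) pairs decoded incorrectly.
p_err = sum(
    p_m[m] * p_y_given_x(y, messages[m])
    for m in messages
    for y in itertools.product((0, 1), repeat=3)
    if map_decode(y) != m
)
print(p_err)  # → 7/250, i.e. 3*eps^2*(1-eps) + eps^3 = 0.028
```

Exact rational arithmetic makes the result easy to verify by hand: the repetition code errs exactly when two or more bits flip.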
We characterise the quality of a code by the following Proposition.

###### Proposition 1.

For communication of a message $M$ with finite range $\mathcal{M}$ over a noisy memoryless channel using $n$ bits the MAP error can be bounded in terms of the mutual information $I(M; Y)$ as

$$ \Phi^{-1}\big(H(M) - I(M; Y)\big) \;\le\; P_{\mathrm{err}}^{\mathrm{MAP}} \;\le\; \phi^{-1}\big(H(M) - I(M; Y)\big), $$

where $\phi^{-1}$ and $\Phi^{-1}$ are strictly monotonically increasing functions.

###### Proof.

[FM94, Theorem 1] establishes the following relation (notation adapted)

$$ \phi\big(P_{\mathrm{err}}^{\mathrm{MAP}}\big) \;\le\; H(M \mid Y) \;\le\; \Phi\big(P_{\mathrm{err}}^{\mathrm{MAP}}\big), $$

where $\phi$ and $\Phi$ are continuous and strictly monotonically increasing, hence invertible, functions (cf. [FM94] for their definitions). Recall $I(M; Y) = H(M) - H(M \mid Y)$ and note that $H(M)$ is fixed for fixed $P_M$. The inequality follows by applying $\phi^{-1}$ and $\Phi^{-1}$, which are strictly monotonically increasing functions, to $H(M \mid Y) = H(M) - I(M; Y)$. ∎

That is, codes that result in high mutual information $I(M; Y)$ result in a low upper bound on the MAP error. Since $Y$ depends on $M$ only through $X = \mathrm{Enc}(M)$, we have $H(Y \mid M) = H(Y \mid X)$ and thus $I(M; Y) = H(Y) - H(Y \mid X)$. In particular, of all codes resulting in the same conditional entropy $H(Y \mid X)$, a code with maximal entropy $H(Y)$ has the lowest upper bound on the MAP error. The following propositions simplify this result for equientropic channels and independent additive noise channels: the lowest upper bound on the MAP error is achieved by codes that maximise the entropy of the receiver bits $H(Y)$ and the entropy of the sender bits $H(X)$, respectively.
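The claim above can be checked by brute force on a toy example (all parameters and names are illustrative): over a binary symmetric channel, two codes share the same $H(Y \mid X)$, but the code with higher receiver entropy $H(Y)$ achieves the lower MAP error.

```python
import itertools
import math

eps = 0.1   # flip probability of the binary symmetric channel (illustrative)
n = 2       # bits per codeword

def p_y_given_x(y, x):
    return math.prod((1 - eps) if yi == xi else eps for yi, xi in zip(y, x))

def entropy(dist):
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def analyse(enc, p_m):
    """Return (H(Y), H(Y|X), MAP error) for encoder enc over the channel."""
    ys = list(itertools.product((0, 1), repeat=n))
    p_y = {y: sum(p_m[m] * p_y_given_x(y, enc[m]) for m in enc) for y in ys}
    # H(Y|X) = sum_x P(x) H(Y | X = x), with P(x) induced by Enc and P_M.
    p_x = {}
    for m, x in enc.items():
        p_x[x] = p_x.get(x, 0.0) + p_m[m]
    h_y_given_x = sum(px * entropy({y: p_y_given_x(y, x) for y in ys})
                      for x, px in p_x.items())
    # MAP error = 1 - sum_y max_m P(m) P(y | Enc(m)).
    p_err = 1 - sum(max(p_m[m] * p_y_given_x(y, enc[m]) for m in enc) for y in ys)
    return entropy(p_y), h_y_given_x, p_err

p_m = {0: 0.5, 1: 0.5}
codes = {"A": {0: (0, 0), 1: (1, 1)},   # distinct codewords: high H(X), H(Y)
         "B": {0: (0, 0), 1: (0, 0)}}   # degenerate code: H(X) = 0
results = {name: analyse(enc, p_m) for name, enc in codes.items()}
for name, (h_y, h_yx, p_err) in results.items():
    print(name, round(h_y, 3), round(h_yx, 3), round(p_err, 3))
# A: H(Y) ≈ 1.680, H(Y|X) ≈ 0.938, MAP error ≈ 0.1
# B: H(Y) ≈ 0.938, H(Y|X) ≈ 0.938, MAP error = 0.5
```

The binary symmetric channel is equientropic ($H(Y_i \mid X_i = x) = h(\varepsilon)$ for every $x$), so $H(Y \mid X)$ is identical for both codes, as Proposition 3 below predicts; only $H(Y)$ separates them.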

###### Definition 2.

A noisy memoryless channel with $H(Y_i \mid X_i = x) = H(Y_i \mid X_i = x')$ for all $i$ and all $x, x' \in \mathcal{X}$ is an *equientropic channel*.

###### Proposition 3.

For equientropic channels the conditional entropy $H(Y \mid X)$ is independent of the choice of $\mathrm{Enc}$.

###### Proof.

The channel is memoryless such that $H(Y \mid X) = \sum_{i=1}^{n} H(Y_i \mid X_i)$. For any $i$ and any fixed $x' \in \mathcal{X}$

$$ H(Y_i \mid X_i) = \sum_{x \in \mathcal{X}} P(X_i = x) \, H(Y_i \mid X_i = x) = H(Y_i \mid X_i = x'), $$

which shows that $H(Y_i \mid X_i)$ and hence $H(Y \mid X)$ is independent of the choice of $\mathrm{Enc}$. ∎

###### Definition 4.

A noisy memoryless channel with $Y_i = X_i + N_i$ for mutually independent noise variables $N_1, \dots, N_n$ that are independent of $X$ is an *independent additive noise channel*.
Independent additive noise channels are equientropic channels, since $H(Y_i \mid X_i = x) = H(N_i)$ for all $x$.

###### Proposition 5.

For independent additive noise channels with noise variables $N_1, \dots, N_n$ the entropy of the receiver bits $H(Y)$ only depends on the choice of $\mathrm{Enc}$ via the entropy of the sender bits $H(X)$.

In conclusion, optimality of a code for communication over a noisy memoryless channel with message distribution $P_M$ can be characterised by the upper bound on the MAP error that results from this code. The respective bounds for different channels are summarised in Table 1. Importantly, without knowing specific details about the channel and decoder, maximising entropy turns out to be a sensible heuristic for learning a robust coding routine. Intuitively, high-entropy distributed codes are more robust against independent noise.

| Channel type | Upper bound on the MAP error |
|---|---|
| noisy memoryless channel | $\phi^{-1}\big(H(M) - I(M; Y)\big)$ |
| equientropic channel | $\phi^{-1}\big(H(M) - H(Y) + \sum_{i=1}^{n} H(Y_i \mid X_i)\big)$ |
| independent additive noise channel | $\phi^{-1}\big(H(M) - H(Y) + \sum_{i=1}^{n} H(N_i)\big)$ |

Table 1: Upper bounds on the MAP error for the different channel types, where $\phi$ is the strictly monotonically increasing function of [FM94, Theorem 1].

## 3 AWGN random coding example

The AWGN channel is a ubiquitous and well-understood channel model. Here it serves as an instructive example for the concept introduced in the previous section.

The AWGN channel is an independent additive noise channel and described by

$$ Y_i = g X_i + N_i, \qquad N_i \sim \mathcal{N}(0, \sigma^2) \ \text{i.i.d.}, $$

where $g$ is the channel gain and $\sigma^2$ the noise level. We employ the power constraint that each codeword $x = \mathrm{Enc}(m)$ has to satisfy

$$ \frac{1}{n} \sum_{i=1}^{n} x_i^2 \le P, $$

and without loss of generality assume $g = 1$ such that the received power is $P + \sigma^2$. The Shannon–Hartley theorem establishes the channel capacity

$$ C = \frac{1}{2} \log_2\!\left(1 + \frac{P}{\sigma^2}\right) \ \text{bits per channel use.} $$

Achievability of this upper bound on the rate is commonly proven by random coding, i.e., for any rate $R < C$ the error probability tends to zero as $n \to \infty$ if using random coding.
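As a quick numerical companion (the parameter values are arbitrary), the Shannon–Hartley capacity can be evaluated directly from the signal-to-noise ratio $P/\sigma^2$:

```python
import math

def awgn_capacity(snr):
    """Shannon-Hartley capacity in bits per channel use, snr = P / sigma^2."""
    return 0.5 * math.log2(1 + snr)

# Capacity grows only logarithmically with the signal-to-noise ratio.
for snr in (1, 3, 15):
    print(snr, awgn_capacity(snr))  # → 0.5, 1.0, 2.0 bits respectively
```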

Here we show that random coding not only achieves the optimal rate but also the lowest upper bound on the MAP error in Proposition 1, since $H(Y) \to \sum_{i=1}^{n} H(Y_i)$ (and the $Y_i$ are Gaussian, maximising the individual entropies under the power constraint) in the limit $|\mathcal{M}| \to \infty$.

In random coding the encoder function is defined by a random codebook, i.e., an independent sample of $\mathcal{N}(0, P)^n$ is assigned to each message $m$ as codeword $c^m = \mathrm{Enc}(m)$. Once a codebook is fixed and we observe samples of the system, each receiver bit $Y_i$ is a mixture of Gaussians with probability density function (pdf)

$$ p_{Y_i}(y) = \frac{1}{|\mathcal{M}|} \sum_{m \in \mathcal{M}} \varphi\big(y; c_i^m, \sigma^2\big) $$

for uniformly distributed messages, where $\varphi(y; \mu, \sigma^2)$ denotes the pdf of the Gaussian distribution $\mathcal{N}(\mu, \sigma^2)$ evaluated at $y$. For this setup we prove the following

###### Proposition 6.

Using random coding in the AWGN channel, the joint entropy $H(Y_{i_1}, \dots, Y_{i_k}) \to \sum_{j=1}^{k} H(Y_{i_j})$ as $|\mathcal{M}| \to \infty$ for any number $k$ of pairwise different receiver bits $Y_{i_1}, \dots, Y_{i_k}$. Furthermore, the distribution of each $Y_i$ approaches the Gaussian distribution $\mathcal{N}(0, P + \sigma^2)$ as $|\mathcal{M}| \to \infty$.

###### Proof.

In random coding the random codebook is generated by drawing each codeword from independent random variables $C_i^m \sim \mathcal{N}(0, P)$, which then defines the joint pdf

$$ p_{Y_{i_1}, \dots, Y_{i_k}}(y_1, \dots, y_k) = \frac{1}{|\mathcal{M}|} \sum_{m \in \mathcal{M}} \prod_{j=1}^{k} \varphi\big(y_j; C_{i_j}^m, \sigma^2\big) $$

and marginal pdfs

$$ p_{Y_{i_j}}(y_j) = \frac{1}{|\mathcal{M}|} \sum_{m \in \mathcal{M}} \varphi\big(y_j; C_{i_j}^m, \sigma^2\big) $$

for $j = 1, \dots, k$ and $y_j \in \mathbb{R}$. In general $p_{Y_{i_1}, \dots, Y_{i_k}} \neq \prod_{j=1}^{k} p_{Y_{i_j}}$.

For all $m \in \mathcal{M}$ and $y_1, \dots, y_k \in \mathbb{R}$ define the random variables

$$ A_m = \prod_{j=1}^{k} \varphi\big(y_j; C_{i_j}^m, \sigma^2\big) $$

and

$$ B_m^j = \varphi\big(y_j; C_{i_j}^m, \sigma^2\big). $$

By the law of large numbers

$$ \frac{1}{|\mathcal{M}|} \sum_{m \in \mathcal{M}} A_m \;\xrightarrow{|\mathcal{M}| \to \infty}\; \mathbb{E}[A_1] = \prod_{j=1}^{k} \mathbb{E}\big[B_1^j\big] \qquad \text{and} \qquad \frac{1}{|\mathcal{M}|} \sum_{m \in \mathcal{M}} B_m^j \;\xrightarrow{|\mathcal{M}| \to \infty}\; \mathbb{E}\big[B_1^j\big], $$

where the first expectation factorises since the $C_{i_1}^m, \dots, C_{i_k}^m$ are mutually independent. It follows that for all $y_1, \dots, y_k$

$$ p_{Y_{i_1}, \dots, Y_{i_k}}(y_1, \dots, y_k) \;\xrightarrow{|\mathcal{M}| \to \infty}\; \prod_{j=1}^{k} p_{Y_{i_j}}(y_j), $$

such that in the limit the pdf indeed factorises. Evaluating the expectation above we find that

$$ \mathbb{E}\big[B_1^j\big] = \int \varphi\big(y_j; c, \sigma^2\big) \, \varphi(c; 0, P) \, \mathrm{d}c = \varphi\big(y_j; 0, P + \sigma^2\big) $$

for each $j$ and $y_j$, which concludes the proof. ∎

It is instructive to consider the analogous statement $H(X_{i_1}, \dots, X_{i_k}) \to \sum_{j=1}^{k} H(X_{i_j})$ for any pairwise different sender bits $X_{i_1}, \dots, X_{i_k}$. The proof follows analogous arguments and is another illustration of the fact that in independent additive noise channels the bound on the MAP error is fully determined by the entropy of the sender bits $H(X)$.
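Proposition 6 can also be probed by simulation (a sketch; the power, noise level, codebook size, and sample counts below are arbitrary choices): draw a random Gaussian codebook, pass uniformly drawn messages through the AWGN channel, and check that a receiver bit's sample moments approach those of $\mathcal{N}(0, P + \sigma^2)$ while two distinct receiver bits are close to uncorrelated for a large codebook.

```python
import random
import statistics

random.seed(0)
P, sigma2 = 1.0, 0.25      # power constraint and noise level (illustrative)
n_samples = 200_000

def receiver_samples(n_messages):
    """Sample pairs (Y_1, Y_2) for a fresh random codebook of given size."""
    # Random codebook: two codeword entries per message, drawn i.i.d. N(0, P).
    book = [(random.gauss(0, P ** 0.5), random.gauss(0, P ** 0.5))
            for _ in range(n_messages)]
    # Uniform message draw, then independent additive Gaussian noise per bit.
    return [(c1 + random.gauss(0, sigma2 ** 0.5),
             c2 + random.gauss(0, sigma2 ** 0.5))
            for c1, c2 in (random.choice(book) for _ in range(n_samples))]

ys = receiver_samples(n_messages=10_000)
y1 = [a for a, _ in ys]
y2 = [b for _, b in ys]
m1, m2 = statistics.fmean(y1), statistics.fmean(y2)
var1 = statistics.fmean((a - m1) ** 2 for a in y1)
cov = statistics.fmean((a - m1) * (b - m2) for a, b in ys)
print(round(var1, 2), round(cov, 3))  # variance close to P + sigma2 = 1.25
```

The empirical variance of $Y_1$ approaches $P + \sigma^2$ and the empirical covariance between $Y_1$ and $Y_2$ approaches zero, mirroring the factorisation and Gaussianisation in the proof above.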

## 4 Further thoughts

According to the efficient coding hypothesis the brain implements an efficient code for representing sensory input by neuronal spiking

[Bar61]. Observed dependencies between neurons, and hence redundancies, are sometimes viewed as contradicting the efficient coding hypothesis [Bar61, Sim03]. The results presented in Section 2 clarify, however, that an optimal code should maximise the joint entropy of the receiver (or sender) bits. For fixed marginal entropies the maximum is indeed achieved if all units are mutually independent. However, since the marginal entropies are not fixed, there can in general be configurations that have higher joint entropy while the units are not mutually independent. This also clarifies the intuition expressed in Shannon's early work that the transmitted signals should approximate white noise to approximate the maximum information rate

[Sha48, Section 25].

## References

- [Bar61] H. Barlow. Possible Principles Underlying the Transformations of Sensory Messages. In W. Rosenblith, editor, Sensory Communication, chapter 13, pages 217–234. MIT Press, 1961.
- [FM94] M. Feder and N. Merhav. Relations between entropy and error probability. IEEE Transactions on Information Theory, 40(1):259–266, 1994.
- [Sha48] C. E. Shannon. A Mathematical Theory of Communication. Bell System Technical Journal, 27(3):379–423, 1948.
- [Sim03] E. Simoncelli. Vision and the statistics of the visual environment. Current Opinion in Neurobiology, 13(2):144–149, 2003.
