The Most Informative Order Statistic and its Application to Image Denoising
We consider the problem of finding the subset of order statistics that contains the most information about a sample of random variables drawn independently from a known parametric distribution. We leverage information-theoretic quantities, such as entropy and mutual information, to quantify the level of informativeness and rigorously characterize the amount of information contained in any subset of the complete collection of order statistics. As an example, we show how these informativeness metrics can be evaluated for a sample of discrete Bernoulli and continuous Uniform random variables. Finally, we show how the most informative order statistics framework can be applied to image processing. Specifically, we investigate how the proposed measures can be used to choose the coefficients of an L-estimator filter to denoise an image corrupted by random noise. We show that for both discrete (e.g., salt-and-pepper) and continuous (e.g., mixed Gaussian) noise distributions, the proposed method is competitive with off-the-shelf filters, such as the median and total variation filters, as well as with wavelet-based denoising methods.
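To make the two ingredients concrete, the sketch below is a minimal illustration, not the paper's exact procedure. It (1) computes the marginal differential entropy of each order statistic of n i.i.d. Uniform(0,1) samples, using the standard fact that the k-th order statistic is Beta(k, n-k+1) distributed, as one simple proxy for informativeness, and (2) applies a generic L-estimator filter, i.e., a sliding-window filter whose output is a weighted sum of the sorted pixel values in each window. The weight vector `w` is a free parameter here (the median filter is the special case that puts all weight on the middle order statistic); the paper's contribution is to select these weights via its information-theoretic criterion, which is not reproduced in this sketch.

```python
# Illustrative sketch only; the weight-selection rule is a placeholder,
# not the paper's information-theoretic procedure.
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view
from scipy.stats import beta


def uniform_order_stat_entropies(n: int) -> np.ndarray:
    """Differential entropy of X_(k), k = 1..n, for n i.i.d. Uniform(0,1) samples.

    X_(k) ~ Beta(k, n - k + 1), so the entropy of each order statistic
    follows directly from the Beta distribution.
    """
    return np.array([beta(k, n - k + 1).entropy() for k in range(1, n + 1)])


def l_estimator_filter(img: np.ndarray, w: np.ndarray, window: int = 3) -> np.ndarray:
    """Denoise `img` with an L-estimator: a weighted sum of the sorted
    pixel values in each (window x window) neighbourhood."""
    pad = window // 2
    padded = np.pad(img, pad, mode="reflect")
    patches = sliding_window_view(padded, (window, window))       # (H, W, window, window)
    sorted_vals = np.sort(patches.reshape(*img.shape, -1), axis=-1)
    return sorted_vals @ w                                         # weights on the order statistics


if __name__ == "__main__":
    # Marginal entropies of the 5 order statistics of a Uniform(0,1) sample.
    print(uniform_order_stat_entropies(5))

    # Toy denoising example: smooth ramp image plus Gaussian noise.
    rng = np.random.default_rng(0)
    clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
    noisy = np.clip(clean + 0.1 * rng.standard_normal(clean.shape), 0.0, 1.0)

    n = 9                                          # 3x3 window -> 9 order statistics
    w_median = np.zeros(n)
    w_median[n // 2] = 1.0                         # median filter as a special case
    denoised = l_estimator_filter(noisy, w_median)

    print("noisy MSE:   ", float(np.mean((noisy - clean) ** 2)))
    print("denoised MSE:", float(np.mean((denoised - clean) ** 2)))
```

In this formulation, choosing the coefficient vector `w` is exactly where the proposed informativeness measures enter: rather than the hand-picked median weights above, the paper derives the weights from the information carried by each order statistic under the assumed noise distribution.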