MMSE Bounds Under Kullback-Leibler Divergence Constraints on the Joint Input-Output Distribution

06/05/2020
by Michael Fauß, et al.

This paper proposes a new family of lower and upper bounds on the minimum mean squared error (MMSE). The key idea is to minimize or maximize the MMSE subject to the constraint that the joint input-output distribution lies in a Kullback-Leibler divergence ball centered at some Gaussian reference distribution. Both bounds are tight and are attained by Gaussian distributions whose mean is identical to that of the reference distribution and whose covariance matrix is determined by a scalar parameter that can be obtained by finding the root of a monotonic function. The upper bound corresponds to a minimax optimal estimator and provides performance guarantees under distributional uncertainty. The lower bound provides an alternative to well-known inequalities in estimation theory, such as the Cramér-Rao bound, that is potentially tighter and defined for a larger class of distributions. Examples of applications in signal processing and information theory illustrate the usefulness of the proposed bounds in practice.
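To give a flavor of the mechanics described above, the following is a minimal illustrative sketch, not taken from the paper: in the simplest scalar case, with input X ~ N(0, s) observed through Y = X + N, N ~ N(0, s_n), the MMSE is s·s_n/(s + s_n) and the KL divergence from N(0, s) to the reference N(0, σ²) is 0.5·(s/σ² − 1 − ln(s/σ²)), which is monotonically increasing in s for s ≥ σ². A bisection root search over this monotonic function then yields the variance at the boundary of the divergence ball and, with it, an upper bound on the MMSE. The function names `kl_gauss`, `mmse_gauss`, and `upper_bound` are illustrative; the paper's actual construction is multivariate and more general.

```python
import math

def kl_gauss(s, sigma2):
    # KL( N(0, s) || N(0, sigma2) ) between scalar zero-mean Gaussians.
    r = s / sigma2
    return 0.5 * (r - 1.0 - math.log(r))

def mmse_gauss(s_x, s_n):
    # MMSE of estimating X ~ N(0, s_x) from Y = X + N with N ~ N(0, s_n).
    return s_x * s_n / (s_x + s_n)

def upper_bound(sigma_x2, s_n, eps, tol=1e-12):
    # Illustrative upper bound: the MMSE is increasing in the input variance,
    # so maximize it by pushing the variance to the edge of the KL ball.
    # kl_gauss(s, sigma_x2) is monotonically increasing for s >= sigma_x2,
    # so the root of kl_gauss(s) - eps can be found by bisection.
    lo = hi = sigma_x2
    while kl_gauss(hi, sigma_x2) < eps:   # bracket the root
        hi *= 2.0
    while hi - lo > tol * sigma_x2:       # bisection on a monotonic function
        mid = 0.5 * (lo + hi)
        if kl_gauss(mid, sigma_x2) < eps:
            lo = mid
        else:
            hi = mid
    s_star = 0.5 * (lo + hi)
    return mmse_gauss(s_star, s_n)
```

For example, with reference variance 1, noise variance 1, and radius eps = 0.1, the bound exceeds the nominal Gaussian MMSE of 0.5, reflecting the worst case over the divergence ball.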
