Polynomial Approximations of Conditional Expectations in Scalar Gaussian Channels

02/11/2021
by Wael Alghamdi, et al.

We consider a channel Y = X + N, where X is a random variable satisfying 𝔼[|X|] < ∞ and N is an independent standard normal random variable. We show that the minimum mean-square error estimator of X from Y, which is given by the conditional expectation 𝔼[X | Y], is a polynomial in Y if and only if it is linear or constant; these two cases correspond to X being Gaussian or a constant, respectively. We also prove that the higher-order derivatives of y ↦ 𝔼[X | Y = y] are expressible as multivariate polynomials in the functions y ↦ 𝔼[ ( X - 𝔼[X | Y] )^k | Y = y ] for k ∈ ℕ. These expressions yield bounds on the 2-norm of the derivatives of the conditional expectation. These bounds imply that, if X has a compactly supported density that is even and decreasing on the positive half-line, then the error in approximating the conditional expectation 𝔼[X | Y] by polynomials in Y of degree at most n decays faster than any polynomial in n.
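To make the final claim concrete, the sketch below numerically estimates the best degree-n polynomial approximation of 𝔼[X | Y] for one example distribution. It is not taken from the paper: the triangular density on [-1, 1] is a hypothetical choice (compactly supported, even, and decreasing on the positive half-line, as the abstract requires), and the grids, degrees, and function names are illustrative. The conditional expectation and the marginal density of Y are computed by quadrature, and the best polynomial fit in L²(P_Y) by weighted least squares.

```python
import numpy as np

# Illustrative density for X: triangular on [-1, 1] (compactly supported,
# even, decreasing on the positive half-line). Hypothetical choice, not
# taken from the paper.
def p_X(x):
    return np.clip(1.0 - np.abs(x), 0.0, None)

def phi(t):  # standard normal density (the noise N)
    return np.exp(-t ** 2 / 2) / np.sqrt(2 * np.pi)

# Grids for numerical integration.
xs = np.linspace(-1, 1, 2001); dx = xs[1] - xs[0]
ys = np.linspace(-6, 6, 2401); dy = ys[1] - ys[0]
wx = p_X(xs)

# E[X | Y = y] and the marginal density p_Y(y), both by quadrature:
# the posterior of X given Y = y is proportional to p_X(x) * phi(y - x).
lik = phi(ys[:, None] - xs[None, :]) * wx[None, :]      # shape (len(ys), len(xs))
p_Y = lik.sum(axis=1) * dx
cond_exp = (lik * xs[None, :]).sum(axis=1) * dx / p_Y   # E[X | Y = y] on the y-grid

# Best degree-n polynomial approximation of E[X | Y] in L^2(P_Y),
# via weighted least squares on the y-grid.
for n in (1, 3, 5, 7, 9):
    V = np.vander(ys, n + 1, increasing=True)           # columns 1, y, ..., y^n
    sw = np.sqrt(p_Y * dy)
    coef, *_ = np.linalg.lstsq(V * sw[:, None], cond_exp * sw, rcond=None)
    err2 = np.sum((cond_exp - V @ coef) ** 2 * p_Y) * dy  # squared L^2(P_Y) error
    print(f"degree {n}: L2 approximation error ~ {np.sqrt(err2):.3e}")
```

Because this X is symmetric, 𝔼[X | Y = y] is an odd function of y, so the even-degree coefficients come out essentially zero; the printed errors should shrink rapidly as the degree grows, in line with the decay rate stated in the abstract for this class of densities.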
