A Class of Lower Bounds for Bayesian Risk with a Bregman Loss

01/29/2020
by Alex Dytso, et al.

A general class of lower bounds on the Bayesian risk is derived for settings in which the underlying loss function is a Bregman divergence. The class can be viewed as an extension of the Weinstein-Weiss family of bounds for the mean squared error, and it rests on a variational characterization of the Bayesian risk. The approach yields a version of the Cramér-Rao bound tailored to a given Bregman divergence; this generalization reduces to the classical Cramér-Rao bound when the loss is the squared Euclidean norm. The effectiveness of the new bound is evaluated in the Poisson noise setting.
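For context, a minimal worked sketch of the standard definition behind the abstract (the notation below is ours; the abstract itself does not fix notation). The Bregman divergence generated by a strictly convex, differentiable function \phi is

    \[ \ell_\phi(x, \hat{x}) = \phi(x) - \phi(\hat{x}) - \langle \nabla\phi(\hat{x}),\, x - \hat{x} \rangle . \]

Taking \phi(x) = \lVert x \rVert^2 gives \nabla\phi(\hat{x}) = 2\hat{x}, so

    \[ \ell_\phi(x, \hat{x}) = \lVert x \rVert^2 - \lVert \hat{x} \rVert^2 - 2\langle \hat{x},\, x - \hat{x} \rangle = \lVert x - \hat{x} \rVert^2 , \]

i.e., the squared-error loss, which is why the generalized Cramér-Rao bound collapses to the classical one in this case. For scalar x, \hat{x} > 0, the choice \phi(x) = x \log x yields \ell_\phi(x, \hat{x}) = x \log(x/\hat{x}) - x + \hat{x}, the generalized Kullback-Leibler loss commonly paired with Poisson models; whether this is the instance used in the paper's Poisson evaluation is not stated in the abstract.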

