Lower-bounds on the Bayesian Risk in Estimation Procedures via f-Divergences

02/05/2022
by Adrien Vandenbroucque et al.

We consider the problem of parameter estimation in a Bayesian setting and propose a general lower bound on the Bayesian risk that encompasses part of the family of f-divergences. The results are then applied to specific settings of interest and compared to other notable results in the literature. In particular, we show that the known bounds based on Mutual Information can be improved by using, for example, Maximal Leakage, the Hellinger divergence, or generalizations of the Hockey-Stick divergence.
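
For background, the divergences named above are all instances of, or closely related to, the f-divergence construction. The sketch below recalls the standard definitions from the information-theory literature; it is orientation only, not the paper's actual bound, and the order-p Hellinger generator and the discrete maximal-leakage formula are the common conventions, assumed here rather than quoted from the paper.

% f-divergence of P from Q, for a convex generator f with f(1) = 0:
\[
  D_f(P \,\|\, Q) \;=\; \int f\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right)\,\mathrm{d}Q .
\]
% Generators recovering the measures named in the abstract:
%   f(t) = t \log t            -> Kullback-Leibler divergence; between the joint
%                                 P_{W,X} and the product P_W \otimes P_X it
%                                 yields the Mutual Information I(W;X)
%   f(t) = (t^p - 1)/(p - 1)   -> Hellinger divergence of order p
%   f(t) = \max(t - \gamma, 0) -> Hockey-Stick divergence E_\gamma, \gamma \ge 1
% Maximal Leakage is not itself an f-divergence: it is Sibson's alpha-mutual
% information at alpha = infinity, which for finite alphabets reduces to
\[
  \mathcal{L}(X \to Y) \;=\; \log \sum_{y} \max_{x \,:\, P_X(x) > 0} P_{Y \mid X}(y \mid x) .
\]

Bounds in this line of work typically compare the joint distribution of the parameter and the observations to a product (or other reference) distribution under one of these measures; the paper's precise statements are in the full text.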


Related research

02/08/2022
On Sibson's α-Mutual Information
We explore a family of information measures that stems from Rényi's α-Di...

03/22/2023
Lower Bounds on the Bayesian Risk via Information Measures
This paper focuses on parameter estimation and introduces a new method f...

01/29/2020
A Class of Lower Bounds for Bayesian Risk with a Bregman Loss
A general class of Bayesian lower bounds when the underlying loss functi...

03/01/2023
On Parametric Misspecified Bayesian Cramér-Rao bound: An application to linear Gaussian systems
A lower bound is an important tool for predicting the performance that a...

05/10/2021
Gradient-based Bayesian Experimental Design for Implicit Models using Mutual Information Lower Bounds
We introduce a framework for Bayesian experimental design (BED) with imp...

06/08/2017
Estimating Mixture Entropy with Pairwise Distances
Mixture distributions arise in many parametric and non-parametric settin...
