Hybrid and Generalized Bayesian Cramér-Rao Inequalities via Information Geometry

04/02/2021
by   Kumar Vijay Mishra, et al.

Information geometry is the study of statistical models from a Riemannian geometric point of view. In this framework, the Fisher information matrix plays the role of a Riemannian metric, which yields the Cramér-Rao lower bound (CRLB). This chapter summarizes recent results that extend this framework to more general Cramér-Rao inequalities. We apply Eguchi's theory to a generalized form of the Csiszár f-divergence to obtain a Riemannian metric that simultaneously yields the deterministic CRLB, the Bayesian CRLB, and their generalizations.
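To make the role of the Fisher metric concrete, here is a minimal sketch (not from the chapter itself) for the standard Gaussian model N(mu, sigma^2): the closed-form Fisher information matrix in (mu, sigma) coordinates serves as the metric tensor on this two-parameter statistical manifold, and its inverse gives the deterministic CRLB, which the sample mean attains for mu.

```python
import numpy as np

def fisher_information_gaussian(sigma):
    # Closed-form Fisher information matrix for N(mu, sigma^2)
    # in (mu, sigma) coordinates; this is the Riemannian metric
    # tensor on the two-parameter Gaussian statistical manifold.
    return np.array([[1.0 / sigma**2, 0.0],
                     [0.0, 2.0 / sigma**2]])

def crlb(sigma, n):
    # Deterministic CRLB for n i.i.d. samples: inverse of n * FIM.
    return np.linalg.inv(n * fisher_information_gaussian(sigma))

# The sample mean is an unbiased estimator of mu that attains the
# bound: var(sample mean) = sigma^2 / n, the (mu, mu) entry of the CRLB.
sigma, n = 2.0, 100
bound = crlb(sigma, n)
rng = np.random.default_rng(0)
empirical_var = np.var([rng.normal(0.0, sigma, n).mean() for _ in range(5000)])
print(bound[0, 0])    # sigma^2 / n = 0.04
print(empirical_var)  # Monte Carlo estimate, close to 0.04
```

The generalized inequalities discussed in the chapter replace this Fisher metric with one induced from a generalized f-divergence, but the mechanics (metric, inverse, bound) follow the same pattern.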


