Hybrid and Generalized Bayesian Cramér-Rao Inequalities via Information Geometry

04/02/2021 ∙ by Kumar Vijay Mishra, et al.

Information geometry is the study of statistical models from a Riemannian geometric point of view. The Fisher information matrix plays the role of a Riemannian metric in this framework, and it is this metric that yields the Cramér-Rao lower bound (CRLB). This chapter summarizes recent results that extend this framework to more general Cramér-Rao inequalities. We apply Eguchi's theory to a generalized form of the Csiszár f-divergence to obtain a Riemannian metric that, at once, yields the deterministic CRLB, the Bayesian CRLB, and their generalizations.
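
As a minimal sketch of the classical objects the abstract refers to (standard definitions, not reproduced from the paper), the Fisher information metric on a parametric family {p_θ : θ ∈ Θ} and the resulting deterministic Cramér-Rao inequality for an unbiased estimator \hat{\theta} can be written as

g_{ij}(\theta) \;=\; \mathbb{E}_{p_\theta}\!\left[\,\partial_i \log p_\theta(X)\;\partial_j \log p_\theta(X)\,\right],
\qquad
\mathrm{Cov}_\theta\!\big(\hat{\theta}\big) \;\succeq\; G(\theta)^{-1},

where G(\theta) = [g_{ij}(\theta)] is the Fisher information matrix and \succeq denotes the Loewner (positive semidefinite) order. The chapter's contribution is to recover this bound, its Bayesian counterpart, and their generalizations from a single metric derived via Eguchi's theory.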
