Quantification of observed prior and likelihood information in parametric Bayesian modeling

11/04/2015
by Giri Gopalan, et al.

Two data-dependent information metrics are developed to quantify the information carried by the prior and the likelihood functions within a parametric Bayesian model; one of them is closely related to the reference priors of Berger, Bernardo, and Sun and to the information measure introduced by Lindley. A combination of theoretical, empirical, and computational support provides evidence that these information-theoretic metrics may be useful diagnostic tools when performing a Bayesian analysis.
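
The paper's specific metrics are not reproduced here, but for a rough, hypothetical illustration of the kind of quantity involved, the sketch below estimates Lindley's (1956) expected information gain, i.e. the expected Kullback-Leibler divergence from prior to posterior under the prior predictive distribution, for a toy Beta-Binomial model by Monte Carlo. The model choice, prior parameters, and function names (kl_beta, lindley_information) are assumptions made for illustration only and are not taken from the paper.

    # Hypothetical sketch: Lindley's expected information gain for a
    # Beta-Binomial model, estimated by Monte Carlo. The target quantity is
    #   E_y[ KL( posterior(theta | y) || prior(theta) ) ],
    # with y drawn from the prior predictive distribution.
    import numpy as np
    from scipy.special import betaln, digamma

    def kl_beta(a1, b1, a0, b0):
        """KL divergence KL( Beta(a1, b1) || Beta(a0, b0) )."""
        return (betaln(a0, b0) - betaln(a1, b1)
                + (a1 - a0) * digamma(a1)
                + (b1 - b0) * digamma(b1)
                + (a0 - a1 + b0 - b1) * digamma(a1 + b1))

    def lindley_information(a, b, n, n_sims=100_000, seed=None):
        """Monte Carlo estimate of Lindley's expected information gain from
        observing n Bernoulli trials under a Beta(a, b) prior."""
        rng = np.random.default_rng(seed)
        theta = rng.beta(a, b, size=n_sims)   # draws from the prior
        y = rng.binomial(n, theta)            # prior-predictive data
        # Conjugate update: posterior is Beta(a + y, b + n - y)
        return kl_beta(a + y, b + n - y, a, b).mean()

    if __name__ == "__main__":
        # A diffuse prior gains more information from the same sample size
        # than a sharply concentrated prior.
        print(lindley_information(1, 1, n=20, seed=0))    # vague prior
        print(lindley_information(50, 50, n=20, seed=0))  # informative prior

In this toy setting the prior's informativeness shows up directly: the vaguer the prior, the larger the expected prior-to-posterior divergence contributed by the data.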

Related research

The empirical likelihood prior applied to bias reduction of general estimating equations (08/19/2018)
The practice of employing empirical likelihood (EL) components in place ...

On the method of likelihood-induced priors (01/13/2019)
We demonstrate that the functional form of the likelihood contains a suf...

Bayesian estimation of information-theoretic metrics for sparsely sampled distributions (01/31/2023)
Estimating the Shannon entropy of a discrete distribution from which we ...

New metrics for learning and inference on sets, ontologies, and functions (03/22/2016)
We propose new metrics on sets, ontologies, and functions that can be us...

Lifting the Information Ratio: An Information-Theoretic Analysis of Thompson Sampling for Contextual Bandits (05/27/2022)
We study the Bayesian regret of the renowned Thompson Sampling algorithm...

50+ Metrics for Calendar Mining (01/01/2016)
In this report we propose 50+ metrics which can be measured by organizat...

A new integrated likelihood for estimating population size in dependent dual-record system (01/23/2019)
Efficient estimation of population size from dependent dual-record syste...
