
Objective priors for divergence-based robust estimation

04/29/2020
by Tomoyuki Nakagawa, et al.

We consider objective priors for outlier-robust Bayesian estimation based on divergences. The γ-divergence (also known as the type 0 divergence) is known to have attractive properties for robust parameter estimation (Jones et al. (2001); Fujisawa and Eguchi (2008)). This paper focuses on reference and moment matching priors under the quasi-posterior distribution based on the γ-divergence. In general, such objective priors depend on the unknown data-generating mechanism, so they cannot be used directly in the presence of outliers. Under Huber's ε-contamination model, we show that the proposed priors are approximately robust under a condition on the tail of the contamination distribution, without assuming any condition on the contamination ratio. Simulation studies are also presented.
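As background for the quasi-posterior construction, the γ-divergence induces a robust loss whose minimizer downweights observations that are unlikely under the model. Below is a minimal sketch of frequentist γ-divergence estimation of a normal mean with known scale under ε-contamination; the choice γ = 0.5, the optimizer bounds, and the simulated contamination are illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import logsumexp
from scipy.stats import norm

def gamma_loss(mu, x, sigma=1.0, gamma=0.5):
    """Negative gamma-cross-entropy for a N(mu, sigma^2) model:

        L(mu) = -(1/gamma) * log( (1/n) * sum_i f(x_i)^gamma )
                + (1/(1+gamma)) * log( integral of f^{1+gamma} dx ),

    where for the normal density the integral has the closed form
    (2*pi*sigma^2)^(-gamma/2) / sqrt(1 + gamma).
    """
    logf = norm.logpdf(x, loc=mu, scale=sigma)
    empirical = -(1.0 / gamma) * (logsumexp(gamma * logf) - np.log(len(x)))
    log_integral = -(gamma / 2.0) * np.log(2 * np.pi * sigma**2) \
                   - 0.5 * np.log(1.0 + gamma)
    return empirical + log_integral / (1.0 + gamma)

rng = np.random.default_rng(0)
# 100 inliers from N(0, 1), contaminated with 10 outliers near 10
x = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(10.0, 1.0, 10)])

mle = x.mean()  # ordinary sample mean, pulled toward the outliers
robust = minimize_scalar(gamma_loss, bounds=(-5.0, 5.0), args=(x,),
                         method="bounded").x
print(f"sample mean: {mle:.3f}, gamma-divergence estimate: {robust:.3f}")
```

Because each observation enters the empirical term through f(x_i)^γ, the outliers near 10 contribute essentially nothing to the loss, and the γ-divergence estimate stays close to the inlier mean while the sample mean is pulled upward.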

