On Difference Between Two Types of γ-divergence for Regression

05/16/2018
by Takayuki Kawashima, et al.

The γ-divergence is well known for its strong robustness against heavy contamination, and by virtue of this property many applications based on it have been proposed. For the regression problem, there are two types of γ-divergence, which differ in their treatment of the base measure. In this paper, we compare them and point out a distinct difference between the two divergences under heterogeneous contamination, where the outlier ratio depends on the explanatory variable. One divergence has strong robustness under heterogeneous contamination. The other does not in general, but does when the parametric model of the response variable belongs to a location-scale family whose scale does not depend on the explanatory variables, or under homogeneous contamination, where the outlier ratio does not depend on the explanatory variable. Hung et al. (2017) discussed strong robustness in a logistic regression model under the additional assumption that the tuning parameter γ is sufficiently large. The results obtained in this paper hold for any parametric model without such an additional assumption.
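To make the robustness property concrete, below is a minimal sketch (not taken from the paper) of robust linear regression via a γ-cross-entropy loss in the style of Fujisawa and Eguchi, using the "log of the averaged γ-power likelihood" form. The Gaussian conditional model, the fixed known scale σ, the choice γ = 0.5, and the simulated contaminated data are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data: y = 1 + 2x + noise, with 10% of responses shifted far away.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-2, 2, n)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.5, n)
out = rng.choice(n, 20, replace=False)
y[out] += 15.0  # heavy vertical outliers

gamma = 0.5
sigma = 0.5  # scale assumed known, for simplicity of the sketch

def gamma_loss(beta):
    """Empirical gamma-cross-entropy loss for a Gaussian linear model."""
    mu = beta[0] + beta[1] * x
    f = np.exp(-((y - mu) ** 2) / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    # Outliers have f ~ 0, so they contribute almost nothing to the average:
    term1 = -np.log(np.mean(f**gamma)) / gamma
    # For a Gaussian, the integral of f^{1+gamma} dy is a constant in beta:
    term2 = np.log((2 * np.pi * sigma**2) ** (-gamma / 2) / np.sqrt(1 + gamma)) / (1 + gamma)
    return term1 + term2

# Ordinary least squares for contrast: the intercept is pulled up by the outliers.
ols_slope, ols_intercept = np.polyfit(x, y, 1)

res = minimize(gamma_loss, x0=[0.0, 0.0], method="Nelder-Mead")
print("OLS:", round(ols_intercept, 2), round(ols_slope, 2))
print("gamma-loss:", np.round(res.x, 2))
```

Because large residuals enter only through `f**gamma`, which vanishes for gross outliers, the γ-loss estimate stays close to the true coefficients (1, 2) while the OLS intercept is inflated by the contamination. This corresponds to the homogeneous-contamination setting (the outlier indices do not depend on x), where both types of divergence behave well.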


