
A proper scoring rule for minimum information copulas

by   Yici Chen, et al.
The University of Tokyo

Multi-dimensional distributions whose marginal distributions are all uniform are called copulas. Among them, the one that satisfies given constraints on expectations and is closest to the independence copula in the sense of Kullback-Leibler divergence is called the minimum information copula. The density function of the minimum information copula contains a set of functions called normalizing functions, which are often difficult to compute. Although a number of proper scoring rules have been proposed for probability distributions with intractable normalizing constants, such as exponential families, these scores are not applicable to minimum information copulas because of the normalizing functions. In this paper, we propose the conditional Kullback-Leibler score, which avoids computing the normalizing functions. The main idea of its construction is to use pairs of observations. We show that the proposed score is strictly proper on the space of copula density functions, and therefore the estimator derived from it is asymptotically consistent. Furthermore, the score is convex in the parameters and can be optimized easily by gradient methods.
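The abstract's key idea, that pairs of observations let one sidestep the normalizing functions, can be illustrated with a small sketch. This is not the paper's exact construction; it only shows the cancellation mechanism under the assumption that the density has the exponential form c(u, v) = exp(θ·h(u, v) + a(u) + b(v)), where a and b are the normalizing functions and h is a hypothetical constraint function. In the "exchange ratio" of a pair of observations, a and b cancel, so a pairwise score depends only on θ and h:

```python
import numpy as np

def h(u, v):
    # Hypothetical constraint function (e.g. a moment constraint on E[UV]).
    return u * v

def pairwise_log_ratio(theta, p1, p2):
    """log of c(u1,v1)c(u2,v2) / (c(u1,v2)c(u2,v1)).

    For c(u,v) = exp(theta*h(u,v) + a(u) + b(v)), every a(.) and b(.)
    term appears once in the numerator and once in the denominator,
    so the normalizing functions cancel exactly.
    """
    (u1, v1), (u2, v2) = p1, p2
    return theta * (h(u1, v1) + h(u2, v2) - h(u1, v2) - h(u2, v1))

def pairwise_score(theta, pairs):
    # A logistic-type loss on pairs: convex in theta, so plain
    # gradient descent applies; a(u) and b(v) are never evaluated.
    return float(np.mean([np.log1p(np.exp(-pairwise_log_ratio(theta, p1, p2)))
                          for p1, p2 in pairs]))
```

The cancellation is what makes a pairwise approach attractive here: the score can be evaluated and differentiated in θ without ever solving for the normalizing functions.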



