
Proper-Composite Loss Functions in Arbitrary Dimensions

02/19/2019
by   Zac Cranko, et al.

The study of a machine learning problem is in many ways difficult to separate from the study of the loss function being used. One avenue of inquiry has been to examine these loss functions in terms of their properties as scoring rules via the proper-composite representation, in which predictions are mapped to probability distributions that are then scored by a scoring rule. However, recent research has primarily been concerned with analysing the (typically) finite-dimensional conditional risk problem on the output space, leaving aside the larger total risk minimisation. We generalise a number of these results to an infinite-dimensional setting, and in doing so we are able to exploit the familial resemblance of density and conditional density estimation to provide a simple characterisation of the canonical link.
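To make the proper-composite representation concrete, here is a minimal sketch for the familiar binary, finite-dimensional case (the paper's setting is infinite-dimensional; this example only illustrates the general structure). A real-valued prediction is first mapped to a probability by an inverse link, then scored by a proper scoring rule; composing the log score with the sigmoid link, which is the canonical link for this scoring rule, recovers the logistic loss. The function names below are illustrative, not the paper's notation.

```python
import math

def sigmoid(v):
    """Inverse (canonical) link: map a real-valued prediction to a probability."""
    return 1.0 / (1.0 + math.exp(-v))

def log_score(y, p):
    """Proper scoring rule (log loss) evaluated on the probability scale."""
    return -math.log(p) if y == 1 else -math.log(1.0 - p)

def composite_loss(y, v):
    """Proper-composite loss: score the linked prediction."""
    return log_score(y, sigmoid(v))

# With this (canonical) link the composition is the logistic loss:
#   composite_loss(1, v) == log(1 + exp(-v))
```

Swapping in a different proper scoring rule (e.g. the Brier score) or a different link yields other members of the proper-composite family; the canonical link is the choice that makes the composite loss convex in the prediction.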


06/18/2012

The Convexity and Design of Composite Multiclass Losses

We consider composite loss functions for multiclass prediction comprisin...
04/06/2022

A proper scoring rule for minimum information copulas

Multi-dimensional distributions whose marginal distributions are uniform...
01/30/2023

On Second-Order Scoring Rules for Epistemic Uncertainty Quantification

It is well known that accurate probabilistic predictors can be trained t...
08/30/2019

Consistency and Finite Sample Behavior of Binary Class Probability Estimation

In this work we investigate to which extent one can recover class probab...
06/06/2019

Toward a Characterization of Loss Functions for Distribution Learning

In this work we study loss functions for learning and evaluating probabi...
06/08/2020

All your loss are belong to Bayes

Loss functions are a cornerstone of machine learning and the starting po...
02/18/2021

Linear Functions to the Extended Reals

This note investigates functions from ℝ^d to ℝ∪{±∞} that satisfy axioms ...