Loss as the Inconsistency of a Probabilistic Dependency Graph: Choose Your Model, Not Your Loss Function

02/24/2022
by Oliver E. Richardson, et al.

In a world blessed with a great diversity of loss functions, we argue that the choice between them is not a matter of taste or pragmatics, but of model. Probabilistic dependency graphs (PDGs) are probabilistic models that come equipped with a measure of "inconsistency". We prove that many standard loss functions arise as the inconsistency of a natural PDG describing the appropriate scenario, and use the same approach to justify a well-known connection between regularizers and priors. We also show that the PDG inconsistency captures a large class of statistical divergences, and detail the benefits of thinking of them in this way, including an intuitive visual language for deriving inequalities between them. In variational inference, we find that the ELBO, a somewhat opaque objective for latent variable models, and variants of it arise for free out of uncontroversial modeling assumptions, as do simple graphical proofs of their corresponding bounds. Finally, we observe that inconsistency becomes the log partition function (free energy) in the setting where PDGs are factor graphs.
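For reference, the standard quantities the abstract alludes to (the ELBO, the regularizer-prior correspondence, and the log partition function of a factor graph) can be written out in their textbook forms. The sketch below uses generic symbols p, q, theta, D, and factors phi_a chosen here for illustration; it shows the familiar definitions only, not the paper's PDG-based derivations of them.

% ELBO: for a latent-variable model p(x,z) and variational family q(z|x),
% log p(x) decomposes into the ELBO plus a nonnegative KL term, so ELBO <= log p(x).
\log p(x)
  \;=\; \underbrace{\mathbb{E}_{q(z\mid x)}\!\big[\log p(x,z) - \log q(z\mid x)\big]}_{\mathrm{ELBO}(q)}
  \;+\; \mathrm{KL}\!\big(q(z\mid x)\,\big\|\,p(z\mid x)\big)

% Regularizers as priors: penalized loss minimization coincides with MAP estimation
% when the regularizer is (up to an additive constant) a negative log-prior.
\arg\min_\theta \Big[ -\log p(D\mid\theta) + \lambda R(\theta) \Big]
  \;=\; \arg\max_\theta \log p(\theta\mid D)
  \qquad\text{when } \lambda R(\theta) = -\log p(\theta) + \mathrm{const}

% Log partition function of a factor graph with factors phi_a over assignments x:
\log Z \;=\; \log \sum_{x} \prod_{a} \phi_a(x)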


