Geometric Losses for Distributional Learning

05/15/2019
by Arthur Mensch, et al.

Building upon recent advances in entropy-regularized optimal transport, and upon Fenchel duality between measures and continuous functions, we propose a generalization of the logistic loss that incorporates a metric or cost between classes. Unlike previous attempts to use optimal transport distances for learning, our loss results in unconstrained convex objective functions, supports infinite (or very large) class spaces, and naturally defines a geometric generalization of the softmax operator. The geometric properties of this loss make it suitable for predicting sparse and singular distributions, for instance those supported on curves or hyper-surfaces. We study the theoretical properties of our loss and showcase its effectiveness on two applications: ordinal regression and drawing generation.
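To give a concrete feel for a cost-aware softmax, the sketch below smooths class logits through a soft c-transform against a class-to-class cost matrix before normalizing, so probability mass leaks toward classes that are cheap to confuse under the metric. This is an illustrative simplification under my own assumptions (the name `geometric_softmax`, the c-transform smoothing step, and the temperature `eps` are all choices for this sketch), not the paper's exact construction.

```python
import numpy as np

def geometric_softmax(scores, cost, eps=1.0):
    """Illustrative cost-aware softmax (hypothetical helper, not the
    authors' exact operator).

    scores : (n,) array of class logits
    cost   : (n, n) array, cost[i, j] = metric between classes i and j
    eps    : smoothing temperature; small eps with a large off-diagonal
             cost recovers the ordinary softmax.
    """
    # Soft c-transform: s_i = eps * log sum_j exp((scores_j - cost_ij) / eps)
    a = (scores[None, :] - cost) / eps
    m = a.max(axis=1, keepdims=True)  # stabilize the log-sum-exp
    smoothed = eps * (m[:, 0] + np.log(np.exp(a - m).sum(axis=1)))
    # Ordinary softmax on the smoothed scores
    z = smoothed - smoothed.max()
    p = np.exp(z)
    return p / p.sum()
```

With an ordinal cost such as `cost[i, j] = |i - j|`, mass spreads to neighboring grades rather than arbitrary classes, which is the kind of behavior the ordinal-regression application above relies on; with a very large off-diagonal cost the smoothing vanishes and the usual softmax is recovered.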


Related research:

- When Optimal Transport Meets Information Geometry (06/29/2022)
  Information geometry and optimal transport are two distinct geometric fr...
- Comparing Morse Complexes Using Optimal Transport: An Experimental Study (09/09/2023)
  Morse complexes and Morse-Smale complexes are topological descriptors po...
- Regularized Optimal Transport is Ground Cost Adversarial (02/10/2020)
  Regularizing Wasserstein distances has proved to be the key in the recen...
- Interpolating between Optimal Transport and MMD using Sinkhorn Divergences (10/18/2018)
  Comparing probability distributions is a fundamental problem in data sci...
- Deep Ordinal Regression using Optimal Transport Loss and Unimodal Output Probabilities (11/15/2020)
  We propose a framework for deep ordinal regression, based on unimodal ou...
- Sinkhorn Divergences for Unbalanced Optimal Transport (10/28/2019)
  This paper extends the formulation of Sinkhorn divergences to the unbala...
